Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/11/25 00:12:11 UTC

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1619

See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1619/display/redirect>

Changes:


------------------------------------------
[...truncated 1.33 MB...]
19/11/25 00:12:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1574640717.82_3656bd3d-312a-4a68-846a-c4cf1c74dd3f finished.
19/11/25 00:12:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 00:12:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest4sfRF0/job_ce26435b-7f53-4efb-ae83-d7e748c7075d/MANIFEST has 0 artifact locations
19/11/25 00:12:03 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest4sfRF0/job_ce26435b-7f53-4efb-ae83-d7e748c7075d/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f7286534230> ====================
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5 on Spark master local
19/11/25 00:12:04 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/25 00:12:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5: Pipeline translated successfully. Computing outputs
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST has 0 artifact locations
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:04 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:40625
19/11/25 00:12:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:04 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42961.
19/11/25 00:12:04 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/25 00:12:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39927.
19/11/25 00:12:04 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:35461
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 00:12:04 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:04 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:04 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:05 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:42933
19/11/25 00:12:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:05 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36185.
19/11/25 00:12:05 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 00:12:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37889.
19/11/25 00:12:05 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:37753
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 00:12:05 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:05 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:05 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:06 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:44627
19/11/25 00:12:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:06 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35351.
19/11/25 00:12:06 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 00:12:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43383.
19/11/25 00:12:06 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:38823
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 00:12:06 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:06 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:07 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:32845
19/11/25 00:12:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:07 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42771.
19/11/25 00:12:07 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 00:12:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44161.
19/11/25 00:12:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:38357
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 00:12:07 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:07 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:07 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST -> 0 artifacts
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 00:12:07 INFO sdk_worker_main.main: Logging handler created.
19/11/25 00:12:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:35337
19/11/25 00:12:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 00:12:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 00:12:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 00:12:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574640723.08', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42673', 'job_port': u'0'}
19/11/25 00:12:07 INFO statecache.__init__: Creating state cache with size 0
19/11/25 00:12:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44931.
19/11/25 00:12:07 INFO sdk_worker.__init__: Control channel established.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 00:12:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 00:12:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41141.
19/11/25 00:12:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 00:12:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:45377
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 00:12:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 00:12:07 INFO sdk_worker.run: No more requests from control plane
19/11/25 00:12:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 00:12:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:08 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 00:12:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 00:12:08 INFO sdk_worker.run: Done consuming work.
19/11/25 00:12:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 00:12:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 00:12:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 00:12:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574640723.08_0d363e57-9d14-4ca2-8a77-50f53142ffc5 finished.
19/11/25 00:12:08 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 00:12:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/MANIFEST has 0 artifact locations
19/11/25 00:12:08 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest4sfRF0/job_e8e0b38e-f150-4a46-a53f-25adfa15a47c/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
==================== Timed out after 60 seconds. ====================
    _common.wait(self._state.condition.wait, _response_ready)

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140129263933184)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(Thread-118, started daemon 140129280718592)>

# Thread: <_MainThread(MainThread, started 140130133227264)>

  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574640714.42_6b489e63-beb7-49c3-b55c-710eb8e0cf6f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 267.897s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 56s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/admhpvb6ipom4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_VR_Spark #1737

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1737/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1736

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1736/display/redirect?page=changes>

Changes:

[kcweaver] Version Flink job server container images

[kcweaver] [BEAM-8337] publish Flink job server container images

[ningk] [BEAM-7926] Data-centric Interactive Part1

[kcweaver] Get Flink version numbers from subdirectories

[kcweaver] Warn if Flink versions can't be listed.


------------------------------------------
[...truncated 1.55 MB...]
19/12/10 01:08:02 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:88: Status HTTP server running at localhost:46883
19/12/10 01:08:02 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:132: semi_persistent_directory: /tmp
19/12/10 01:08:02 WARN https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail.
19/12/10 01:08:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:02 INFO https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40145.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40853.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42997
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41917
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39631.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40747.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:32889
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36863
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34651.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44799.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34641
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38251
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/10 01:08:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/10 01:08:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575940079.97', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58733', 'job_port': u'0'}
19/12/10 01:08:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39609.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43377.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/10 01:08:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37915
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/10 01:08:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/10 01:08:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/10 01:08:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/10 01:08:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/10 01:08:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/10 01:08:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575940079.97_99e63056-ab2b-43ae-97e2-606351b62399 finished.
19/12/10 01:08:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/10 01:08:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_be1ea735-f5f0-4aa1-aba9-404145ec7f6a","basePath":"/tmp/sparktestjR7oTh"}: {}
java.io.FileNotFoundException: /tmp/sparktestjR7oTh/job_be1ea735-f5f0-4aa1-aba9-404145ec7f6a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 437, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

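The BaseException above comes from a test-harness watchdog: per the traceback, `portable_runner_test.py` line 75 has a `handler` that raises `BaseException(msg)` when a test hangs, and the `# Thread: ...` lines in the output are a dump of the live threads at that moment. A minimal sketch of that pattern is below; the function name, signature, and the use of SIGALRM are illustrative assumptions, not Beam's actual helper.

```python
import signal
import sys
import threading
import traceback


def install_timeout(seconds, msg='Timed out'):
    """Install a watchdog that aborts a hung test after `seconds`.

    Sketch only (assumed SIGALRM mechanism): when the alarm fires, the
    handler lists every live thread and dumps each thread's current
    stack (producing "# Thread: ..." lines like those in the log), then
    raises BaseException -- deliberately not Exception, so that broad
    `except Exception` blocks in the code under test cannot swallow it.
    """
    def handler(signum, frame):
        # Name every live thread, mirroring the log's "# Thread:" lines.
        for thread in threading.enumerate():
            print('# Thread: %s' % thread)
        # Dump the current stack of every thread for post-mortem debugging.
        for thread_id, stack in sys._current_frames().items():
            traceback.print_stack(stack)
        raise BaseException('%s after %d seconds.' % (msg, seconds))

    signal.signal(signal.SIGALRM, handler)  # Unix only
    signal.alarm(seconds)
    return handler  # returned so callers/tests can trigger it directly
```

A caller would cancel the alarm with `signal.alarm(0)` once `wait_until_finish` returns normally; only a test that actually hangs reaches the thread dump.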
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 437, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140280113014528)>
# Thread: <Thread(Thread-119, started daemon 140280096229120)>
# Thread: <_MainThread(MainThread, started 140280892753664)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140279470356224)>
# Thread: <Thread(Thread-123, started daemon 140280086787840)>
# Thread: <_MainThread(MainThread, started 140280892753664)>
# Thread: <Thread(Thread-119, started daemon 140280096229120)>
# Thread: <Thread(wait_until_finish_read, started daemon 140280113014528)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 437, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575940071.06_db01f780-054e-4e1f-89db-4815b11c31a8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 319.342s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 58s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/5fxnkvyk5jluu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1735

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1735/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-8335] Adds support for multi-output TestStream (#9953)


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 22:32:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33585
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:14 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:14 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:14 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34871.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45863.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:14 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34697
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:14 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:14 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40597
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:15 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:15 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:15 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37651.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37431.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:15 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37759
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:15 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:15 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40605
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:16 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:16 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:16 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39769.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45149.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:16 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36083
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:16 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:16 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38353
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 22:32:17 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 22:32:17 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575930732.01', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50649', 'job_port': u'0'}
19/12/09 22:32:17 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34027.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43745.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 22:32:17 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37715
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 22:32:17 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 22:32:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 22:32:17 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 22:32:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 22:32:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 22:32:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 22:32:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575930732.01_04192864-f9f4-4557-80cb-e9c06487600c finished.
19/12/09 22:32:17 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 22:32:17 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a0f58f6c-4dba-44df-919e-3ce0502a6998","basePath":"/tmp/sparktest91Ft0Y"}: {}
java.io.FileNotFoundException: /tmp/sparktest91Ft0Y/job_a0f58f6c-4dba-44df-919e-3ce0502a6998/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
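The `BaseException: Timed out after 60 seconds.` above originates from the `handler` frame in `portable_runner_test.py` (line 75 in the traceback). A minimal sketch of such a signal-based test watchdog, assuming a Unix platform with `SIGALRM`; the function names here are illustrative, not Beam's actual implementation:

```python
import signal
import time

def install_timeout(seconds, msg):
    # Raise BaseException (not Exception) so the timeout escapes even
    # broad `except Exception` blocks inside the code under test.
    def handler(signum, frame):
        raise BaseException(msg)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)

def cancel_timeout():
    # Disarm any pending alarm.
    signal.alarm(0)

# Demonstrate: a blocking call is interrupted when the alarm fires.
install_timeout(1, 'Timed out after 1 seconds.')
try:
    time.sleep(5)          # simulated hang (e.g. waiting on a gRPC stream)
    result = 'finished'
except BaseException as e:
    result = str(e)
finally:
    cancel_timeout()
```

The alarm interrupts the blocking `sleep`, the handler raises, and the test fails with the timeout message instead of hanging the CI job indefinitely.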

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139626339813120)>

# Thread: <Thread(Thread-116, started daemon 139626348205824)>

# Thread: <_MainThread(MainThread, started 139627127944960)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139626313848576)>

# Thread: <Thread(Thread-122, started daemon 139626322241280)>

# Thread: <_MainThread(MainThread, started 139627127944960)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575930723.54_4186c2fb-8121-47be-a20f-0933e14da306 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 294.635s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 31s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ja5j63mpm6jru

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1734

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1734/display/redirect?page=changes>

Changes:

[heejong] [BEAM-8903] handling --jar_packages experimental flag in PortableRunner


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37727
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:37 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:37 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:37 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44541.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39105.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:37 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36713
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:37 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39215
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:38 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:38 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:38 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41373.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40849.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:38 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43801
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:38 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41651
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:39 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:39 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:39 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45687.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43385.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:39 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38781
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:39 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:39 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:39 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39989
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 18:13:40 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 18:13:40 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575915215.18', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58009', 'job_port': u'0'}
19/12/09 18:13:40 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41587.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34507.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 18:13:40 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39195
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 18:13:40 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 18:13:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 18:13:40 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 18:13:40 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 18:13:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 18:13:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 18:13:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575915215.18_baab558b-d6c5-43ba-9edd-ed0bba7fa088 finished.
19/12/09 18:13:40 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 18:13:40 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_4de5cfcd-0dad-4e22-9d73-f8f0b1992961","basePath":"/tmp/sparktestrNVtlv"}: {}
java.io.FileNotFoundException: /tmp/sparktestrNVtlv/job_4de5cfcd-0dad-4e22-9d73-f8f0b1992961/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139728162645760)>

# Thread: <Thread(Thread-120, started daemon 139728154253056)>

# Thread: <_MainThread(MainThread, started 139729287423744)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139728137467648)>

# Thread: <Thread(Thread-124, started daemon 139728145860352)>

# Thread: <Thread(Thread-120, started daemon 139728154253056)>

# Thread: <Thread(wait_until_finish_read, started daemon 139728162645760)>

# Thread: <_MainThread(MainThread, started 139729287423744)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575915207.26_5894b3c6-6fb0-4395-9732-c8e5700a2208 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 344.483s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 20s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/wmzs676h6pnly

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1733

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1733/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7390] Add code snippet for Mean

[nielm] Add limit on number of mutated rows to batching/sorting stages.


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 17:50:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e on Spark master local
19/12/09 17:50:25 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/09 17:50:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e: Pipeline translated successfully. Computing outputs
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37601
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:25 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:25 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:25 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45333.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34645.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:25 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:44701
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:25 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:25 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:25 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:42917
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:26 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:26 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:26 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38947.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42731.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:26 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43735
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:26 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:26 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36873
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:27 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:27 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:27 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35647.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39271.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:27 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42263
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:27 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:27 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39309
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:28 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:28 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:28 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44753.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45279.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:28 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42871
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:28 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:28 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44081
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 17:50:29 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 17:50:29 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575913824.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44153', 'job_port': u'0'}
19/12/09 17:50:29 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45375.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45793.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 17:50:29 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35891
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 17:50:29 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 17:50:29 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 17:50:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 17:50:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 17:50:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 17:50:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575913824.26_55dcbf41-75bc-4cd6-90c1-110dc0bb998e finished.
19/12/09 17:50:29 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 17:50:29 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_611a5cae-20e5-408f-a1d5-b36b36a8a2a2","basePath":"/tmp/sparktestKn42fl"}: {}
java.io.FileNotFoundException: /tmp/sparktestKn42fl/job_611a5cae-20e5-408f-a1d5-b36b36a8a2a2/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
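The FileNotFoundException above is a benign cleanup race: this job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" lines), so a MANIFEST file was never written, and the post-job cleanup then fails to open it. A minimal sketch of a cleanup that tolerates the missing manifest; all names here are hypothetical illustrations, the real logic lives in the Java BeamFileSystemArtifactStagingService.removeArtifacts:

```python
import os


def remove_staged_artifacts(staging_dir):
    """Remove a job's staged artifacts, tolerating jobs that staged
    nothing (their MANIFEST was never written).

    Hypothetical sketch, not Beam's actual code."""
    manifest = os.path.join(staging_dir, "MANIFEST")
    if not os.path.exists(manifest):
        # Nothing was staged: return quietly instead of failing with
        # FileNotFoundException and logging a warning.
        return []
    with open(manifest) as f:
        artifacts = [line.strip() for line in f if line.strip()]
    for name in artifacts:
        os.remove(os.path.join(staging_dir, name))
    os.remove(manifest)
    return artifacts
```

With this shape, the "Failed to remove job staging directory" warning would only fire for genuinely broken staging directories.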
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
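The `BaseException: Timed out after 60 seconds.` at the bottom of this traceback is raised by a test-level watchdog (the `handler` frame from `portable_runner_test.py`), not by gRPC itself. A sketch of such a SIGALRM-based watchdog, using hypothetical names and assuming a POSIX platform; Beam's actual helper may differ:

```python
import signal

TIMEOUT_SECS = 60  # matches the "Timed out after 60 seconds" message above


def install_timeout(seconds=TIMEOUT_SECS):
    """Arm a per-test watchdog (sketch; names are assumptions)."""
    def handler(signum, frame):
        # BaseException rather than Exception, so ordinary `except
        # Exception` clauses inside the test body cannot swallow it.
        raise BaseException("Timed out after %d seconds." % seconds)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)


def cancel_timeout():
    """Disarm the watchdog once the test finishes in time."""
    signal.alarm(0)
```

The alarm fires even while the test is blocked inside a gRPC wait, which is why the traceback above unwinds through `grpc/_common.py` and `threading.py` before reaching the handler.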

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================
Traceback (most recent call last):

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575913816.41_e7496de5-5756-4596-b053-c4a306ceff8b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 140431663650560)>
# Thread: <Thread(Thread-119, started daemon 140431655257856)>
# Thread: <_MainThread(MainThread, started 140432443389696)>

----------------------------------------------------------------------
Ran 38 tests in 263.596s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 55s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/lb743pq37khum

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1732

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1732/display/redirect?page=changes>

Changes:

[github] Changing RowAsDictJsonCoder implementation for efficiency (#10300)

[github] Merge pull request #10151: [BEAM-7116] Remove use of KV in Schema


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 16:58:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f on Spark master local
19/12/09 16:58:31 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
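The GroupNonMergingWindowsFunctions warning above is about determinism: Spark groups elements by their encoded key bytes, and that matches grouping by key equality only when the coder is "consistent with equals", i.e. equal values always encode to identical bytes. A toy illustration of why (hypothetical helper, not Beam's API):

```python
def group_by_encoded_key(pairs, encode):
    """Group (key, value) pairs by the encoded form of the key, the way a
    runner shuffles by key bytes. This is correct only when `encode` maps
    equal keys to identical bytes (a coder consistent with equals)."""
    groups = {}
    for key, value in pairs:
        groups.setdefault(encode(key), []).append(value)
    return list(groups.values())
```

If the coder emits different bytes for equal keys, equal keys land in different groups, which is the correctness issue the warning hedges about ("might cause issues on some runners").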
19/12/09 16:58:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f: Pipeline translated successfully. Computing outputs
19/12/09 16:58:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40919
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:33 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:33 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:33 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36993.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39161.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:33 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46663
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:33 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:33 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37491
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:34 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:34 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:34 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33405.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41857.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46185
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35791
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:34 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:34 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:34 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38381.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44513.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38493
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:34 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:34 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:34 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39593
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:35 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:35 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:35 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44567.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35541.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:35 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41561
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:35 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:35 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33543
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 16:58:36 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 16:58:36 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575910710.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40169', 'job_port': u'0'}
19/12/09 16:58:36 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45585.
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:36269.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 16:58:36 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36271
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 16:58:36 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 16:58:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 16:58:36 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 16:58:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 16:58:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 16:58:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 16:58:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575910710.89_cf097cfe-16f5-42ec-8002-4d9af9623a9f finished.
19/12/09 16:58:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 16:58:36 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_26fbcb58-6bf1-428b-9821-76f67e85271a","basePath":"/tmp/sparktestYC2f7d"}: {}
java.io.FileNotFoundException: /tmp/sparktestYC2f7d/job_26fbcb58-6bf1-428b-9821-76f67e85271a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575910703.23_f383ba58-4745-4b22-ac38-2d7365a9bd69 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140595631769344)>

# Thread: <Thread(Thread-120, started daemon 140595284670208)>

# Thread: <_MainThread(MainThread, started 140596411008768)>

----------------------------------------------------------------------
Ran 38 tests in 286.679s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 19s
60 actionable tasks: 56 executed, 4 from cache

Publishing build scan...
https://scans.gradle.com/s/jzxwv7uqdgftq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1731

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1731/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-8895] Add BigQuery table name sanitization to BigQueryIOIT

[michal.walenia] [BEAM-8918] Split batch BQIOIT into avro and json using tests


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36339
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:14 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:14 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:14 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38649.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46525.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:14 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46527
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:14 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:14 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40829
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:15 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:15 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:15 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37033.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34659.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:15 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45653
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:15 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:15 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46305
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:16 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:16 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:16 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33491.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39537.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:16 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46153
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:16 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:16 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:16 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41155
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:53:17 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:53:17 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575895992.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35417', 'job_port': u'0'}
19/12/09 12:53:17 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37345.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38467.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:53:17 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46013
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:53:17 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:53:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:53:17 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:53:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:53:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:53:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:53:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575895992.03_c97f5a38-9831-429e-90aa-52ea4b57eb68 finished.
19/12/09 12:53:17 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 12:53:17 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_19057db3-6693-4158-bb4c-e570b713686b","basePath":"/tmp/sparktestI3AjBJ"}: {}
java.io.FileNotFoundException: /tmp/sparktestI3AjBJ/job_19057db3-6693-4158-bb4c-e570b713686b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
==================== Timed out after 60 seconds. ====================
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.


# Thread: <Thread(wait_until_finish_read, started daemon 140595502864128)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
# Thread: <Thread(Thread-119, started daemon 140595494471424)>
# Thread: <_MainThread(MainThread, started 140596290995968)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140595007579904)>
# Thread: <Thread(Thread-125, started daemon 140595015972608)>
# Thread: <Thread(Thread-119, started daemon 140595494471424)>
# Thread: <Thread(wait_until_finish_read, started daemon 140595502864128)>
# Thread: <_MainThread(MainThread, started 140596290995968)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575895983.48_ea2a82bc-014e-4c72-9363-80a6c5c6ce41 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 312.712s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 14s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/kszdjin5vdoww

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1730

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1730/display/redirect>

Changes:


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 12:13:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc on Spark master local
19/12/09 12:13:50 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/09 12:13:50 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc: Pipeline translated successfully. Computing outputs
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41237
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:50 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:50 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:50 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37729.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45755.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:50 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45137
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:50 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46495
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:51 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:51 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:51 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41469.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37459.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:51 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41015
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:51 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44165
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:52 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:52 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:52 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35463.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35545.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:52 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40121
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:52 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33959
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:53 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:53 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:53 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43591.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34465.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:53 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43919
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:53 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46121
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 12:13:54 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 12:13:54 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575893629.23', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43627', 'job_port': u'0'}
19/12/09 12:13:54 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44817.
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38255.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 12:13:54 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34501
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:54 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 12:13:54 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 12:13:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 12:13:54 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575893629.23_eb025089-2f52-4119-bc90-75f4501347dc finished.
19/12/09 12:13:54 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 12:13:54 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_edf0ba45-d07a-4d5a-9618-2a573fb5acc8","basePath":"/tmp/sparktestOPI8ZM"}: {}
java.io.FileNotFoundException: /tmp/sparktestOPI8ZM/job_edf0ba45-d07a-4d5a-9618-2a573fb5acc8/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139980113262336)>

# Thread: <Thread(Thread-120, started daemon 139980381812480)>

# Thread: <_MainThread(MainThread, started 139980900333312)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575893621.28_e8f13f82-1cc0-400a-903a-80a1d1530c27 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 274.506s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/2bqu2v23gbm7c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1729

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1729/display/redirect>

Changes:


------------------------------------------
[...truncated 1.54 MB...]
19/12/09 06:12:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58 on Spark master local
19/12/09 06:12:58 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/09 06:12:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58: Pipeline translated successfully. Computing outputs
19/12/09 06:12:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35495
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:12:59 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:12:59 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:12:59 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36409.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:36613.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:12:59 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42563
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:12:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:12:59 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:12:59 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:12:59 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:12:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:12:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46503
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:00 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:00 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:00 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40351.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33413.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:00 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33735
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:00 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:00 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:00 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46761
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:01 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:01 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:01 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33025.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43703.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:01 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33801
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:01 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:01 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:01 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34999
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40177.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42649.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39097
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46309
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 06:13:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 06:13:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 06:13:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575871977.89', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54893', 'job_port': u'0'}
19/12/09 06:13:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44493.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 06:13:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39143.
19/12/09 06:13:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 06:13:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37615
19/12/09 06:13:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 06:13:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 06:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 06:13:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 06:13:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 06:13:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 06:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 06:13:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575871977.89_8789b5f2-f332-457c-8345-3c8b23e54d58 finished.
19/12/09 06:13:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 06:13:03 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_018d9b40-6542-4f5d-8665-7f436242bd62","basePath":"/tmp/sparktestP9l80U"}: {}
java.io.FileNotFoundException: /tmp/sparktestP9l80U/job_018d9b40-6542-4f5d-8665-7f436242bd62/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140230031533824)>

# Thread: <Thread(Thread-119, started daemon 140230039926528)>

# Thread: <_MainThread(MainThread, started 140230828058368)>
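The "Timed out after 60 seconds." banner, the "# Thread: <...>" listing, and the final BaseException above all come from a test-harness watchdog (the `handler` frame at portable_runner_test.py line 75 in the traceback). As a minimal, hypothetical sketch of that pattern — a SIGALRM handler that dumps the live threads and then raises out of whatever frame the main thread is blocked in (here, grpc's wait loop) — and not Beam's actual implementation:

```python
import signal
import sys
import threading


def install_watchdog(seconds):
    """Raise BaseException in the main thread after `seconds` seconds.

    Sketch of the timeout mechanism seen in this log; the names and
    details here are assumptions, not Beam's real code.
    """
    msg = 'Timed out after %d seconds.' % seconds

    def handler(signum, frame):
        # Dump threads that are still alive, like the interleaved
        # "# Thread: <...>" lines in the console output above.
        print('=' * 20 + ' ' + msg + ' ' + '=' * 20, file=sys.stderr)
        for t in threading.enumerate():
            print('# Thread: %r' % t, file=sys.stderr)
        # Raising here unwinds the main thread from wherever it was
        # blocked, which is why the tracebacks end inside grpc's
        # _common.wait rather than in the test body itself.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)


def cancel_watchdog():
    signal.alarm(0)  # disarm the pending alarm
```

Because the handler runs in the main thread, the raised BaseException threads its way through whatever call stack the test was sitting in, producing the interleaved traceback/thread-dump output seen here.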

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575871969.54_dd5ea035-13c9-4976-a184-9f94407f49a7 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 297.799s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/wzutcuyg3z7so

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1728

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1728/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34137
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:42 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:42 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:42 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43567.
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33645.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:42 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41005
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:42 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:42 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:42 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40833
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:43 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:43 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:43 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41901.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44979.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:43 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34941
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:43 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:43 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:43 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33131
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:44 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:44 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:44 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35059.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46667.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:44 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41459
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:44 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:44 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:44 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36275
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/09 00:12:45 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/09 00:12:45 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575850359.75', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57041', 'job_port': u'0'}
19/12/09 00:12:45 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46437.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35153.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/09 00:12:45 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34015
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/09 00:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:45 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/09 00:12:45 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/09 00:12:45 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/09 00:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/09 00:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/09 00:12:45 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575850359.75_2fde8fc6-9f14-4730-8244-e1b350629713 finished.
19/12/09 00:12:45 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/09 00:12:45 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_e82fa660-183a-4189-9915-1747c8f5b470","basePath":"/tmp/sparktestv2Lc9c"}: {}
java.io.FileNotFoundException: /tmp/sparktestv2Lc9c/job_e82fa660-183a-4189-9915-1747c8f5b470/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140220864980736)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(Thread-115, started daemon 140220848195328)>

# Thread: <_MainThread(MainThread, started 140221644719872)>

# Thread: <Thread(wait_until_finish_read, started daemon 140220839802624)>

# Thread: <Thread(Thread-121, started daemon 140220831409920)>

# Thread: <Thread(wait_until_finish_read, started daemon 140220864980736)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575850351.13_e116f408-4977-49cc-b93b-81ef9d4450ff failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 333.530s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 26s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ozswi6dgw6dyq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1727

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1727/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/08 18:22:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37313
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39133.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41259.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36747
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34205
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43965.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43973.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35863
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34991
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43483.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38133.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40907
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:32841
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 18:22:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 18:22:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575829320.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46469', 'job_port': u'0'}
19/12/08 18:22:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39443.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39489.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 18:22:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33279
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 18:22:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 18:22:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 18:22:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 18:22:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 18:22:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 18:22:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575829320.32_0df509fe-6c0d-492a-9932-d08df59871c2 finished.
19/12/08 18:22:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 18:22:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_17233b3d-973c-48b1-ac71-d98d3b7dc085","basePath":"/tmp/sparktestBWmh4O"}: {}
java.io.FileNotFoundException: /tmp/sparktestBWmh4O/job_17233b3d-973c-48b1-ac71-d98d3b7dc085/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140073030014720)>

# Thread: <Thread(Thread-118, started daemon 140073021622016)>

# Thread: <_MainThread(MainThread, started 140073809753856)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140072996443904)>

# Thread: <Thread(Thread-124, started daemon 140073004836608)>

# Thread: <_MainThread(MainThread, started 140073809753856)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575829312.25_948cde46-6c94-43be-a48e-681386c6a93d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 297.515s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 28s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/nx25marmktgls

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1726

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1726/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38399
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:02 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:02 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:02 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40339.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:45805.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:02 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45337
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:02 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:02 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:02 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40855
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38645.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46093.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37971
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38295
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43681.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38131.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37679
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45865
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 12:14:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 12:14:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575807239.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34663', 'job_port': u'0'}
19/12/08 12:14:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42457.
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:36157.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 12:14:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:44057
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 12:14:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 12:14:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 12:14:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 12:14:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 12:14:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 12:14:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 12:14:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575807239.54_9fec38bf-aa74-45f2-bc0b-703d73146d76 finished.
19/12/08 12:14:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 12:14:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_0246cfcd-eb11-4757-9fba-cb3307f76a80","basePath":"/tmp/sparktest5ZyTYL"}: {}
java.io.FileNotFoundException: /tmp/sparktest5ZyTYL/job_0246cfcd-eb11-4757-9fba-cb3307f76a80/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140172986742528)>

  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
# Thread: <Thread(Thread-118, started daemon 140172978349824)>

# Thread: <_MainThread(MainThread, started 140173766481664)>
==================== Timed out after 60 seconds. ====================

    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
# Thread: <Thread(wait_until_finish_read, started daemon 140172490434304)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(Thread-124, started daemon 140172482041600)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-118, started daemon 140172978349824)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
# Thread: <_MainThread(MainThread, started 140173766481664)>

# Thread: <Thread(wait_until_finish_read, started daemon 140172986742528)>
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575807231.22_bd303a65-7058-424f-8826-ee7f6036d19b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 318.714s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 10s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/2sycxr64ilxo2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1725

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1725/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34663
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:29 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:29 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:29 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41547.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40901.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:29 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39839
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:29 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:29 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34029
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:31 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:31 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:31 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43189.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43093.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:31 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37945
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:31 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36469
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:31 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:31 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:31 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43227.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40105.
19/12/08 06:13:31 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:31 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40195
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:32 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35251
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 06:13:32 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 06:13:32 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575785607.19', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50757', 'job_port': u'0'}
19/12/08 06:13:32 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34203.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44549.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 06:13:32 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45315
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:32 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 06:13:32 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 06:13:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 06:13:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 06:13:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 06:13:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575785607.19_b22baeb9-aa6d-49de-90f7-74a21c6d001e finished.
19/12/08 06:13:33 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 06:13:33 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_618d0c82-712b-49f7-9f11-3830839d584f","basePath":"/tmp/sparktestB1VYcM"}: {}
java.io.FileNotFoundException: /tmp/sparktestB1VYcM/job_618d0c82-712b-49f7-9f11-3830839d584f/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
==================== Timed out after 60 seconds. ====================
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)

# Thread: <Thread(wait_until_finish_read, started daemon 140191461324544)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-119, started daemon 140191469717248)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 140192257849088)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140191443752704)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(Thread-125, started daemon 140191452145408)>

# Thread: <Thread(Thread-119, started daemon 140191469717248)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))

# Thread: <_MainThread(MainThread, started 140192257849088)>
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(wait_until_finish_read, started daemon 140191461324544)>
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575785598.81_35b40348-39b8-45d6-82a8-555efa56af67 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 316.984s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/yrykbxlrcxaj2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1724

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1724/display/redirect>

Changes:


------------------------------------------
[...truncated 1.54 MB...]
19/12/08 00:12:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8 on Spark master local
19/12/08 00:12:29 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/08 00:12:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8: Pipeline translated successfully. Computing outputs
19/12/08 00:12:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:43591
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:30 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:30 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:30 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42985.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38665.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35009
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33805
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:30 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:30 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:30 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37689.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43347.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34991
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:30 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:30 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:30 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45365
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:31 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:31 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:31 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33273.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33839.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:31 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43949
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:31 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:31 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:31 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39927
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:32 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:32 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:32 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40237.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44083.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:32 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33497
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:32 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:32 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:32 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44863
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/08 00:12:33 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/08 00:12:33 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575763948.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46801', 'job_port': u'0'}
19/12/08 00:12:33 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39803.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41221.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/08 00:12:33 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40153
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/08 00:12:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:33 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/08 00:12:33 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/08 00:12:33 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/08 00:12:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/08 00:12:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/08 00:12:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575763948.47_a44f523c-f713-44d0-9242-c7c1db0dd3a8 finished.
19/12/08 00:12:33 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/08 00:12:33 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_13eaa89c-93bd-43c0-a91d-be5dff023556","basePath":"/tmp/sparktestnQuZPt"}: {}
java.io.FileNotFoundException: /tmp/sparktestnQuZPt/job_13eaa89c-93bd-43c0-a91d-be5dff023556/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140248801605376)>

# Thread: <Thread(Thread-119, started daemon 140249285326592)>

# Thread: <_MainThread(MainThread, started 140250073458432)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575763940.79_a0303fdb-81c3-4f00-ab37-f25febb2ee7e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 278.358s

FAILED (errors=2, skipped=9)
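The `Timed out after 60 seconds.` banner and thread dump interleaved with the first traceback come from a test watchdog: a SIGALRM handler that prints the live threads and then raises BaseException, which escapes broad `except Exception:` clauses in the code under test. A rough stdlib-only sketch of that pattern (illustrative names, not Beam's actual helper in portable_runner_test.py):

```python
import signal
import threading


def install_timeout(seconds):
    # Arm a one-shot watchdog: after `seconds`, dump live threads and abort.
    # Illustrative sketch of the pattern behind the log output above.
    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % seconds
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            print('# Thread: %r' % t)
        # BaseException (not Exception) so ordinary `except Exception:`
        # blocks in the code under test cannot swallow the timeout.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)  # fires once, `seconds` from now
```

Because the handler runs on the main thread while worker threads keep printing, its output can interleave with an in-flight traceback, which is exactly what the scrambled lines above show. Unix-only: `signal.alarm` is unavailable on Windows.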

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 3s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/e34banjsdqosa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1723

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1723/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35595
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:35 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:35 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:35 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35353.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:35015.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:35 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:44811
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:35 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:35 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:35 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46307
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:36 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:36 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:36 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43265.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34127.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:36 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38417
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:36 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:36 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:36 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44457
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:37 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:37 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:37 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36425.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38985.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:37 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41099
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:37 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:37 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:37 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:37845
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 18:13:38 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 18:13:38 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575742412.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52473', 'job_port': u'0'}
19/12/07 18:13:38 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:44141.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42213.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 18:13:38 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46073
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:38 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 18:13:38 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 18:13:38 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 18:13:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 18:13:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 18:13:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575742412.7_fc0501f1-bbf7-40e4-b4cd-88244992b9ac finished.
19/12/07 18:13:38 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 18:13:38 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a00a2054-dd34-473a-ab28-439051d421b3","basePath":"/tmp/sparktest0dw0n8"}: {}
java.io.FileNotFoundException: /tmp/sparktest0dw0n8/job_a00a2054-dd34-473a-ab28-439051d421b3/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139860845262592)>

# Thread: <Thread(Thread-119, started daemon 139860828477184)>

# Thread: <_MainThread(MainThread, started 139861963613952)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139860811691776)>

# Thread: <Thread(Thread-125, started daemon 139860820084480)>

# Thread: <_MainThread(MainThread, started 139861963613952)>

# Thread: <Thread(Thread-119, started daemon 139860828477184)>

# Thread: <Thread(wait_until_finish_read, started daemon 139860845262592)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575742403.89_e054d60e-1537-4859-beca-9b0524632a3d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 317.608s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/kq6cuqisvewgu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1722

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1722/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39001
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:50 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:50 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:50 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:36209.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42035.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:50 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38441
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:50 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:50 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:50 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:42207
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:51 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:51 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:51 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:38437.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37521.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:51 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42193
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:51 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:51 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:51 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33479
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:52 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:52 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:52 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46247.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38005.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:52 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41935
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:52 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:52 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:52 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:52 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38287
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 12:13:53 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 12:13:53 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575720827.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49403', 'job_port': u'0'}
19/12/07 12:13:53 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:37181.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42243.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 12:13:53 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43893
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 12:13:53 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 12:13:53 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 12:13:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 12:13:53 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575720827.74_7af050f8-6e26-4784-a521-52bfbbe456a6 finished.
19/12/07 12:13:53 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 12:13:53 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_7cf21e16-f232-4285-88b1-59a02824a514","basePath":"/tmp/sparktestgrspuj"}: {}
java.io.FileNotFoundException: /tmp/sparktestgrspuj/job_7cf21e16-f232-4285-88b1-59a02824a514/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139823296952064)>

# Thread: <Thread(Thread-119, started daemon 139823288559360)>

# Thread: <_MainThread(MainThread, started 139824085083904)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139823279904512)>

# Thread: <Thread(Thread-125, started daemon 139823197189888)>

# Thread: <Thread(Thread-119, started daemon 139823288559360)>

# Thread: <Thread(wait_until_finish_read, started daemon 139823296952064)>

# Thread: <_MainThread(MainThread, started 139824085083904)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575720819.74_0b1896e5-89ab-4af0-9d70-c9e6067b7531 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
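The RuntimeError above is the portable runner surfacing a terminal FAILED job state through `wait_until_finish`. A simplified sketch of that behavior, using a hypothetical stand-in class rather than Beam's actual `PipelineResult`:

```python
class FakePipelineResult:
    """Illustrative stand-in for a portable-runner job result."""

    def __init__(self, job_id, state, last_error_message):
        self._job_id = job_id
        self._state = state
        self._last_error_message = last_error_message

    def wait_until_finish(self):
        # A terminal FAILED state is reported with the job id and the
        # runner-side error, mirroring the message format in this log.
        if self._state == 'FAILED':
            raise RuntimeError(
                'Pipeline %s failed in state %s: %s' % (
                    self._job_id, self._state, self._last_error_message))
        return self._state
```

Here the runner-side error it relays is the `UnsupportedOperationException`: the Spark portable runner has no bundle checkpoint handler registered, so an SDF that tries to checkpoint for watermark tracking fails.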

----------------------------------------------------------------------
Ran 38 tests in 316.753s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 16s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/yj2lnh5iyppyi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1721

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1721/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46745
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:09 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:09 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:09 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33253.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38019.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:09 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34831
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:09 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:09 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:09 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38141
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:10 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:10 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:10 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42267.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:33823.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:10 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34989
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:10 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:10 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:10 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44487
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:11 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:11 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:11 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40479.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42403.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:11 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33487
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:11 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:11 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:11 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39369
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/07 06:13:12 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/07 06:13:12 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575699186.54', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59867', 'job_port': u'0'}
19/12/07 06:13:12 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:35437.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:42401.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/07 06:13:12 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42495
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 06:13:12 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 06:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 06:13:12 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 06:13:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 06:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 06:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 06:13:12 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575699186.54_9d7f8f37-7ef8-4d49-b0e8-dd9c48af57d3 finished.
19/12/07 06:13:12 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 06:13:12 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_91f55744-e8ab-477e-b75e-878b3f3179bf","basePath":"/tmp/sparktestRu7hjr"}: {}
java.io.FileNotFoundException: /tmp/sparktestRu7hjr/job_91f55744-e8ab-477e-b75e-878b3f3179bf/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
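The FileNotFoundException above is a benign race: this job staged no artifacts ("GetManifest for __no_artifacts_staged__" earlier in the log), so no MANIFEST was ever written, yet cleanup still tries to read one. A hedged Python sketch of cleanup that treats a missing manifest as "nothing to do" (hypothetical helper, not Beam's Java implementation):

```python
import errno
import os


def remove_staged_artifacts(staging_dir):
    """Best-effort cleanup of an artifact staging directory.

    When no artifacts were staged, the MANIFEST file never exists; that
    case is treated as success rather than surfaced as a failure.
    """
    manifest_path = os.path.join(staging_dir, 'MANIFEST')
    try:
        with open(manifest_path) as manifest:
            artifact_names = [line.strip() for line in manifest if line.strip()]
    except (IOError, OSError) as e:
        if e.errno == errno.ENOENT:
            return  # nothing was staged; nothing to remove
        raise
    for name in artifact_names:
        path = os.path.join(staging_dir, name)
        if os.path.exists(path):
            os.remove(path)
    os.remove(manifest_path)
```

The warning is harmless here because the job itself already finished; only the staging-directory cleanup hit the missing file.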
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140490094511872)>

# Thread: <Thread(Thread-120, started daemon 140490077726464)>

# Thread: <_MainThread(MainThread, started 140490874251008)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140489453991680)>

# Thread: <Thread(Thread-126, started daemon 140490068547328)>

# Thread: <Thread(Thread-120, started daemon 140490077726464)>

# Thread: <Thread(wait_until_finish_read, started daemon 140490094511872)>

# Thread: <_MainThread(MainThread, started 140490874251008)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575699178.11_3d5af02c-f5b7-4581-91c6-f92e3dcdbf7b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 306.941s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 51s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/bsxs7pcdbeq2q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1720

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1720/display/redirect>

Changes:


------------------------------------------
[...truncated 1.55 MB...]

19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/07 00:53:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 00:53:28 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/07 00:53:28 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/07 00:53:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/07 00:53:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/07 00:53:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/07 00:53:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575680002.3_4f7f617c-b704-4dcd-8112-b156f7fbdd45 finished.
19/12/07 00:53:28 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/07 00:53:28 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9ed91e65-dba7-4e50-a2b7-81b49249abce","basePath":"/tmp/sparktestp2mwwi"}: {}
java.io.FileNotFoundException: /tmp/sparktestp2mwwi/job_9ed91e65-dba7-4e50-a2b7-81b49249abce/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
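The FileNotFoundException above is a benign cleanup failure: the job service tries to read a MANIFEST that was never staged before removing the staging directory. As a rough illustration only (hypothetical helper name, not the Beam implementation or its fix), a cleanup that tolerates the missing manifest might look like:

```python
import os
import shutil


def remove_staging_dir(base_path, session_id):
    """Best-effort cleanup of a job staging directory.

    If the MANIFEST was never written (nothing was staged), treat the
    directory as already clean instead of raising FileNotFoundError.
    """
    staging_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(staging_dir, "MANIFEST")
    if not os.path.exists(manifest):
        return  # nothing was staged; nothing to remove
    shutil.rmtree(staging_dir, ignore_errors=True)
```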
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
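The last frame of the traceback shows where the timeout comes from: a watchdog `handler` in portable_runner_test.py (line 75) raises `BaseException(msg)` when the test exceeds its budget. A minimal sketch of that pattern (hypothetical function name; the actual harness wiring may differ) is a one-shot SIGALRM handler:

```python
import signal


def install_timeout(seconds, msg="Timed out after %d seconds."):
    """Arm a SIGALRM watchdog that raises after `seconds`.

    Raising BaseException (not Exception) is deliberate: it escapes
    broad `except Exception` clauses that would otherwise swallow the
    timeout and leave a hung test running forever.
    """
    def handler(signum, frame):
        raise BaseException(msg % seconds)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)  # one-shot alarm; cancel with signal.alarm(0)
```

This is Unix-only (`SIGALRM` is unavailable on Windows), and the alarm should be cancelled with `signal.alarm(0)` once the test finishes normally.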

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140112964871936)>

# Thread: <Thread(Thread-118, started daemon 140112956479232)>

# Thread: <_MainThread(MainThread, started 140113949738752)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140112939693824)>

# Thread: <Thread(Thread-124, started daemon 140112931301120)>

# Thread: <_MainThread(MainThread, started 140113949738752)>

# Thread: <Thread(Thread-118, started daemon 140112956479232)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140112964871936)>

# Thread: <_MainThread(MainThread, started 140113949738752)>
======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 254, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/pipeline.py", line 412, in run
    if test_runner_api and self._verify_runner_api_compatible():
  File "apache_beam/pipeline.py", line 625, in _verify_runner_api_compatible
    self.visit(Visitor())
  File "apache_beam/pipeline.py", line 457, in visit
    self._root_transform().visit(visitor, self, visited)
  File "apache_beam/pipeline.py", line 850, in visit
    part.visit(visitor, pipeline, visited)
  File "apache_beam/pipeline.py", line 850, in visit
    part.visit(visitor, pipeline, visited)
  File "apache_beam/pipeline.py", line 850, in visit
    part.visit(visitor, pipeline, visited)
  File "apache_beam/pipeline.py", line 853, in visit
    visitor.visit_transform(self)
  File "apache_beam/pipeline.py", line 616, in visit_transform
    enable_trace=False),
  File "apache_beam/internal/pickler.py", line 250, in dumps
    s = dill.dumps(o)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py>", line 265, in dumps
    dump(obj, file, protocol, byref, fmode, recurse, **kwds)#, strictio)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py>", line 259, in dump
    Pickler(file, protocol, **_kwds).dump(obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py>", line 445, in dump
    StockPickler.dump(self, obj)
  File "/usr/lib/python2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 425, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py>", line 912, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 687, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 425, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py>", line 912, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 687, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py>", line 1421, in save_function
    obj.__dict__), obj=obj)
  File "/usr/lib/python2.7/pickle.py", line 401, in save_reduce
    save(args)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 568, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py>", line 912, in save_module_dict
    StockPickler.save_dict(pickler, obj)
  File "/usr/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/usr/lib/python2.7/pickle.py", line 692, in _batch_setitems
    save(v)
  File "/usr/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/usr/lib/python2.7/pickle.py", line 425, in save_reduce
    save(state)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/usr/lib/python2.7/pickle.py", line 554, in save_tuple
    save(element)
  File "/usr/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "apache_beam/internal/pickler.py", line 215, in new_save_module_dict
    return old_save_module_dict(pickler, obj)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/dill/_dill.py>", line 908, in save_module_dict
    log.info("D2: <dict%s" % str(obj.__repr__).split('dict')[-1]) # obj
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575679991.35_1d0d3f27-1b3d-444c-a8dd-a4bce957d299 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 373.771s

FAILED (errors=4, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 16s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/cwt57637tzcys

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1719

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1719/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8835] Stage artifacts to BEAM-PIPELINE dir in zip

[kcweaver] [BEAM-8835] Check for leading slash in zip file paths.


------------------------------------------
[...truncated 1.55 MB...]
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38515
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:53 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:53 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:53 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34251.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34585.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:53 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43731
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:53 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:53 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:53 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:46421
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:54 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:54 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:54 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43927.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38375.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:54 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:45249
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:54 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:54 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:54 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35571
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:55 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:55 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:55 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34805.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40553.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:55 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39035
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:55 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:55 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:55 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40255
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 23:32:56 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 23:32:56 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575675171.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49495', 'job_port': u'0'}
19/12/06 23:32:56 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33531.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39547.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 23:32:56 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33261
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 23:32:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 23:32:56 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 23:32:56 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 23:32:56 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 23:32:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 23:32:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 23:32:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575675171.25_4b70ead1-cbbe-4eec-b1f8-ecf7677c4c24 finished.
19/12/06 23:32:56 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 23:32:56 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9f770abf-81b2-49ea-9437-83927ccdcd6b","basePath":"/tmp/sparktestT9s7ew"}: {}
java.io.FileNotFoundException: /tmp/sparktestT9s7ew/job_9f770abf-81b2-49ea-9437-83927ccdcd6b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140529675323136)>
# Thread: <Thread(Thread-119, started daemon 140529683715840)>
# Thread: <_MainThread(MainThread, started 140530463454976)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140529174046464)>
# Thread: <Thread(Thread-125, started daemon 140529182439168)>
# Thread: <_MainThread(MainThread, started 140530463454976)>

# Thread: <Thread(Thread-119, started daemon 140529683715840)>
# Thread: <Thread(wait_until_finish_read, started daemon 140529675323136)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575675162.82_26fe060c-fe29-48fc-b1b7-2d1325f58c83 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 307.973s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 54s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/6wwdhdk25y4ns

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1718

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1718/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8882] Implement Impulse() for BundleBasedRunner.

[robertwb] [BEAM-8882] Make Create fn-api agnostic.

[robertwb] [BEAM-8882] Fully specify types for Create composite.

[robertwb] [BEAM-8882] Make Read fn-api agnostic.

[robertwb] [BEAM-8882] Cleanup always-on use_sdf_bounded_source option.

[robertwb] [BEAM-8882] Annotate ParDo and CombineValues operations with proto

[robertwb] [BEAM-8882] Unconditionally populate pipeline_proto_coder_id.

[robertwb] [BEAM-8882] Fix overly-sensitive tests.

[robertwb] Fix sdf tests from create.

[robertwb] [BEAM-8882] Avoid attaching unrecognized properties.

[robertwb] [BEAM-8882] Accommodations for JRH.

[robertwb] Minor cleanup.


------------------------------------------
[...truncated 1.55 MB...]
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:38201
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:26 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:26 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:26 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33141.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 259-1
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:34331.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:26 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:41267
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:26 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:26 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:26 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35965
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:27 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:27 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:27 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42357.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 260-1
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40239.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:27 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:36575
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:27 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:27 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:27 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:40563
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:28 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:28 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:28 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:34415.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43715.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:28 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:37109
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:28 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:28 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 21:10:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34271
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 21:10:28 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 21:10:28 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 21:10:28 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575666623.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:51251', 'job_port': u'0'}
19/12/06 21:10:29 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:41027.
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:32887.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 21:10:29 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34709
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 21:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:29 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 21:10:29 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 21:10:29 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 21:10:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 21:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 21:10:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575666623.37_559df4e3-7cea-4b5d-b314-dd2017c85472 finished.
19/12/06 21:10:29 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 21:10:29 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6c5c5016-73d5-4c41-b5ca-6e456a6be577","basePath":"/tmp/sparktest9RLLdl"}: {}
java.io.FileNotFoundException: /tmp/sparktest9RLLdl/job_6c5c5016-73d5-4c41-b5ca-6e456a6be577/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 330, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
==================== Timed out after 60 seconds. ====================

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 139621433284352)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-120, started daemon 139621156341504)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <_MainThread(MainThread, started 139621951805184)>
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 139620528420608)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(Thread-126, started daemon 139621146113792)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 502, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
# Thread: <Thread(Thread-120, started daemon 139621156341504)>

# Thread: <_MainThread(MainThread, started 139621951805184)>

# Thread: <Thread(wait_until_finish_read, started daemon 139621433284352)>

  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575666614.49_8ad695cc-bbcd-4006-97ac-8582fc2befb5 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 317.889s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 47s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/3zjrh3xuz3wdg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1717

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1717/display/redirect?page=changes>

Changes:

[github] [BEAM-3865] Stronger trigger tests. (#10192)

[pabloem] Merge pull request #10236 from [BEAM-8335] Add method to

[bhulette] [BEAM-8427] Add MongoDB to SQL documentation (#10273)


------------------------------------------
[...truncated 1.57 MB...]
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45687
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:12 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:12 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:12 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45583.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:41647.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:12 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:35997
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:12 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:12 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:12 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:44189
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:13 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:13 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:13 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:42135.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:38083.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:13 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34647
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:13 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:13 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:13 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:39045
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:14 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:14 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:14 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:39675.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:40781.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:14 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39433
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:14 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:14 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:14 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:33019
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 19:40:15 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 19:40:15 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575661209.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52911', 'job_port': u'0'}
19/12/06 19:40:15 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46585.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43997.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 19:40:15 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:40133
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 19:40:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:15 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 19:40:15 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 19:40:15 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 19:40:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 19:40:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 19:40:15 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575661209.51_72651fdb-c511-4310-9cda-6ee9d11c3974 finished.
19/12/06 19:40:15 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 19:40:15 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d6df69a8-2454-4191-869e-a94145a1f196","basePath":"/tmp/sparktesttmujUC"}: {}
java.io.FileNotFoundException: /tmp/sparktesttmujUC/job_d6df69a8-2454-4191-869e-a94145a1f196/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140075123865344)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(Thread-120, started daemon 140075115472640)>

# Thread: <_MainThread(MainThread, started 140076255598336)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140075107079936)>

# Thread: <Thread(Thread-126, started daemon 140075098687232)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-120, started daemon 140075115472640)>

# Thread: <_MainThread(MainThread, started 140076255598336)>

# Thread: <Thread(wait_until_finish_read, started daemon 140075123865344)>
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575661198.63_d2f5041d-0931-4bd6-9b40-1f831db94f79 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 356.585s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 57s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/cpdwevcwp6el2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1716

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1716/display/redirect>

Changes:


------------------------------------------
[...truncated 1.57 MB...]
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:34209
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:17 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:17 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
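The "Discarding unparseable args" warning above comes from option parsing that keeps only the flags it recognizes and drops the rest. A rough illustration using argparse's `parse_known_args` (hypothetical sketch, not Beam's actual `PipelineOptions` code; the flag names are taken from the log line):

```python
import argparse

# Parser that knows only a subset of the flags passed on the command line.
parser = argparse.ArgumentParser()
parser.add_argument('--job_endpoint')

# parse_known_args returns (parsed_namespace, leftover_args) instead of
# erroring out on flags it does not recognize.
known, unknown = parser.parse_known_args(
    ['--job_endpoint=localhost:56045', '--spark_master=local', '--options_id=30'])
# `unknown` collects the unrecognized flags, which a caller can log and drop.
print(unknown)  # → ['--spark_master=local', '--options_id=30']
```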
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:17 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33981.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43969.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:17 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:42679
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:17 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:17 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:17 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:41861
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:18 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:18 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:18 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45809.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46001.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:18 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39725
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:18 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:18 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:18 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35441
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:19 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:19 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:19 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:43757.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44767.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:19 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:34139
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:19 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:19 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:19 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:45767
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 18:30:20 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 18:30:20 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575657014.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56045', 'job_port': u'0'}
19/12/06 18:30:20 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45905.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:39533.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 18:30:20 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:43563
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 18:30:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:20 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 18:30:20 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 18:30:20 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 18:30:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 18:30:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 18:30:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575657014.84_299a4f63-3b48-4938-89ac-726af1f2d688 finished.
19/12/06 18:30:20 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 18:30:20 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d0d69e61-f0f8-49ac-8496-e0008815fd60","basePath":"/tmp/sparktesthDBrAK"}: {}
java.io.FileNotFoundException: /tmp/sparktesthDBrAK/job_d0d69e61-f0f8-49ac-8496-e0008815fd60/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139855050311424)>
# Thread: <Thread(Thread-119, started daemon 139855058704128)>
# Thread: <_MainThread(MainThread, started 139855838443264)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139855024346880)>
# Thread: <Thread(Thread-125, started daemon 139855033001728)>
# Thread: <_MainThread(MainThread, started 139855838443264)>
# Thread: <Thread(Thread-119, started daemon 139855058704128)>
# Thread: <Thread(wait_until_finish_read, started daemon 139855050311424)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575657005.37_c664e3af-7229-470d-9571-49c7fa82b667 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 313.602s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 41s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/zgdbqgzvduc2u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1715

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1715/display/redirect?page=changes>

Changes:

[github] [BEAM-8882] Fully populate log messages. (#10292)


------------------------------------------
[...truncated 1.57 MB...]
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36453
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:03 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:03 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:03 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:40669.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:44081.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:03 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:38891
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:03 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:03 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:03 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:35541
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:04 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:45303.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:46073.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:04 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:33657
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:04 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:04 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 16:44:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:32887
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:04 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:04 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:04 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:33489.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:37463.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:39629
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:106: Logging handler created.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:88: Status HTTP server running at localhost:36069
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:132: semi_persistent_directory: /tmp
19/12/06 16:44:05 WARN <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:213: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 16:44:05 WARN apache_beam/options/pipeline_options.py:268: Discarding unparseable args: [u'--app_name=test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:144: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575650640.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53481', 'job_port': u'0'}
19/12/06 16:44:05 INFO apache_beam/runners/worker/statecache.py:137: Creating state cache with size 0
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:71: Creating insecure control channel for localhost:46315.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:79: Control channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:96: Initializing SDKHarness with unbounded number of workers.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:449: Creating insecure state channel for localhost:43601.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:456: State channel established.
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:354: Creating client data channel for localhost:46643
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:122: No more requests from control plane
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:123: SDK Harness waiting for in-flight requests to complete
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO apache_beam/runners/worker/data_plane.py:376: Closing all cached grpc data channels.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:467: Closing all cached gRPC state handlers.
19/12/06 16:44:05 INFO apache_beam/runners/worker/sdk_worker.py:133: Done consuming work.
19/12/06 16:44:05 INFO <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker_main.py>:157: Python sdk harness exiting.
19/12/06 16:44:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 16:44:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 16:44:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575650640.76_8722b242-e5cb-4621-90cf-5c0dd53bfa11 finished.
19/12/06 16:44:05 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 16:44:05 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_5fb2d848-b25b-4446-a59c-ca30492a1ff3","basePath":"/tmp/sparktestZ5fHLe"}: {}
java.io.FileNotFoundException: /tmp/sparktestZ5fHLe/job_5fb2d848-b25b-4446-a59c-ca30492a1ff3/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139975643948800)>
# Thread: <Thread(Thread-119, started daemon 139975627163392)>
# Thread: <_MainThread(MainThread, started 139976423687936)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139975610115840)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575650631.64_e514e5ff-cb86-4928-bd5b-1691c301aa08 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(Thread-125, started daemon 139975618770688)>
# Thread: <Thread(Thread-119, started daemon 139975627163392)>
# Thread: <Thread(wait_until_finish_read, started daemon 139975643948800)>
# Thread: <_MainThread(MainThread, started 139976423687936)>

----------------------------------------------------------------------
Ran 38 tests in 299.131s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 38s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/nqxdsorp4qtic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1714

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1714/display/redirect?page=changes>

Changes:

[thw] [BEAM-8815] Define the no artifacts retrieval token in proto


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 15:02:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:44801
19/12/06 15:02:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:39 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35543.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 15:02:39 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36629.
19/12/06 15:02:39 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:39027
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:39 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:39 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:39 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 15:02:40 INFO sdk_worker_main.main: Logging handler created.
19/12/06 15:02:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:40181
19/12/06 15:02:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:40 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33239.
19/12/06 15:02:40 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 15:02:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44197.
19/12/06 15:02:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:34911
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 15:02:40 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:40 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 15:02:41 INFO sdk_worker_main.main: Logging handler created.
19/12/06 15:02:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:40339
19/12/06 15:02:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:41 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39205.
19/12/06 15:02:41 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 15:02:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40931.
19/12/06 15:02:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:38511
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 15:02:41 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:41 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 15:02:42 INFO sdk_worker_main.main: Logging handler created.
19/12/06 15:02:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:42647
19/12/06 15:02:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 15:02:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 15:02:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 15:02:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575644557.22', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39481', 'job_port': u'0'}
19/12/06 15:02:42 INFO statecache.__init__: Creating state cache with size 0
19/12/06 15:02:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36887.
19/12/06 15:02:42 INFO sdk_worker.__init__: Control channel established.
19/12/06 15:02:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 15:02:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46071.
19/12/06 15:02:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 15:02:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:46393
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 15:02:42 INFO sdk_worker.run: No more requests from control plane
19/12/06 15:02:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 15:02:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 15:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 15:02:42 INFO sdk_worker.run: Done consuming work.
19/12/06 15:02:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 15:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 15:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 15:02:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575644557.22_32b1c34b-42e2-4712-8a4c-938886205b3a finished.
19/12/06 15:02:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 15:02:42 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d550efff-4571-492d-97fc-c587de27aa8f","basePath":"/tmp/sparktestEq_moi"}: {}
java.io.FileNotFoundException: /tmp/sparktestEq_moi/job_d550efff-4571-492d-97fc-c587de27aa8f/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
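The WARN and FileNotFoundException above are benign: the "GetManifest for __no_artifacts_staged__" lines earlier show that no artifacts were ever staged for this job, so the MANIFEST file the cleanup path tries to open was never written. A minimal Python sketch of the tolerant cleanup (the real code is Java, in BeamFileSystemArtifactStagingService.removeArtifacts; `remove_staging_dir` is an illustrative name, not Beam's API):

```python
import os
import shutil
import tempfile

def remove_staging_dir(base_path, job_token):
    # Treat a missing MANIFEST as "nothing was staged" rather than an error,
    # which avoids the FileNotFoundException seen in the log above.
    staging = os.path.join(base_path, job_token)
    manifest = os.path.join(staging, 'MANIFEST')
    if not os.path.exists(manifest):
        return 'already-clean'  # no artifacts were staged for this job
    shutil.rmtree(staging)
    return 'removed'

base = tempfile.mkdtemp()
# Job that staged nothing (the __no_artifacts_staged__ case):
missing = remove_staging_dir(base, 'job_without_artifacts')
# Job that did stage artifacts and wrote a MANIFEST:
os.makedirs(os.path.join(base, 'job_with_artifacts'))
open(os.path.join(base, 'job_with_artifacts', 'MANIFEST'), 'w').close()
removed = remove_staging_dir(base, 'job_with_artifacts')
```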
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
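A minimal sketch of the kind of SIGALRM watchdog that produces the "Timed out after 60 seconds." BaseException in the traceback above (the `handler` at portable_runner_test.py line 75). `TIMEOUT_SECS` and `run_with_timeout` are illustrative names, not Beam's actual API, and the timeout is shortened for the example:

```python
import signal
import time

TIMEOUT_SECS = 1  # the real suite times out after 60 seconds

def handler(signum, frame):
    # Raise BaseException (not Exception) so the timeout escapes ordinary
    # `except Exception:` blocks inside the code under test.
    raise BaseException('Timed out after %d seconds.' % TIMEOUT_SECS)

def run_with_timeout(fn):
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(TIMEOUT_SECS)  # deliver SIGALRM after TIMEOUT_SECS
    try:
        return fn()
    finally:
        signal.alarm(0)  # cancel any pending alarm on every exit path

caught = None
try:
    run_with_timeout(lambda: time.sleep(5))  # blocks longer than the timeout
except BaseException as e:
    caught = str(e)
```

Because the alarm interrupts a blocking call (here `time.sleep`, in the suite a gRPC stream read), the hang surfaces as a test error instead of wedging the whole build. (SIGALRM is Unix-only.)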

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139883444164352)>

# Thread: <Thread(Thread-118, started daemon 139883452557056)>

# Thread: <_MainThread(MainThread, started 139884578526976)>
==================== Timed out after 60 seconds. ====================

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 139883427378944)>

# Thread: <Thread(Thread-124, started daemon 139883435771648)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-118, started daemon 139883452557056)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 139883444164352)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
# Thread: <_MainThread(MainThread, started 139884578526976)>
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575644548.36_5f789389-1546-47c2-8c0f-2c10dfb6c283 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
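A minimal sketch of the guard that produces the error above: a splittable DoFn tries to checkpoint mid-bundle, but the bundle was constructed without a checkpoint handler, which the portable Spark runner does not yet register. The real ActiveBundle is Java code in the Beam runners; the class and method names below are illustrative only, and Java's UnsupportedOperationException is rendered here as a Python RuntimeError:

```python
class ActiveBundle(object):
    def __init__(self, checkpoint_handler=None):
        # Runners that support SDF self-checkpointing pass a handler that
        # re-schedules the residual; others leave it unset.
        self._checkpoint_handler = checkpoint_handler

    def checkpoint(self, residual):
        if self._checkpoint_handler is None:
            raise RuntimeError(
                'The ActiveBundle does not have a registered bundle '
                'checkpoint handler.')
        self._checkpoint_handler(residual)

# With a handler registered, the residual is handed off for resumption:
residuals = []
ActiveBundle(checkpoint_handler=residuals.append).checkpoint('residual-1')

# Without one, checkpointing fails, as in test_sdf_with_watermark_tracking:
err = None
try:
    ActiveBundle().checkpoint('residual-2')
except RuntimeError as e:
    err = str(e)
```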

----------------------------------------------------------------------
Ran 38 tests in 292.611s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/khdttcvzjbyo4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1713

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1713/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 12:13:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:35033
19/12/06 12:13:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:01 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37445.
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 12:13:01 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44233.
19/12/06 12:13:01 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:37487
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 12:13:01 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:01 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:01 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 12:13:02 INFO sdk_worker_main.main: Logging handler created.
19/12/06 12:13:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:35471
19/12/06 12:13:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:02 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40435.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 12:13:02 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38373.
19/12/06 12:13:02 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:38359
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 12:13:02 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:02 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:02 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 12:13:03 INFO sdk_worker_main.main: Logging handler created.
19/12/06 12:13:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:36059
19/12/06 12:13:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:03 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36479.
19/12/06 12:13:03 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 12:13:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37909.
19/12/06 12:13:03 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:36609
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 12:13:03 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:03 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:03 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 12:13:04 INFO sdk_worker_main.main: Logging handler created.
19/12/06 12:13:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:45571
19/12/06 12:13:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 12:13:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 12:13:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 12:13:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575634379.46', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58649', 'job_port': u'0'}
19/12/06 12:13:04 INFO statecache.__init__: Creating state cache with size 0
19/12/06 12:13:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38375.
19/12/06 12:13:04 INFO sdk_worker.__init__: Control channel established.
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 12:13:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 12:13:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34801.
19/12/06 12:13:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 12:13:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:36575
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 12:13:04 INFO sdk_worker.run: No more requests from control plane
19/12/06 12:13:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 12:13:04 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 12:13:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 12:13:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:04 INFO sdk_worker.run: Done consuming work.
19/12/06 12:13:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 12:13:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 12:13:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 12:13:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575634379.46_c7bc8eb9-7b79-4bc4-9972-ddd0ea5e8eb1 finished.
19/12/06 12:13:04 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 12:13:04 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_03524732-eacf-494a-b758-0737663708e5","basePath":"/tmp/sparktestlqLG_J"}: {}
java.io.FileNotFoundException: /tmp/sparktestlqLG_J/job_03524732-eacf-494a-b758-0737663708e5/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140124411623168)>
# Thread: <Thread(Thread-119, started daemon 140124394837760)>
# Thread: <_MainThread(MainThread, started 140125191362304)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140123903616768)>
# Thread: <Thread(Thread-125, started daemon 140123912009472)>
# Thread: <Thread(Thread-119, started daemon 140124394837760)>
# Thread: <Thread(wait_until_finish_read, started daemon 140124411623168)>
# Thread: <_MainThread(MainThread, started 140125191362304)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575634370.77_c91c082a-36ac-410e-9288-90659c1f0960 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.282s

FAILED (errors=3, skipped=9)
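[Editor's note] The "==================== Timed out after 60 seconds. ====================" banners and the "# Thread: <...>" lines scattered through the tracebacks above come from a test watchdog: the `handler` at portable_runner_test.py line 75 fires after 60 seconds, dumps the live threads, and raises `BaseException(msg)` into whatever the main thread was blocked on (here, the gRPC state stream). A minimal sketch of that kind of watchdog, assuming a Unix main thread; names like `install_timeout` are illustrative, not Beam's actual API:

```python
import signal
import threading
import time
import traceback


def install_timeout(seconds, msg):
    """Arm a SIGALRM watchdog. On expiry it dumps all live threads
    (producing lines like '# Thread: <...>') and raises in the main thread."""
    def handler(signum, frame):
        print("=" * 20 + " " + msg + " " + "=" * 20)
        for t in threading.enumerate():
            print("# Thread: %r" % t)
        traceback.print_stack(frame)
        # Raise BaseException (not Exception) so a broad `except Exception`
        # in the code under test cannot swallow the timeout.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.setitimer(signal.ITIMER_REAL, seconds)
    # Return a canceller so the test can disarm the timer on success.
    return lambda: signal.setitimer(signal.ITIMER_REAL, 0)


if __name__ == "__main__":
    cancel = install_timeout(0.2, "Timed out after 0.2 seconds.")
    try:
        time.sleep(5)  # stands in for blocking on the gRPC state stream
    except BaseException as e:
        print("caught:", e)
    finally:
        cancel()
```

Because the signal is delivered to the main thread, the exception surfaces wherever the main thread happens to be waiting, which is why the `BaseException: Timed out after 60 seconds.` frames above end inside `grpc/_common.py` and `threading.py` rather than in the test body.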

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 35s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/xoj7shvaoc5xm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1712

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1712/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 06:13:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:36343
19/12/06 06:13:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:04 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45557.
19/12/06 06:13:04 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 06:13:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41109.
19/12/06 06:13:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:44495
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:04 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:04 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:04 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 06:13:05 INFO sdk_worker_main.main: Logging handler created.
19/12/06 06:13:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:36869
19/12/06 06:13:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:05 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35999.
19/12/06 06:13:05 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 06:13:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41013.
19/12/06 06:13:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:38953
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:05 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:05 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 06:13:06 INFO sdk_worker_main.main: Logging handler created.
19/12/06 06:13:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:43325
19/12/06 06:13:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:06 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44949.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 06:13:06 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45649.
19/12/06 06:13:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:41315
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:06 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:06 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 06:13:07 INFO sdk_worker_main.main: Logging handler created.
19/12/06 06:13:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:34165
19/12/06 06:13:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 06:13:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 06:13:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 06:13:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575612781.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37549', 'job_port': u'0'}
19/12/06 06:13:07 INFO statecache.__init__: Creating state cache with size 0
19/12/06 06:13:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39567.
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 06:13:07 INFO sdk_worker.__init__: Control channel established.
19/12/06 06:13:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 06:13:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33725.
19/12/06 06:13:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 06:13:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:46585
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/06 06:13:07 INFO sdk_worker.run: No more requests from control plane
19/12/06 06:13:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 06:13:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 06:13:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 06:13:07 INFO sdk_worker.run: Done consuming work.
19/12/06 06:13:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 06:13:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 06:13:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 06:13:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575612781.21_a7ff636c-d605-487d-98d5-925b6eadc0aa finished.
19/12/06 06:13:07 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 06:13:07 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_5900e238-b9e9-4f4c-9664-f57ca949ab78","basePath":"/tmp/sparktestAV69Ci"}: {}
java.io.FileNotFoundException: /tmp/sparktestAV69Ci/job_5900e238-b9e9-4f4c-9664-f57ca949ab78/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140092687054592)>
# Thread: <Thread(Thread-119, started daemon 140092678661888)>
# Thread: <_MainThread(MainThread, started 140093466793728)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140092652173056)>
# Thread: <Thread(Thread-125, started daemon 140092660565760)>
# Thread: <Thread(Thread-119, started daemon 140092678661888)>
# Thread: <_MainThread(MainThread, started 140093466793728)>
# Thread: <Thread(wait_until_finish_read, started daemon 140092687054592)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575612770.33_024dc405-19d1-42e4-83b6-527821825d8c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 310.838s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 48s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/drkdrb4r7ofsc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1711

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1711/display/redirect?page=changes>

Changes:

[dcavazos] [BEAM-7390] Add code snippet for Sample


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 04:11:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:40705
19/12/06 04:11:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:24 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35759.
19/12/06 04:11:24 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 04:11:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42881.
19/12/06 04:11:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:35091
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 04:11:24 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:24 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:24 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 04:11:25 INFO sdk_worker_main.main: Logging handler created.
19/12/06 04:11:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:38703
19/12/06 04:11:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:25 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35411.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 04:11:25 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41051.
19/12/06 04:11:25 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:38473
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 04:11:25 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:25 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 04:11:26 INFO sdk_worker_main.main: Logging handler created.
19/12/06 04:11:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:41785
19/12/06 04:11:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:26 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34471.
19/12/06 04:11:26 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 04:11:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39499.
19/12/06 04:11:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:34769
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 04:11:26 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:26 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 04:11:27 INFO sdk_worker_main.main: Logging handler created.
19/12/06 04:11:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:46431
19/12/06 04:11:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 04:11:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 04:11:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 04:11:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575605481.94', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35249', 'job_port': u'0'}
19/12/06 04:11:27 INFO statecache.__init__: Creating state cache with size 0
19/12/06 04:11:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34043.
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 04:11:27 INFO sdk_worker.__init__: Control channel established.
19/12/06 04:11:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 04:11:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37589.
19/12/06 04:11:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 04:11:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:46277
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 04:11:27 INFO sdk_worker.run: No more requests from control plane
19/12/06 04:11:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 04:11:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 04:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 04:11:27 INFO sdk_worker.run: Done consuming work.
19/12/06 04:11:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 04:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 04:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 04:11:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575605481.94_d73259ad-0335-4aff-abe3-c53837469793 finished.
19/12/06 04:11:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 04:11:27 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_c76589f2-c316-4631-8700-6c8d6af4f466","basePath":"/tmp/sparktestJgovL2"}: {}
java.io.FileNotFoundException: /tmp/sparktestJgovL2/job_c76589f2-c316-4631-8700-6c8d6af4f466/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139846131635968)>

# Thread: <Thread(Thread-118, started daemon 139846140028672)>

# Thread: <_MainThread(MainThread, started 139847121999616)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139846106457856)>

# Thread: <Thread(Thread-124, started daemon 139846114850560)>

# Thread: <Thread(Thread-118, started daemon 139846140028672)>

  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575605471.94_fed4c879-3b71-4d17-80cb-f0a593187bb8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 343.016s

FAILED (errors=3, skipped=9)
# Thread: <_MainThread(MainThread, started 139847121999616)>

# Thread: <Thread(wait_until_finish_read, started daemon 139846131635968)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 28s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/jnq5e2qoft5uc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1710

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1710/display/redirect?page=changes>

Changes:

[pabloem] Reactivating test while preventing timeouts.


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 01:32:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:44509
19/12/06 01:32:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:43 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41993.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 01:32:43 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33509.
19/12/06 01:32:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:44547
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 01:32:43 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:43 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 01:32:44 INFO sdk_worker_main.main: Logging handler created.
19/12/06 01:32:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:43941
19/12/06 01:32:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:44 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39307.
19/12/06 01:32:44 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 01:32:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34719.
19/12/06 01:32:44 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:35213
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 01:32:44 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:44 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:44 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 01:32:45 INFO sdk_worker_main.main: Logging handler created.
19/12/06 01:32:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:38001
19/12/06 01:32:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:45 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39101.
19/12/06 01:32:45 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 01:32:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38075.
19/12/06 01:32:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:35437
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/06 01:32:45 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:45 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 01:32:46 INFO sdk_worker_main.main: Logging handler created.
19/12/06 01:32:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:41533
19/12/06 01:32:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 01:32:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 01:32:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 01:32:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575595960.71', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55489', 'job_port': u'0'}
19/12/06 01:32:46 INFO statecache.__init__: Creating state cache with size 0
19/12/06 01:32:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42749.
19/12/06 01:32:46 INFO sdk_worker.__init__: Control channel established.
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 01:32:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 01:32:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42937.
19/12/06 01:32:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 01:32:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:35241
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 01:32:46 INFO sdk_worker.run: No more requests from control plane
19/12/06 01:32:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 01:32:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 01:32:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 01:32:46 INFO sdk_worker.run: Done consuming work.
19/12/06 01:32:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 01:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 01:32:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 01:32:46 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575595960.71_fe0bf38e-ebf1-48c2-bc54-a93860f22f0c finished.
19/12/06 01:32:46 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 01:32:46 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_2aeba5fd-7e7a-4733-ab9f-79a4263c2f12","basePath":"/tmp/sparktest6aBpvu"}: {}
java.io.FileNotFoundException: /tmp/sparktest6aBpvu/job_2aeba5fd-7e7a-4733-ab9f-79a4263c2f12/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
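The FileNotFoundException above is a cleanup race, not a test failure: no artifacts were staged for this job (note the earlier "GetManifest for __no_artifacts_staged__" lines), so the MANIFEST file the staging service tries to read during removal never existed. A tolerant cleanup would treat a missing manifest as "already cleaned up". The sketch below illustrates that idea in Python; `remove_staging_dir` is a hypothetical helper, not Beam's actual Java service.

```python
import errno
import os
import shutil


def remove_staging_dir(base_path, session_id):
    """Best-effort removal of a job staging directory.

    A missing MANIFEST means nothing was staged, so it is treated as
    already removed instead of surfacing a FileNotFoundException as in
    the stack trace above. (Illustrative sketch, not Beam's code.)
    """
    staging_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(staging_dir, "MANIFEST")
    try:
        with open(manifest) as f:
            f.read()  # a real service would parse artifact locations here
    except IOError as e:
        if e.errno != errno.ENOENT:
            raise
        # No manifest: nothing was staged, nothing to delete from it.
    # Remove whatever remains of the staging directory, if anything.
    shutil.rmtree(staging_dir, ignore_errors=True)
```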
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

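The "Timed out after 60 seconds." failures come from a per-test watchdog in portable_runner_test.py (line 75, `handler`) that raises BaseException from a signal handler when a pipeline hangs. A minimal sketch of that general pattern is below; the names (`run_with_timeout`, `TestTimeout`) are illustrative and not Beam's actual helper, and SIGALRM-based timers only work in the main thread on POSIX systems.

```python
import signal


class TestTimeout(BaseException):
    """Raised by the watchdog when the wrapped call exceeds its budget."""


def run_with_timeout(fn, seconds):
    """Run fn(), raising TestTimeout if it takes longer than `seconds`.

    Illustrative sketch of the watchdog pattern seen in the tracebacks
    above; main-thread only, POSIX only.
    """
    def handler(signum, frame):
        raise TestTimeout("Timed out after %s seconds." % seconds)

    old_handler = signal.signal(signal.SIGALRM, handler)
    signal.setitimer(signal.ITIMER_REAL, seconds)
    try:
        return fn()
    finally:
        # Always disarm the timer and restore the previous handler.
        signal.setitimer(signal.ITIMER_REAL, 0)
        signal.signal(signal.SIGALRM, old_handler)
```

Because the handler raises out of whatever frame is executing (here, grpc's `_common.wait`), the traceback shows the hung wait rather than the watchdog itself.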
======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139858043000576)>
# Thread: <Thread(Thread-119, started daemon 139857682753280)>
# Thread: <_MainThread(MainThread, started 139858822739712)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139857674360576)>
# Thread: <Thread(Thread-125, started daemon 139857665967872)>
# Thread: <_MainThread(MainThread, started 139858822739712)>
# Thread: <Thread(Thread-119, started daemon 139857682753280)>
# Thread: <Thread(wait_until_finish_read, started daemon 139858043000576)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575595950.1_10c6bc26-7849-46c4-98f7-97b60862db9c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 340.678s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 24s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/fujyacowm4com

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1709

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1709/display/redirect?page=changes>

Changes:

[rohde.samuel] fix assert equals_to_per_window to actually assert window's existence

[robertwb] Fix [BEAM-8581] and [BEAM-8582]


------------------------------------------
[...truncated 1.32 MB...]
19/12/06 00:21:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:38575
19/12/06 00:21:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:34 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37103.
19/12/06 00:21:34 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/06 00:21:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38737.
19/12/06 00:21:34 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:37769
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 00:21:34 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:34 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:34 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 00:21:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 00:21:34 INFO sdk_worker_main.main: Logging handler created.
19/12/06 00:21:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:35327
19/12/06 00:21:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:34 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43735.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/06 00:21:35 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41325.
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:35129
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 00:21:35 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:35 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 00:21:35 INFO sdk_worker_main.main: Logging handler created.
19/12/06 00:21:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:40045
19/12/06 00:21:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:35 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39807.
19/12/06 00:21:35 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34131.
19/12/06 00:21:35 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:35547
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 00:21:35 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:35 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/06 00:21:36 INFO sdk_worker_main.main: Logging handler created.
19/12/06 00:21:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:45025
19/12/06 00:21:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/06 00:21:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/06 00:21:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/06 00:21:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575591691.65', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49761', 'job_port': u'0'}
19/12/06 00:21:36 INFO statecache.__init__: Creating state cache with size 0
19/12/06 00:21:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38377.
19/12/06 00:21:36 INFO sdk_worker.__init__: Control channel established.
19/12/06 00:21:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/06 00:21:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38101.
19/12/06 00:21:36 INFO sdk_worker.create_state_handler: State channel established.
19/12/06 00:21:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:42459
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/06 00:21:36 INFO sdk_worker.run: No more requests from control plane
19/12/06 00:21:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/06 00:21:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:36 INFO data_plane.close: Closing all cached grpc data channels.
19/12/06 00:21:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/06 00:21:36 INFO sdk_worker.run: Done consuming work.
19/12/06 00:21:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/06 00:21:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/06 00:21:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/06 00:21:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575591691.65_2c0c943b-ff73-4b92-a6b2-9736fe06329b finished.
19/12/06 00:21:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/06 00:21:36 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_e2f2f881-2994-4000-a47d-3ab4cb74be84","basePath":"/tmp/sparktestcztjUX"}: {}
java.io.FileNotFoundException: /tmp/sparktestcztjUX/job_e2f2f881-2994-4000-a47d-3ab4cb74be84/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140450396632832)>
# Thread: <Thread(Thread-119, started daemon 140450388240128)>
# Thread: <_MainThread(MainThread, started 140451525756672)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140450379847424)>
# Thread: <Thread(Thread-125, started daemon 140450371454720)>
# Thread: <_MainThread(MainThread, started 140451525756672)>
# Thread: <Thread(Thread-119, started daemon 140450388240128)>
# Thread: <Thread(wait_until_finish_read, started daemon 140450396632832)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575591682.57_d1d435f9-6a11-4a28-af77-d1d923fda54a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 308.378s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1
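Gradle only sees the exit status of the `sh` wrapper that runs the Python suite: a failing test run exits 1, which Gradle surfaces as "non-zero exit value 1" above. A minimal Python sketch of that propagation; the `sh -c 'exit 1'` child here is a stand-in for the real test script, not the actual Gradle invocation:

```python
import subprocess

# Run a child process the way the build runs the 'sh' test script; a failing
# suite surfaces only as the child's non-zero exit status.
status = subprocess.call(['sh', '-c', 'exit 1'])  # stand-in for the real script
if status != 0:
    print("Process 'sh' finished with non-zero exit value %d" % status)
```

This is why the Gradle failure message carries no test detail: the per-test errors live only in the captured log above.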

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 44s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/qp2lkpd26kzc4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1708

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1708/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-4287] Fix to use the residual instead of the current restriction


------------------------------------------
[...truncated 1.31 MB...]
19/12/05 23:38:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04 on Spark master local
19/12/05 23:38:22 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/05 23:38:22 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04: Pipeline translated successfully. Computing outputs
19/12/05 23:38:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:23 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:43667
19/12/05 23:38:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:23 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35411.
19/12/05 23:38:23 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/05 23:38:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37577.
19/12/05 23:38:23 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:45313
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:23 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:23 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:23 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:24 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:33523
19/12/05 23:38:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:24 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41079.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 23:38:24 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34019.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:39405
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:24 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:24 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:24 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:24 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:34771
19/12/05 23:38:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:24 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45277.
19/12/05 23:38:24 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 23:38:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35493.
19/12/05 23:38:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:42569
19/12/05 23:38:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:25 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:25 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:25 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:43009
19/12/05 23:38:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:25 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45219.
19/12/05 23:38:25 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 23:38:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37565.
19/12/05 23:38:25 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:33097
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:25 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:25 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:38:26 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:38:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:46749
19/12/05 23:38:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:38:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:38:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:38:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575589101.32', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42559', 'job_port': u'0'}
19/12/05 23:38:26 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:38:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34125.
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 23:38:26 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:38:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:38:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42031.
19/12/05 23:38:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:38:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:39039
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:38:26 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:38:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:38:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:38:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:38:26 INFO sdk_worker.run: Done consuming work.
19/12/05 23:38:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:38:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:38:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:38:26 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575589101.32_48e29f14-cd99-473e-aa14-5eeb390e3b04 finished.
19/12/05 23:38:26 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 23:38:26 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6a461650-c265-44c1-a82e-0e9cbfb14224","basePath":"/tmp/sparktestB38E19"}: {}
java.io.FileNotFoundException: /tmp/sparktestB38E19/job_6a461650-c265-44c1-a82e-0e9cbfb14224/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575589092.24_0275e6c0-7aa0-464e-a4b1-c7ab99cff185 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139977121392384)>

# Thread: <Thread(Thread-119, started daemon 139977129785088)>

# Thread: <_MainThread(MainThread, started 139977909524224)>

----------------------------------------------------------------------
Ran 38 tests in 289.074s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 3s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/eefmoylgfd5cc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1707

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1707/display/redirect?page=changes>

Changes:

[chadrik] Make local job service accessible from external machines

[chadrik] Provide methods to override bind and service addresses independently

[chadrik] Fix lint


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 23:24:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:40 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:33803
19/12/05 23:24:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:40 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36609.
19/12/05 23:24:40 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 23:24:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33897.
19/12/05 23:24:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:40791
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:24:40 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:40 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:41 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:34059
19/12/05 23:24:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:41 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44965.
19/12/05 23:24:41 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 23:24:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34333.
19/12/05 23:24:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:46269
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:24:41 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:41 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:42 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:39059
19/12/05 23:24:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:42 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33427.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 23:24:42 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37655.
19/12/05 23:24:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:42287
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:24:42 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:42 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:24:43 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:24:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:40741
19/12/05 23:24:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:24:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:24:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:24:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575588277.65', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36969', 'job_port': u'0'}
19/12/05 23:24:43 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:24:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44297.
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 23:24:43 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:24:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:24:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42911.
19/12/05 23:24:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:24:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:33745
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:24:43 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:24:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:24:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:24:43 INFO sdk_worker.run: Done consuming work.
19/12/05 23:24:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:24:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575588277.65_ebdb9e65-542f-4335-8e09-cdc0f23e9373 finished.
19/12/05 23:24:43 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 23:24:43 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_04fc138d-2f80-4963-84f5-83cd44e1efbb","basePath":"/tmp/sparktest0N3sLO"}: {}
java.io.FileNotFoundException: /tmp/sparktest0N3sLO/job_04fc138d-2f80-4963-84f5-83cd44e1efbb/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140385489295104)>

# Thread: <Thread(Thread-120, started daemon 140385472509696)>

# Thread: <_MainThread(MainThread, started 140386269034240)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575588268.01_c4f0e71b-49d4-424a-8f44-e1a6b2af3a63 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140385463068416)>

# Thread: <Thread(Thread-126, started daemon 140385454675712)>

# Thread: <_MainThread(MainThread, started 140386269034240)>

----------------------------------------------------------------------
Ran 38 tests in 300.667s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 4s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/myewgw62ojru6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1706

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1706/display/redirect?page=changes>

Changes:

[kirillkozlov] MongoDb project push-down, needs tests

[kirillkozlov] Add tests for MongoDb project push-down

[kirillkozlov] Added cleanup for tests

[kirillkozlov] rebase

[kirillkozlov] Check last executed query


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 23:02:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:45937
19/12/05 23:02:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:44 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33335.
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 23:02:44 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39833.
19/12/05 23:02:44 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:38181
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:02:44 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:44 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:44 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:02:45 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:02:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:35673
19/12/05 23:02:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:45 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43425.
19/12/05 23:02:45 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 23:02:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38385.
19/12/05 23:02:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:39393
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:02:45 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:45 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:02:46 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:02:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:37585
19/12/05 23:02:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45907.
19/12/05 23:02:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 23:02:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40309.
19/12/05 23:02:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:38915
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 23:02:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:46 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 23:02:47 INFO sdk_worker_main.main: Logging handler created.
19/12/05 23:02:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:46559
19/12/05 23:02:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 23:02:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 23:02:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 23:02:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575586962.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44773', 'job_port': u'0'}
19/12/05 23:02:47 INFO statecache.__init__: Creating state cache with size 0
19/12/05 23:02:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39269.
19/12/05 23:02:47 INFO sdk_worker.__init__: Control channel established.
19/12/05 23:02:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 23:02:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34417.
19/12/05 23:02:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 23:02:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:37239
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 23:02:47 INFO sdk_worker.run: No more requests from control plane
19/12/05 23:02:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 23:02:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 23:02:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 23:02:47 INFO sdk_worker.run: Done consuming work.
19/12/05 23:02:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 23:02:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 23:02:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 23:02:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575586962.28_74ff277e-b467-4278-aadf-89acb49c4c7f finished.
19/12/05 23:02:47 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 23:02:47 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_973ef860-5f13-450e-8fdb-b55275568b20","basePath":"/tmp/sparktestImSSir"}: {}
java.io.FileNotFoundException: /tmp/sparktestImSSir/job_973ef860-5f13-450e-8fdb-b55275568b20/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
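
[Editor's note: the "Timed out after 60 seconds." banners and "# Thread: …" lines scattered through this log come from the test suite's watchdog handler (portable_runner_test.py, line 75 in the traceback above). A minimal sketch of such a SIGALRM-based handler follows; the function name and output format are assumptions for illustration, not the actual Beam source. It dumps every live thread's stack and raises BaseException so the hang cannot be swallowed by a broad `except Exception`.]

```python
import signal
import sys
import threading
import traceback

def install_timeout(timeout=60):
    """Abort the test with a thread dump if it runs longer than `timeout` s.

    Hypothetical sketch: SIGALRM fires after `timeout` seconds; the handler
    prints each live thread (the "# Thread: ..." lines) with its current
    stack, then raises BaseException to unwind the test.
    """
    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % timeout
        print('==================== %s ====================' % msg)
        frames = sys._current_frames()  # maps thread id -> current frame
        for t in threading.enumerate():
            print('# Thread: %r' % t)
            if t.ident in frames:
                traceback.print_stack(frames[t.ident])
        raise BaseException(msg)  # BaseException escapes `except Exception`
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(timeout)
```

Because the handler prints from whatever thread holds stdout at that instant, its output can interleave with an in-progress traceback, which is exactly the garbled mixing visible in this log.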

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140425681368832)>

# Thread: <Thread(Thread-120, started daemon 140425664583424)>

# Thread: <_MainThread(MainThread, started 140426461107968)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140425655666432)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575586953.04_293215c7-e50c-49fe-ae68-db900db84601 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(Thread-126, started daemon 140425647273728)>

# Thread: <Thread(Thread-120, started daemon 140425664583424)>

# Thread: <Thread(wait_until_finish_read, started daemon 140425681368832)>

# Thread: <_MainThread(MainThread, started 140426461107968)>
----------------------------------------------------------------------
Ran 38 tests in 315.560s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/tbll5ozwksdv2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1705

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1705/display/redirect?page=changes>

Changes:

[github] Merge pull request #10278: [BEAM-7274] Support recursive type


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 21:49:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:31 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41209.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 21:49:31 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41709.
19/12/05 21:49:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:40809
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:31 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:31 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 21:49:32 INFO sdk_worker_main.main: Logging handler created.
19/12/05 21:49:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:37051
19/12/05 21:49:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:32 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38905.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 21:49:32 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34223.
19/12/05 21:49:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:39591
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: 1 Beam Fn Logging clients still connected during shutdown.
19/12/05 21:49:32 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:32 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 21:49:33 INFO sdk_worker_main.main: Logging handler created.
19/12/05 21:49:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:33783
19/12/05 21:49:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:33 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39661.
19/12/05 21:49:33 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 21:49:33 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38687.
19/12/05 21:49:33 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:33 INFO data_plane.create_data_channel: Creating client data channel for localhost:35303
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:33 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:33 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:33 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 21:49:35 INFO sdk_worker_main.main: Logging handler created.
19/12/05 21:49:35 INFO sdk_worker_main.start: Status HTTP server running at localhost:44721
19/12/05 21:49:35 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 21:49:35 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 21:49:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 21:49:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575582568.24', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60929', 'job_port': u'0'}
19/12/05 21:49:35 INFO statecache.__init__: Creating state cache with size 0
19/12/05 21:49:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36211.
19/12/05 21:49:35 INFO sdk_worker.__init__: Control channel established.
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 21:49:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 21:49:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35419.
19/12/05 21:49:35 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 21:49:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:46257
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 21:49:35 INFO sdk_worker.run: No more requests from control plane
19/12/05 21:49:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 21:49:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 21:49:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 21:49:35 INFO sdk_worker.run: Done consuming work.
19/12/05 21:49:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 21:49:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 21:49:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 21:49:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575582568.24_18838da5-f0c0-4a56-84b1-61ef8c859b20 finished.
19/12/05 21:49:35 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 21:49:35 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_2cb1f1bc-ec9c-41f8-928e-4f84e5585b15","basePath":"/tmp/sparktestTiQJHU"}: {}
java.io.FileNotFoundException: /tmp/sparktestTiQJHU/job_2cb1f1bc-ec9c-41f8-928e-4f84e5585b15/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140383749138176)>

# Thread: <Thread(Thread-116, started daemon 140383740745472)>

# Thread: <_MainThread(MainThread, started 140384878999296)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140383723960064)>

# Thread: <Thread(Thread-122, started daemon 140383732352768)>

# Thread: <Thread(Thread-116, started daemon 140383740745472)>

# Thread: <_MainThread(MainThread, started 140384878999296)>

# Thread: <Thread(wait_until_finish_read, started daemon 140383749138176)>

======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575582556.57_edf6f650-03ae-48b9-858f-23d1f7c74346 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 373.944s

FAILED (errors=3, skipped=9)
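The repeated `handler` frames and `BaseException: Timed out after 60 seconds.` entries in the tracebacks above come from a signal-based watchdog in the test harness: on SIGALRM it dumps the live threads (the `# Thread:` lines interleaved in the output) and aborts the hung test. A minimal, self-contained sketch of that pattern follows; it is illustrative only, not Beam's actual `portable_runner_test.py` code, and the function names and timeout value here are assumptions.

```python
# Sketch of a SIGALRM-based test watchdog (Unix only): dump threads
# and raise on timeout. Names and timeout are illustrative.
import signal
import threading
import time


def install_watchdog(timeout):
    def handler(signum, frame):
        msg = 'Timed out after %s seconds.' % timeout
        print('=' * 20 + ' %s ' % msg + '=' * 20)
        # List live threads to show where the test is stuck.
        for t in threading.enumerate():
            print('# Thread: %r' % t)
        # BaseException escapes broad `except Exception:` blocks,
        # so the hung test cannot swallow the timeout.
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(timeout)


def cancel_watchdog():
    signal.alarm(0)  # disarm once the test finishes in time
```

With the watchdog armed, a call that blocks past the deadline (such as the `wait_until_finish()` gRPC wait loop in the tracebacks) is interrupted in the main thread and the test errors out instead of hanging the build.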

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 30s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/rrp7aygltzzve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1704

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1704/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 18:32:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:38127
19/12/05 18:32:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:42 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46433.
19/12/05 18:32:42 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 18:32:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43869.
19/12/05 18:32:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:42865
19/12/05 18:32:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 18:32:43 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:43 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 18:32:44 INFO sdk_worker_main.main: Logging handler created.
19/12/05 18:32:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:35857
19/12/05 18:32:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:44 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44565.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 18:32:44 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40453.
19/12/05 18:32:44 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:45579
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 18:32:44 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:44 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:44 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 18:32:45 INFO sdk_worker_main.main: Logging handler created.
19/12/05 18:32:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:40913
19/12/05 18:32:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:45 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35943.
19/12/05 18:32:45 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 18:32:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45179.
19/12/05 18:32:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:38599
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 18:32:45 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:45 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 18:32:46 INFO sdk_worker_main.main: Logging handler created.
19/12/05 18:32:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:38977
19/12/05 18:32:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 18:32:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 18:32:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 18:32:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575570760.32', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53783', 'job_port': u'0'}
19/12/05 18:32:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 18:32:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34881.
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 18:32:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 18:32:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 18:32:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46603.
19/12/05 18:32:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 18:32:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:37179
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 18:32:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 18:32:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 18:32:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 18:32:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 18:32:46 INFO sdk_worker.run: Done consuming work.
19/12/05 18:32:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 18:32:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 18:32:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 18:32:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575570760.32_861b4ca6-88b5-4795-8169-271b5f8ae1ef finished.
19/12/05 18:32:47 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 18:32:47 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_20ad8490-0c89-40ee-a22c-589bb52f7a7d","basePath":"/tmp/sparktestkd29Mf"}: {}
java.io.FileNotFoundException: /tmp/sparktestkd29Mf/job_20ad8490-0c89-40ee-a22c-589bb52f7a7d/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140152693319424)>

# Thread: <Thread(Thread-116, started daemon 140152676534016)>

# Thread: <_MainThread(MainThread, started 140153679021824)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140152659748608)>

# Thread: <Thread(Thread-122, started daemon 140152668141312)>

# Thread: <_MainThread(MainThread, started 140153679021824)>

# Thread: <Thread(Thread-116, started daemon 140152676534016)>

# Thread: <Thread(wait_until_finish_read, started daemon 140152693319424)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575570749.81_9933592b-5e39-4ea2-ba36-ab218142165f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 379.471s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 52s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/bqxfjrkcycdoi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1703

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1703/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-8861] Disallow self-signed certificates by default in


------------------------------------------
[...truncated 1.31 MB...]
19/12/05 16:42:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 16:42:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks'] 
19/12/05 16:42:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575564128.1', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49997', 'job_port': u'0'}
19/12/05 16:42:13 INFO statecache.__init__: Creating state cache with size 0
19/12/05 16:42:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43467.
19/12/05 16:42:13 INFO sdk_worker.__init__: Control channel established.
19/12/05 16:42:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 16:42:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 256-1
19/12/05 16:42:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39263.
19/12/05 16:42:13 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 16:42:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:40081
19/12/05 16:42:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 16:42:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 16:42:14 INFO sdk_worker.run: No more requests from control plane
19/12/05 16:42:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 16:42:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:14 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 16:42:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 16:42:14 INFO sdk_worker.run: Done consuming work.
19/12/05 16:42:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 16:42:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 16:42:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:14 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 16:42:15 INFO sdk_worker_main.main: Logging handler created.
19/12/05 16:42:15 INFO sdk_worker_main.start: Status HTTP server running at localhost:41233
19/12/05 16:42:15 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 16:42:15 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 16:42:15 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks'] 
19/12/05 16:42:15 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575564128.1', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49997', 'job_port': u'0'}
19/12/05 16:42:15 INFO statecache.__init__: Creating state cache with size 0
19/12/05 16:42:15 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37317.
19/12/05 16:42:15 INFO sdk_worker.__init__: Control channel established.
19/12/05 16:42:15 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 257-1
19/12/05 16:42:15 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44115.
19/12/05 16:42:15 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 16:42:15 INFO data_plane.create_data_channel: Creating client data channel for localhost:33059
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 16:42:15 INFO sdk_worker.run: No more requests from control plane
19/12/05 16:42:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 16:42:15 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 16:42:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 16:42:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:15 INFO sdk_worker.run: Done consuming work.
19/12/05 16:42:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 16:42:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 16:42:16 INFO sdk_worker_main.main: Logging handler created.
19/12/05 16:42:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:38479
19/12/05 16:42:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 16:42:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 16:42:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=29', u'--enable_spark_metric_sinks'] 
19/12/05 16:42:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575564128.1', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49997', 'job_port': u'0'}
19/12/05 16:42:16 INFO statecache.__init__: Creating state cache with size 0
19/12/05 16:42:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39863.
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 258-1
19/12/05 16:42:16 INFO sdk_worker.__init__: Control channel established.
19/12/05 16:42:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 16:42:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36305.
19/12/05 16:42:16 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 16:42:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:33037
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 16:42:16 INFO sdk_worker.run: No more requests from control plane
19/12/05 16:42:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 16:42:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:16 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 16:42:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 16:42:16 INFO sdk_worker.run: Done consuming work.
19/12/05 16:42:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 16:42:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 16:42:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 16:42:16 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575564128.1_eb5f6f58-c67d-4d6d-9c30-7758bbd527e5 finished.
19/12/05 16:42:16 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 16:42:16 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d848e517-2981-45a5-ade7-0da5f59aa064","basePath":"/tmp/sparktestTFVGYL"}: {}
java.io.FileNotFoundException: /tmp/sparktestTFVGYL/job_d848e517-2981-45a5-ade7-0da5f59aa064/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140607776327424)>

# Thread: <Thread(Thread-119, started daemon 140607699810048)>

# Thread: <_MainThread(MainThread, started 140608556066560)>

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140607674631936)>

# Thread: <Thread(Thread-125, started daemon 140607683024640)>

# Thread: <_MainThread(MainThread, started 140608556066560)>

# Thread: <Thread(Thread-119, started daemon 140607699810048)>

# Thread: <Thread(wait_until_finish_read, started daemon 140607776327424)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140607657846528)>

# Thread: <Thread(Thread-131, started daemon 140607666239232)>

# Thread: <_MainThread(MainThread, started 140608556066560)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575564114.56_6ecc68d8-2713-4bed-8849-a62f2bfb686a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 397.624s

FAILED (errors=4, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/x2n2ne2pzuiqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1702

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1702/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 12:26:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:33713
19/12/05 12:26:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 12:26:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 12:26:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 12:26:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575548774.92', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32945', 'job_port': u'0'}
19/12/05 12:26:17 INFO statecache.__init__: Creating state cache with size 0
19/12/05 12:26:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38503.
19/12/05 12:26:17 INFO sdk_worker.__init__: Control channel established.
19/12/05 12:26:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 12:26:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 12:26:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39167.
19/12/05 12:26:17 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 12:26:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:41327
19/12/05 12:26:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 12:26:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 12:26:17 INFO sdk_worker.run: No more requests from control plane
19/12/05 12:26:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 12:26:17 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 12:26:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 12:26:17 INFO sdk_worker.run: Done consuming work.
19/12/05 12:26:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 12:26:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 12:26:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 12:26:18 INFO sdk_worker_main.main: Logging handler created.
19/12/05 12:26:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:37165
19/12/05 12:26:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 12:26:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 12:26:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 12:26:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575548774.92', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32945', 'job_port': u'0'}
19/12/05 12:26:18 INFO statecache.__init__: Creating state cache with size 0
19/12/05 12:26:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43691.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 12:26:18 INFO sdk_worker.__init__: Control channel established.
19/12/05 12:26:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 12:26:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35757.
19/12/05 12:26:18 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 12:26:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:37521
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 12:26:18 INFO sdk_worker.run: No more requests from control plane
19/12/05 12:26:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 12:26:18 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 12:26:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 12:26:18 INFO sdk_worker.run: Done consuming work.
19/12/05 12:26:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 12:26:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 12:26:19 INFO sdk_worker_main.main: Logging handler created.
19/12/05 12:26:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:42201
19/12/05 12:26:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 12:26:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 12:26:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 12:26:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575548774.92', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32945', 'job_port': u'0'}
19/12/05 12:26:19 INFO statecache.__init__: Creating state cache with size 0
19/12/05 12:26:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38693.
19/12/05 12:26:19 INFO sdk_worker.__init__: Control channel established.
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 12:26:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 12:26:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38489.
19/12/05 12:26:19 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 12:26:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:38605
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 12:26:19 INFO sdk_worker.run: No more requests from control plane
19/12/05 12:26:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 12:26:19 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 12:26:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 12:26:19 INFO sdk_worker.run: Done consuming work.
19/12/05 12:26:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 12:26:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 12:26:20 INFO sdk_worker_main.main: Logging handler created.
19/12/05 12:26:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:45411
19/12/05 12:26:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 12:26:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 12:26:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 12:26:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575548774.92', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32945', 'job_port': u'0'}
19/12/05 12:26:20 INFO statecache.__init__: Creating state cache with size 0
19/12/05 12:26:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33957.
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 12:26:20 INFO sdk_worker.__init__: Control channel established.
19/12/05 12:26:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 12:26:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35965.
19/12/05 12:26:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 12:26:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:38135
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 12:26:20 INFO sdk_worker.run: No more requests from control plane
19/12/05 12:26:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 12:26:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:20 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 12:26:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 12:26:20 INFO sdk_worker.run: Done consuming work.
19/12/05 12:26:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 12:26:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 12:26:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 12:26:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575548774.92_076372a3-5ec6-4402-8fff-965f08690344 finished.
19/12/05 12:26:20 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 12:26:20 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_431ac23b-0ea2-4755-bddb-9cd9d8522d5f","basePath":"/tmp/sparktestpH07nU"}: {}
java.io.FileNotFoundException: /tmp/sparktestpH07nU/job_431ac23b-0ea2-4755-bddb-9cd9d8522d5f/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139928603416320)>
# Thread: <Thread(Thread-120, started daemon 139928595023616)>
# Thread: <_MainThread(MainThread, started 139929391048448)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139928585844480)>
# Thread: <Thread(Thread-126, started daemon 139928577451776)>
# Thread: <_MainThread(MainThread, started 139929391048448)>
# Thread: <Thread(wait_until_finish_read, started daemon 139928603416320)>
# Thread: <Thread(Thread-120, started daemon 139928595023616)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575548765.75_b3efb9cb-9ad7-4736-b797-0181a01a1729 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 319.042s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 56s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/36dpr4ou7alai

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1701

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1701/display/redirect?page=changes>

Changes:

[lgajowy] [BEAM-6627] Add size reporting to JdbcIOIT (#10267)


------------------------------------------
[...truncated 1.39 MB...]
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_windowed_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 181, in test_pardo_windowed_side_inputs
    label='windowed')
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_read (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 578, in test_read
    equal_to(['a', 'b', 'c']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575543622.22_265bae3e-05be-4ee5-b555-1f2776f8ffd2 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 703.331s

FAILED (errors=6, skipped=9)

# Thread: <Thread(wait_until_finish_read, started daemon 140102398502656)>

# Thread: <Thread(Thread-93, started daemon 140102406895360)>

# Thread: <_MainThread(MainThread, started 140103195027200)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140102381717248)>

# Thread: <Thread(Thread-97, started daemon 140102390109952)>

# Thread: <_MainThread(MainThread, started 140103195027200)>

# Thread: <Thread(Thread-93, started daemon 140102406895360)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102398502656)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140101757691648)>

# Thread: <Thread(Thread-102, started daemon 140101766084352)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102381717248)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102398502656)>

# Thread: <Thread(Thread-93, started daemon 140102406895360)>

# Thread: <_MainThread(MainThread, started 140103195027200)>

# Thread: <Thread(Thread-97, started daemon 140102390109952)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140101740906240)>

# Thread: <Thread(Thread-107, started daemon 140101749298944)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102381717248)>

# Thread: <Thread(Thread-97, started daemon 140102390109952)>

# Thread: <Thread(Thread-93, started daemon 140102406895360)>

# Thread: <Thread(wait_until_finish_read, started daemon 140101757691648)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102398502656)>

# Thread: <_MainThread(MainThread, started 140103195027200)>

# Thread: <Thread(Thread-102, started daemon 140101766084352)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140101724120832)>

# Thread: <Thread(Thread-111, started daemon 140101732513536)>

# Thread: <Thread(wait_until_finish_read, started daemon 140101740906240)>

# Thread: <Thread(Thread-107, started daemon 140101749298944)>

# Thread: <Thread(wait_until_finish_read, started daemon 140101757691648)>

# Thread: <Thread(Thread-97, started daemon 140102390109952)>

# Thread: <Thread(wait_until_finish_read, started daemon 140102381717248)>

# Thread: <_MainThread(MainThread, started 140103195027200)>

# Thread: <Thread(Thread-102, started daemon 140101766084352)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 31s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ibemntcczq5uy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1700

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1700/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 06:15:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:45983
19/12/05 06:15:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 06:15:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 06:15:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 06:15:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575526503.05', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60481', 'job_port': u'0'}
19/12/05 06:15:05 INFO statecache.__init__: Creating state cache with size 0
19/12/05 06:15:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43029.
19/12/05 06:15:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 06:15:05 INFO sdk_worker.__init__: Control channel established.
19/12/05 06:15:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 06:15:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39189.
19/12/05 06:15:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 06:15:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:37847
19/12/05 06:15:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 06:15:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 06:15:05 INFO sdk_worker.run: No more requests from control plane
19/12/05 06:15:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 06:15:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 06:15:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 06:15:05 INFO sdk_worker.run: Done consuming work.
19/12/05 06:15:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 06:15:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 06:15:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 06:15:06 INFO sdk_worker_main.main: Logging handler created.
19/12/05 06:15:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:44249
19/12/05 06:15:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 06:15:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 06:15:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 06:15:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575526503.05', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60481', 'job_port': u'0'}
19/12/05 06:15:06 INFO statecache.__init__: Creating state cache with size 0
19/12/05 06:15:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44861.
19/12/05 06:15:06 INFO sdk_worker.__init__: Control channel established.
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 06:15:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 06:15:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46403.
19/12/05 06:15:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 06:15:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:42515
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/05 06:15:06 INFO sdk_worker.run: No more requests from control plane
19/12/05 06:15:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 06:15:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 06:15:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 06:15:06 INFO sdk_worker.run: Done consuming work.
19/12/05 06:15:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 06:15:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 06:15:07 INFO sdk_worker_main.main: Logging handler created.
19/12/05 06:15:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:44225
19/12/05 06:15:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 06:15:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 06:15:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 06:15:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575526503.05', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60481', 'job_port': u'0'}
19/12/05 06:15:07 INFO statecache.__init__: Creating state cache with size 0
19/12/05 06:15:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38295.
19/12/05 06:15:07 INFO sdk_worker.__init__: Control channel established.
19/12/05 06:15:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 06:15:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44447.
19/12/05 06:15:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 06:15:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:32939
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 06:15:07 INFO sdk_worker.run: No more requests from control plane
19/12/05 06:15:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 06:15:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 06:15:07 INFO sdk_worker.run: Done consuming work.
19/12/05 06:15:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 06:15:08 INFO sdk_worker_main.main: Logging handler created.
19/12/05 06:15:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:39475
19/12/05 06:15:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 06:15:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 06:15:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 06:15:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575526503.05', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60481', 'job_port': u'0'}
19/12/05 06:15:08 INFO statecache.__init__: Creating state cache with size 0
19/12/05 06:15:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39067.
19/12/05 06:15:08 INFO sdk_worker.__init__: Control channel established.
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 06:15:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 06:15:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43991.
19/12/05 06:15:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 06:15:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:39923
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 06:15:08 INFO sdk_worker.run: No more requests from control plane
19/12/05 06:15:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 06:15:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 06:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 06:15:08 INFO sdk_worker.run: Done consuming work.
19/12/05 06:15:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 06:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 06:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 06:15:08 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575526503.05_79ee8e7c-9018-49c7-8de9-5467b0b12eef finished.
19/12/05 06:15:08 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 06:15:08 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_8611292d-c435-420c-a625-f81a77e88428","basePath":"/tmp/sparktestWBHZhZ"}: {}
java.io.FileNotFoundException: /tmp/sparktestWBHZhZ/job_8611292d-c435-420c-a625-f81a77e88428/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
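The warning above is best-effort cleanup gone noisy: the job service tries to read the staging MANIFEST to enumerate artifacts to delete, but this job staged no artifacts, so the file never existed and the read throws. A cleanup routine can instead treat a missing manifest as "already clean". A minimal Python sketch of that idea (the function and file names are illustrative, not Beam's actual API):

```python
import os
import shutil


def remove_staging_dir(staging_dir):
    """Best-effort, idempotent removal of a job staging directory.

    A missing MANIFEST (or an already-removed directory) means there is
    nothing to clean up, so it is treated as success, not an error.
    """
    manifest = os.path.join(staging_dir, "MANIFEST")
    try:
        with open(manifest) as f:
            artifacts = [line.strip() for line in f if line.strip()]
    except FileNotFoundError:
        artifacts = []  # nothing was staged; just remove the directory
    for name in artifacts:
        try:
            os.remove(os.path.join(staging_dir, name))
        except FileNotFoundError:
            pass  # already gone; cleanup may race with itself
    shutil.rmtree(staging_dir, ignore_errors=True)
```

Called twice, or on a directory that never existed, it simply does nothing, which is what a post-job cleanup hook needs instead of the FileNotFoundException stack trace logged here.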
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
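The BaseException above comes from the test harness's watchdog, not from the pipeline: per the traceback, a handler in portable_runner_test.py fires after 60 seconds and raises out of whatever blocking call the main thread is stuck in (here, grpc's condition wait). The general alarm-based pattern is sketched below with illustrative names; raising BaseException rather than Exception is deliberate, so that broad `except Exception:` blocks in the code under test cannot swallow the timeout:

```python
import signal
import time


def run_with_timeout(fn, seconds):
    """Run fn(), raising BaseException if it blocks longer than `seconds`."""

    def handler(signum, frame):
        # Raised from inside whatever call the main thread is blocked in.
        raise BaseException("Timed out after %s seconds." % seconds)

    old_handler = signal.signal(signal.SIGALRM, handler)
    signal.setitimer(signal.ITIMER_REAL, seconds)
    try:
        return fn()
    finally:
        signal.setitimer(signal.ITIMER_REAL, 0)   # cancel the pending alarm
        signal.signal(signal.SIGALRM, old_handler)
```

This only works on the main thread of a Unix process (SIGALRM is delivered there), which matches the traceback: the alarm interrupts the main thread's `threading.py` wait while worker threads keep running, hence the thread dump printed alongside the error.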

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140445623510784)>
# Thread: <Thread(Thread-117, started daemon 140445631903488)>
# Thread: <_MainThread(MainThread, started 140446758721280)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140445606725376)>
# Thread: <Thread(Thread-123, started daemon 140445615118080)>
# Thread: <_MainThread(MainThread, started 140446758721280)>
# Thread: <Thread(Thread-117, started daemon 140445631903488)>
# Thread: <Thread(wait_until_finish_read, started daemon 140445623510784)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575526492.68_dad866b2-ace0-4dd5-82bb-bdab8f8cc326 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 337.066s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 19s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/m64j4wz465avm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1699

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1699/display/redirect?page=changes>

Changes:

[github] Merge pull request #10247: [BEAM-7274] In preparation for


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 05:27:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:40601
19/12/05 05:27:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 05:27:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 05:27:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 05:27:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575523663.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53773', 'job_port': u'0'}
19/12/05 05:27:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 05:27:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38243.
19/12/05 05:27:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 05:27:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 05:27:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35255.
19/12/05 05:27:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 05:27:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:38785
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 05:27:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 05:27:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 05:27:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 05:27:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 05:27:46 INFO sdk_worker.run: Done consuming work.
19/12/05 05:27:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 05:27:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 05:27:47 INFO sdk_worker_main.main: Logging handler created.
19/12/05 05:27:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:45121
19/12/05 05:27:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 05:27:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 05:27:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 05:27:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575523663.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53773', 'job_port': u'0'}
19/12/05 05:27:47 INFO statecache.__init__: Creating state cache with size 0
19/12/05 05:27:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43489.
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 05:27:47 INFO sdk_worker.__init__: Control channel established.
19/12/05 05:27:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 05:27:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43993.
19/12/05 05:27:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 05:27:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:35363
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 05:27:47 INFO sdk_worker.run: No more requests from control plane
19/12/05 05:27:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 05:27:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 05:27:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 05:27:47 INFO sdk_worker.run: Done consuming work.
19/12/05 05:27:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 05:27:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 05:27:48 INFO sdk_worker_main.main: Logging handler created.
19/12/05 05:27:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:42793
19/12/05 05:27:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 05:27:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 05:27:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 05:27:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575523663.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53773', 'job_port': u'0'}
19/12/05 05:27:48 INFO statecache.__init__: Creating state cache with size 0
19/12/05 05:27:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44621.
19/12/05 05:27:48 INFO sdk_worker.__init__: Control channel established.
19/12/05 05:27:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 05:27:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41443.
19/12/05 05:27:48 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 05:27:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:40193
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 05:27:48 INFO sdk_worker.run: No more requests from control plane
19/12/05 05:27:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 05:27:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:48 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 05:27:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 05:27:48 INFO sdk_worker.run: Done consuming work.
19/12/05 05:27:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 05:27:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 05:27:49 INFO sdk_worker_main.main: Logging handler created.
19/12/05 05:27:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:37829
19/12/05 05:27:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 05:27:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 05:27:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 05:27:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575523663.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53773', 'job_port': u'0'}
19/12/05 05:27:49 INFO statecache.__init__: Creating state cache with size 0
19/12/05 05:27:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36985.
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 05:27:49 INFO sdk_worker.__init__: Control channel established.
19/12/05 05:27:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 05:27:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40823.
19/12/05 05:27:49 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 05:27:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:34525
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 05:27:49 INFO sdk_worker.run: No more requests from control plane
19/12/05 05:27:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 05:27:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:49 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 05:27:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 05:27:49 INFO sdk_worker.run: Done consuming work.
19/12/05 05:27:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 05:27:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 05:27:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 05:27:49 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575523663.25_3e008bd1-e458-43a9-990c-a867ea567448 finished.
19/12/05 05:27:49 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 05:27:49 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_489573cc-5c3a-4bd5-b29e-4382134f7bc6","basePath":"/tmp/sparktestZzLP69"}: {}
java.io.FileNotFoundException: /tmp/sparktestZzLP69/job_489573cc-5c3a-4bd5-b29e-4382134f7bc6/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139802192115456)>

# Thread: <Thread(Thread-119, started daemon 139802273539840)>

# Thread: <_MainThread(MainThread, started 139803060610816)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575523652.41_bc04077e-d636-4176-b465-dc5537d9bd76 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139802175330048)>

# Thread: <Thread(Thread-125, started daemon 139802183722752)>

# Thread: <Thread(Thread-119, started daemon 139802273539840)>

# Thread: <Thread(wait_until_finish_read, started daemon 139802192115456)>

# Thread: <_MainThread(MainThread, started 139803060610816)>

----------------------------------------------------------------------
Ran 38 tests in 345.521s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 13s
60 actionable tasks: 51 executed, 9 from cache

Publishing build scan...
https://scans.gradle.com/s/bwyyljxtbj5tu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1698

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1698/display/redirect?page=changes>

Changes:

[chamikara] [BEAM-8884] Fix mongodb splitVector command result type issue (#10282)


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 02:08:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:41881
19/12/05 02:08:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 02:08:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 02:08:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 02:08:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575511716.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41481', 'job_port': u'0'}
19/12/05 02:08:38 INFO statecache.__init__: Creating state cache with size 0
19/12/05 02:08:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38465.
19/12/05 02:08:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 02:08:38 INFO sdk_worker.__init__: Control channel established.
19/12/05 02:08:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 02:08:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43791.
19/12/05 02:08:38 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 02:08:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:33035
19/12/05 02:08:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 02:08:39 INFO sdk_worker.run: No more requests from control plane
19/12/05 02:08:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 02:08:39 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 02:08:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 02:08:39 INFO sdk_worker.run: Done consuming work.
19/12/05 02:08:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 02:08:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 02:08:39 INFO sdk_worker_main.main: Logging handler created.
19/12/05 02:08:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:46663
19/12/05 02:08:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 02:08:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 02:08:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 02:08:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575511716.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41481', 'job_port': u'0'}
19/12/05 02:08:39 INFO statecache.__init__: Creating state cache with size 0
19/12/05 02:08:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42851.
19/12/05 02:08:39 INFO sdk_worker.__init__: Control channel established.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 02:08:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 02:08:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40801.
19/12/05 02:08:39 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 02:08:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:33585
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 02:08:39 INFO sdk_worker.run: No more requests from control plane
19/12/05 02:08:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 02:08:39 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 02:08:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 02:08:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:39 INFO sdk_worker.run: Done consuming work.
19/12/05 02:08:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 02:08:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 02:08:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 02:08:40 INFO sdk_worker_main.main: Logging handler created.
19/12/05 02:08:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:35083
19/12/05 02:08:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 02:08:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 02:08:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 02:08:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575511716.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41481', 'job_port': u'0'}
19/12/05 02:08:40 INFO statecache.__init__: Creating state cache with size 0
19/12/05 02:08:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35103.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 02:08:40 INFO sdk_worker.__init__: Control channel established.
19/12/05 02:08:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 02:08:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34895.
19/12/05 02:08:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 02:08:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:46319
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 02:08:40 INFO sdk_worker.run: No more requests from control plane
19/12/05 02:08:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 02:08:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 02:08:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 02:08:40 INFO sdk_worker.run: Done consuming work.
19/12/05 02:08:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 02:08:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 02:08:41 INFO sdk_worker_main.main: Logging handler created.
19/12/05 02:08:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:33173
19/12/05 02:08:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 02:08:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 02:08:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 02:08:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575511716.49', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41481', 'job_port': u'0'}
19/12/05 02:08:41 INFO statecache.__init__: Creating state cache with size 0
19/12/05 02:08:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33569.
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 02:08:41 INFO sdk_worker.__init__: Control channel established.
19/12/05 02:08:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 02:08:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45517.
19/12/05 02:08:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 02:08:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:34543
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/05 02:08:41 INFO sdk_worker.run: No more requests from control plane
19/12/05 02:08:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 02:08:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 02:08:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 02:08:41 INFO sdk_worker.run: Done consuming work.
19/12/05 02:08:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 02:08:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 02:08:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 02:08:41 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575511716.49_a0b84da8-ec4d-49d8-bef1-47b0d83c9ea3 finished.
19/12/05 02:08:41 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 02:08:41 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_cb1e17ec-87ad-4716-91ab-cef3cf84f2fa","basePath":"/tmp/sparktestfoljaI"}: {}
java.io.FileNotFoundException: /tmp/sparktestfoljaI/job_cb1e17ec-87ad-4716-91ab-cef3cf84f2fa/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140578402621184)>

# Thread: <Thread(Thread-120, started daemon 140578411013888)>

# Thread: <_MainThread(MainThread, started 140579199145728)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140578393704192)>

# Thread: <Thread(Thread-126, started daemon 140578385311488)>

# Thread: <Thread(Thread-120, started daemon 140578411013888)>

# Thread: <_MainThread(MainThread, started 140579199145728)>

# Thread: <Thread(wait_until_finish_read, started daemon 140578402621184)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575511707.59_240e3073-c14d-4461-a4b9-15d04bc20a71 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.358s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 18s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ovlfydy6mzbca

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1697

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1697/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/05 00:38:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:35863
19/12/05 00:38:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 00:38:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 00:38:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 00:38:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575506322.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36133', 'job_port': u'0'}
19/12/05 00:38:45 INFO statecache.__init__: Creating state cache with size 0
19/12/05 00:38:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38485.
19/12/05 00:38:45 INFO sdk_worker.__init__: Control channel established.
19/12/05 00:38:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/05 00:38:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44171.
19/12/05 00:38:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 00:38:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:33657
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 00:38:45 INFO sdk_worker.run: No more requests from control plane
19/12/05 00:38:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 00:38:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 00:38:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 00:38:45 INFO sdk_worker.run: Done consuming work.
19/12/05 00:38:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 00:38:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 00:38:46 INFO sdk_worker_main.main: Logging handler created.
19/12/05 00:38:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:40211
19/12/05 00:38:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 00:38:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 00:38:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 00:38:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575506322.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36133', 'job_port': u'0'}
19/12/05 00:38:46 INFO statecache.__init__: Creating state cache with size 0
19/12/05 00:38:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36701.
19/12/05 00:38:46 INFO sdk_worker.__init__: Control channel established.
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/05 00:38:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 00:38:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46001.
19/12/05 00:38:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 00:38:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:38391
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 00:38:46 INFO sdk_worker.run: No more requests from control plane
19/12/05 00:38:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 00:38:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 00:38:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 00:38:46 INFO sdk_worker.run: Done consuming work.
19/12/05 00:38:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 00:38:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 00:38:47 INFO sdk_worker_main.main: Logging handler created.
19/12/05 00:38:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:35551
19/12/05 00:38:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 00:38:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 00:38:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 00:38:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575506322.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36133', 'job_port': u'0'}
19/12/05 00:38:47 INFO statecache.__init__: Creating state cache with size 0
19/12/05 00:38:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44557.
19/12/05 00:38:47 INFO sdk_worker.__init__: Control channel established.
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/05 00:38:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 00:38:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45465.
19/12/05 00:38:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 00:38:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:33819
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 00:38:47 INFO sdk_worker.run: No more requests from control plane
19/12/05 00:38:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 00:38:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 00:38:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 00:38:47 INFO sdk_worker.run: Done consuming work.
19/12/05 00:38:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 00:38:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/05 00:38:48 INFO sdk_worker_main.main: Logging handler created.
19/12/05 00:38:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:36921
19/12/05 00:38:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/05 00:38:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/05 00:38:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/05 00:38:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575506322.25', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36133', 'job_port': u'0'}
19/12/05 00:38:48 INFO statecache.__init__: Creating state cache with size 0
19/12/05 00:38:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43365.
19/12/05 00:38:48 INFO sdk_worker.__init__: Control channel established.
19/12/05 00:38:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/05 00:38:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33921.
19/12/05 00:38:48 INFO sdk_worker.create_state_handler: State channel established.
19/12/05 00:38:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:45487
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/05 00:38:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:48 INFO sdk_worker.run: No more requests from control plane
19/12/05 00:38:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/05 00:38:48 INFO data_plane.close: Closing all cached grpc data channels.
19/12/05 00:38:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/05 00:38:48 INFO sdk_worker.run: Done consuming work.
19/12/05 00:38:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/05 00:38:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/05 00:38:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/05 00:38:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575506322.25_d3811400-701b-4125-a7e2-00b263c5c8ca finished.
19/12/05 00:38:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/05 00:38:48 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a0f9c8ed-da6e-4a7e-a56d-d25fdc1f0531","basePath":"/tmp/sparktestqHzjRx"}: {}
java.io.FileNotFoundException: /tmp/sparktestqHzjRx/job_a0f9c8ed-da6e-4a7e-a56d-d25fdc1f0531/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575506311.01_1778b131-c2d7-4e74-ab57-97ce7aefe341 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 374.866s

FAILED (errors=3, skipped=9)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139975247197952)>

# Thread: <Thread(Thread-119, started daemon 139975255590656)>

# Thread: <_MainThread(MainThread, started 139976035329792)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139975221757696)>

# Thread: <Thread(Thread-125, started daemon 139975230150400)>

# Thread: <_MainThread(MainThread, started 139976035329792)>

# Thread: <Thread(wait_until_finish_read, started daemon 139975247197952)>

# Thread: <Thread(Thread-119, started daemon 139975255590656)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 37s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/hg46xzojugorw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1696

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1696/display/redirect?page=changes>

Changes:

[sniemitz] [BEAM-8809] Make the constructor for AvroWriteRequest public

[wenjialiu] [BEAM-8575] test_flatten_no_pcollection raises an exception and should


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 23:57:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:36423
19/12/04 23:57:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 23:57:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 23:57:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 23:57:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575503877.23', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35557', 'job_port': u'0'}
19/12/04 23:57:59 INFO statecache.__init__: Creating state cache with size 0
19/12/04 23:57:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39919.
19/12/04 23:57:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 23:57:59 INFO sdk_worker.__init__: Control channel established.
19/12/04 23:57:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 23:57:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39493.
19/12/04 23:57:59 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 23:57:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:46543
19/12/04 23:57:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 23:58:00 INFO sdk_worker.run: No more requests from control plane
19/12/04 23:58:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 23:58:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:00 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 23:58:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 23:58:00 INFO sdk_worker.run: Done consuming work.
19/12/04 23:58:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 23:58:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 23:58:00 INFO sdk_worker_main.main: Logging handler created.
19/12/04 23:58:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:45569
19/12/04 23:58:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 23:58:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 23:58:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 23:58:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575503877.23', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35557', 'job_port': u'0'}
19/12/04 23:58:00 INFO statecache.__init__: Creating state cache with size 0
19/12/04 23:58:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34677.
19/12/04 23:58:00 INFO sdk_worker.__init__: Control channel established.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 23:58:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 23:58:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33483.
19/12/04 23:58:00 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 23:58:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:44843
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 23:58:00 INFO sdk_worker.run: No more requests from control plane
19/12/04 23:58:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 23:58:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:00 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 23:58:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 23:58:00 INFO sdk_worker.run: Done consuming work.
19/12/04 23:58:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 23:58:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 23:58:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 23:58:01 INFO sdk_worker_main.main: Logging handler created.
19/12/04 23:58:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:37499
19/12/04 23:58:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 23:58:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 23:58:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 23:58:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575503877.23', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35557', 'job_port': u'0'}
19/12/04 23:58:01 INFO statecache.__init__: Creating state cache with size 0
19/12/04 23:58:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33519.
19/12/04 23:58:01 INFO sdk_worker.__init__: Control channel established.
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 23:58:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 23:58:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37421.
19/12/04 23:58:01 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 23:58:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:43739
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 23:58:01 INFO sdk_worker.run: No more requests from control plane
19/12/04 23:58:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 23:58:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:01 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 23:58:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 23:58:01 INFO sdk_worker.run: Done consuming work.
19/12/04 23:58:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 23:58:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 23:58:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 23:58:02 INFO sdk_worker_main.main: Logging handler created.
19/12/04 23:58:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:41013
19/12/04 23:58:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 23:58:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 23:58:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 23:58:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575503877.23', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35557', 'job_port': u'0'}
19/12/04 23:58:02 INFO statecache.__init__: Creating state cache with size 0
19/12/04 23:58:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33325.
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 23:58:02 INFO sdk_worker.__init__: Control channel established.
19/12/04 23:58:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 23:58:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35155.
19/12/04 23:58:02 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 23:58:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:36949
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 23:58:02 INFO sdk_worker.run: No more requests from control plane
19/12/04 23:58:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 23:58:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:02 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 23:58:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 23:58:02 INFO sdk_worker.run: Done consuming work.
19/12/04 23:58:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 23:58:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 23:58:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 23:58:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575503877.23_fd72d4ab-83ad-4431-9734-4220f7a89707 finished.
19/12/04 23:58:02 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 23:58:02 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_3e9d0f1c-e443-4f53-b963-a1cc1832e6de","basePath":"/tmp/sparktestsNnBqU"}: {}
java.io.FileNotFoundException: /tmp/sparktestsNnBqU/job_3e9d0f1c-e443-4f53-b963-a1cc1832e6de/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139849474766592)>

# Thread: <Thread(Thread-117, started daemon 139849483159296)>

# Thread: <_MainThread(MainThread, started 139850262898432)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139849457456896)>

# Thread: <Thread(Thread-123, started daemon 139849449064192)>

# Thread: <Thread(Thread-117, started daemon 139849483159296)>

# Thread: <_MainThread(MainThread, started 139850262898432)>

# Thread: <Thread(wait_until_finish_read, started daemon 139849474766592)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575503867.54_117f3f08-99e8-4df1-b1e1-e9fecf799d27 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 315.132s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 55s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/e36g6hotwqfh2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1695

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1695/display/redirect?page=changes>

Changes:

[ehudm] Moving to 2.19.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 1.31 MB...]
19/12/04 22:27:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc on Spark master local
19/12/04 22:27:27 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/04 22:27:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc: Pipeline translated successfully. Computing outputs
19/12/04 22:27:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:28 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:45975
19/12/04 22:27:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:28 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44877.
19/12/04 22:27:28 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/04 22:27:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39551.
19/12/04 22:27:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:38497
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 22:27:28 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:28 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:29 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:40411
19/12/04 22:27:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44355.
19/12/04 22:27:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 22:27:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39953.
19/12/04 22:27:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:37581
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 22:27:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:29 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:29 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:44733
19/12/04 22:27:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36773.
19/12/04 22:27:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 22:27:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34905.
19/12/04 22:27:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:40779
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 22:27:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:29 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:30 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:37521
19/12/04 22:27:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46031.
19/12/04 22:27:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 22:27:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35265.
19/12/04 22:27:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:45389
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 22:27:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:30 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 22:27:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 22:27:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:39233
19/12/04 22:27:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 22:27:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 22:27:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 22:27:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575498446.7', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46331', 'job_port': u'0'}
19/12/04 22:27:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 22:27:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36127.
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 22:27:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 22:27:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 22:27:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44415.
19/12/04 22:27:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 22:27:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:36371
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 22:27:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 22:27:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 22:27:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 22:27:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 22:27:31 INFO sdk_worker.run: Done consuming work.
19/12/04 22:27:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 22:27:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 22:27:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 22:27:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575498446.7_6217620d-90fc-4942-ba77-f05e3fc654fc finished.
19/12/04 22:27:31 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 22:27:31 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d1606f06-1da3-4b38-8c3a-56c6121087b8","basePath":"/tmp/sparktestm4vBWQ"}: {}
java.io.FileNotFoundException: /tmp/sparktestm4vBWQ/job_d1606f06-1da3-4b38-8c3a-56c6121087b8/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
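[Editor's note] The FileNotFoundException above is raised because the staging cleanup in `BeamFileSystemArtifactStagingService.removeArtifacts` opens `<staging_dir>/MANIFEST` unconditionally, which fails when the job staged no artifacts. A minimal Python sketch of a more defensive cleanup — a hypothetical helper for illustration, not Beam's actual Java implementation:

```python
import os
import shutil
import tempfile

def remove_staged_artifacts(staging_dir):
    """Remove a job's staging directory, tolerating a missing MANIFEST.

    Sketch only: the real service opens MANIFEST unconditionally, which
    produces the FileNotFoundException seen in the log when nothing was
    staged. Checking for the file first avoids the spurious warning.
    """
    manifest = os.path.join(staging_dir, "MANIFEST")
    if not os.path.exists(manifest):
        # Nothing was staged; just remove the directory if it exists.
        shutil.rmtree(staging_dir, ignore_errors=True)
        return []
    with open(manifest) as f:
        # One artifact location per line in this simplified sketch.
        locations = [line.strip() for line in f if line.strip()]
    shutil.rmtree(staging_dir, ignore_errors=True)
    return locations
```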
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140240255547136)>

# Thread: <Thread(Thread-119, started daemon 140240247154432)>

# Thread: <_MainThread(MainThread, started 140241037457152)>

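[Editor's note] The "Timed out after 60 seconds" banner and the "# Thread: ..." lines interleaved with the traceback above come from a SIGALRM watchdog in the test harness (the `handler` frame in `portable_runner_test.py`). A minimal sketch of that pattern, assuming a POSIX platform; `install_timeout` is a hypothetical name, not Beam's API:

```python
import signal
import threading
import time

def install_timeout(seconds):
    """Arm a SIGALRM watchdog like the one in the test log above.

    On timeout it prints a banner plus one '# Thread: <...>' line per live
    thread (which is why the dump interleaves with the traceback), then
    raises BaseException so broad 'except Exception' clauses can't swallow
    it. Signal handlers only run in the main thread.
    """
    def handler(signum, frame):
        msg = 'Timed out after %s seconds.' % seconds
        print('=' * 20 + ' ' + msg + ' ' + '=' * 20)
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        raise BaseException(msg)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)

# Usage sketch:
# install_timeout(60)
# try:
#     pipeline.run().wait_until_finish()
# finally:
#     signal.alarm(0)  # cancel the watchdog
```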
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575498438.29_fe91c121-f28a-468b-b72a-f99292055e55 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 278.453s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 23s
60 actionable tasks: 57 executed, 3 from cache

Publishing build scan...
https://scans.gradle.com/s/hx2fbet5v7hzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1694

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1694/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-4287] Add trySplit API to Java restriction tracker matching Python

[lcwik] fixup!

[github] Add a comment on RLock perf issues

[lcwik] fixup!


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 21:45:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:44609
19/12/04 21:45:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 21:45:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 21:45:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 21:45:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575495926.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46985', 'job_port': u'0'}
19/12/04 21:45:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 21:45:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42999.
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 21:45:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 21:45:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 21:45:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41635.
19/12/04 21:45:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 21:45:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:45093
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 21:45:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 21:45:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 21:45:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 21:45:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 21:45:29 INFO sdk_worker.run: Done consuming work.
19/12/04 21:45:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 21:45:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 21:45:30 INFO sdk_worker_main.main: Logging handler created.
19/12/04 21:45:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:34477
19/12/04 21:45:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 21:45:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 21:45:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 21:45:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575495926.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46985', 'job_port': u'0'}
19/12/04 21:45:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 21:45:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34541.
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 21:45:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 21:45:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 21:45:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41721.
19/12/04 21:45:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 21:45:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:41203
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 21:45:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 21:45:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 21:45:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 21:45:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 21:45:30 INFO sdk_worker.run: Done consuming work.
19/12/04 21:45:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 21:45:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 21:45:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 21:45:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:41469
19/12/04 21:45:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 21:45:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 21:45:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 21:45:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575495926.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46985', 'job_port': u'0'}
19/12/04 21:45:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 21:45:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33105.
19/12/04 21:45:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 21:45:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 21:45:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37851.
19/12/04 21:45:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 21:45:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:39851
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 21:45:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 21:45:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 21:45:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 21:45:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 21:45:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:31 INFO sdk_worker.run: Done consuming work.
19/12/04 21:45:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 21:45:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 21:45:32 INFO sdk_worker_main.main: Logging handler created.
19/12/04 21:45:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:42987
19/12/04 21:45:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 21:45:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 21:45:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 21:45:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575495926.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46985', 'job_port': u'0'}
19/12/04 21:45:32 INFO statecache.__init__: Creating state cache with size 0
19/12/04 21:45:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44831.
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 21:45:32 INFO sdk_worker.__init__: Control channel established.
19/12/04 21:45:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 21:45:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33619.
19/12/04 21:45:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 21:45:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:34889
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 21:45:32 INFO sdk_worker.run: No more requests from control plane
19/12/04 21:45:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 21:45:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 21:45:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 21:45:32 INFO sdk_worker.run: Done consuming work.
19/12/04 21:45:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 21:45:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 21:45:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 21:45:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575495926.47_23753869-7295-4cd5-9ca3-0c5d3d6306fa finished.
19/12/04 21:45:32 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 21:45:32 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_12ccfbc0-b337-4b65-8bc3-0cb9db77b6a6","basePath":"/tmp/sparktestY0PgyL"}: {}
java.io.FileNotFoundException: /tmp/sparktestY0PgyL/job_12ccfbc0-b337-4b65-8bc3-0cb9db77b6a6/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140518830888704)>

# Thread: <Thread(Thread-120, started daemon 140518847674112)>

# Thread: <_MainThread(MainThread, started 140519969101568)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575495916.08_6e1630e4-802b-486b-8101-4dfbd37226c9 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140518814103296)>

# Thread: <Thread(Thread-126, started daemon 140518822496000)>

# Thread: <_MainThread(MainThread, started 140519969101568)>

# Thread: <Thread(Thread-120, started daemon 140518847674112)>

# Thread: <Thread(wait_until_finish_read, started daemon 140518830888704)>

----------------------------------------------------------------------
Ran 38 tests in 337.548s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 37s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/s6utnjquisoy4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1693

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1693/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8662] Remove Py3 annotations support from

[github] [BEAM-8481] Revert the increase in Postcommit timeout


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 20:35:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:45247
19/12/04 20:35:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 20:35:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 20:35:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 20:35:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575491723.72', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41411', 'job_port': u'0'}
19/12/04 20:35:26 INFO statecache.__init__: Creating state cache with size 0
19/12/04 20:35:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42111.
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 20:35:26 INFO sdk_worker.__init__: Control channel established.
19/12/04 20:35:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 20:35:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34755.
19/12/04 20:35:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 20:35:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:43895
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 20:35:26 INFO sdk_worker.run: No more requests from control plane
19/12/04 20:35:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 20:35:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 20:35:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 20:35:26 INFO sdk_worker.run: Done consuming work.
19/12/04 20:35:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 20:35:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 20:35:27 INFO sdk_worker_main.main: Logging handler created.
19/12/04 20:35:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:41337
19/12/04 20:35:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 20:35:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 20:35:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 20:35:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575491723.72', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41411', 'job_port': u'0'}
19/12/04 20:35:27 INFO statecache.__init__: Creating state cache with size 0
19/12/04 20:35:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43403.
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 20:35:27 INFO sdk_worker.__init__: Control channel established.
19/12/04 20:35:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 20:35:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45905.
19/12/04 20:35:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 20:35:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:44551
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 20:35:27 INFO sdk_worker.run: No more requests from control plane
19/12/04 20:35:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 20:35:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 20:35:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 20:35:27 INFO sdk_worker.run: Done consuming work.
19/12/04 20:35:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 20:35:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 20:35:28 INFO sdk_worker_main.main: Logging handler created.
19/12/04 20:35:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:39731
19/12/04 20:35:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 20:35:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 20:35:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 20:35:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575491723.72', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41411', 'job_port': u'0'}
19/12/04 20:35:28 INFO statecache.__init__: Creating state cache with size 0
19/12/04 20:35:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46547.
19/12/04 20:35:28 INFO sdk_worker.__init__: Control channel established.
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 20:35:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 20:35:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41475.
19/12/04 20:35:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 20:35:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:33213
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 20:35:28 INFO sdk_worker.run: No more requests from control plane
19/12/04 20:35:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 20:35:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 20:35:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 20:35:28 INFO sdk_worker.run: Done consuming work.
19/12/04 20:35:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 20:35:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 20:35:29 INFO sdk_worker_main.main: Logging handler created.
19/12/04 20:35:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:43651
19/12/04 20:35:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 20:35:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 20:35:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 20:35:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575491723.72', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41411', 'job_port': u'0'}
19/12/04 20:35:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 20:35:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40385.
19/12/04 20:35:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 20:35:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 20:35:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41981.
19/12/04 20:35:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 20:35:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:39379
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 20:35:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 20:35:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 20:35:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 20:35:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 20:35:29 INFO sdk_worker.run: Done consuming work.
19/12/04 20:35:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 20:35:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 20:35:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 20:35:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575491723.72_9378cec0-a42a-4483-82d5-5cc7ae12e54d finished.
19/12/04 20:35:29 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 20:35:29 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_374907c7-30f5-48b9-9de6-f3cc677df2a7","basePath":"/tmp/sparktestXF497R"}: {}
java.io.FileNotFoundException: /tmp/sparktestXF497R/job_374907c7-30f5-48b9-9de6-f3cc677df2a7/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140239925823232)>

# Thread: <Thread(Thread-119, started daemon 140239648880384)>

# Thread: <_MainThread(MainThread, started 140240444344064)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140239630259968)>

# Thread: <Thread(Thread-125, started daemon 140239638652672)>

# Thread: <Thread(Thread-119, started daemon 140239648880384)>

# Thread: <_MainThread(MainThread, started 140240444344064)>

# Thread: <Thread(wait_until_finish_read, started daemon 140239925823232)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575491713.33_7b1ea2e6-8a34-43f5-81ee-cd1f99a33812 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 355.118s

FAILED (errors=3, skipped=9)
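The two BaseException failures above come from the test harness's watchdog: portable_runner_test.py (line 75 in the tracebacks) installs a SIGALRM handler that dumps all live threads and then raises BaseException, which ordinary `except Exception:` blocks in the code under test cannot swallow. A minimal sketch of that pattern, assuming a 60-second budget (the dumps interleave with other output when the alarm fires mid-print, which is why the log above is garbled):

```python
import signal
import threading

TIMEOUT_SECONDS = 60

def timeout_handler(signum, frame):
    # Fires on SIGALRM: report every live thread, then abort the test.
    msg = 'Timed out after %d seconds.' % TIMEOUT_SECONDS
    print('==================== %s ====================' % msg)
    for t in threading.enumerate():
        print('# Thread: %s' % t)  # mirrors the "# Thread:" lines in the log
    # BaseException (not Exception) so generic except clauses cannot eat it.
    raise BaseException(msg)

signal.signal(signal.SIGALRM, timeout_handler)
# signal.alarm(TIMEOUT_SECONDS) would arm the watchdog before running a test.
```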

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 51s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/f4ns76y4jvqju

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1692

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1692/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 18:21:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:32909
19/12/04 18:21:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 18:21:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 18:21:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 18:21:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575483714.08', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58617', 'job_port': u'0'}
19/12/04 18:21:56 INFO statecache.__init__: Creating state cache with size 0
19/12/04 18:21:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39321.
19/12/04 18:21:56 INFO sdk_worker.__init__: Control channel established.
19/12/04 18:21:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 18:21:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42065.
19/12/04 18:21:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 18:21:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:42201
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 18:21:56 INFO sdk_worker.run: No more requests from control plane
19/12/04 18:21:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 18:21:56 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 18:21:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 18:21:56 INFO sdk_worker.run: Done consuming work.
19/12/04 18:21:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 18:21:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 18:21:57 INFO sdk_worker_main.main: Logging handler created.
19/12/04 18:21:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:41679
19/12/04 18:21:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 18:21:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 18:21:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 18:21:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575483714.08', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58617', 'job_port': u'0'}
19/12/04 18:21:57 INFO statecache.__init__: Creating state cache with size 0
19/12/04 18:21:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38849.
19/12/04 18:21:57 INFO sdk_worker.__init__: Control channel established.
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 18:21:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 18:21:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42513.
19/12/04 18:21:57 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 18:21:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:40479
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 18:21:57 INFO sdk_worker.run: No more requests from control plane
19/12/04 18:21:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 18:21:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:57 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 18:21:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 18:21:57 INFO sdk_worker.run: Done consuming work.
19/12/04 18:21:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 18:21:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 18:21:58 INFO sdk_worker_main.main: Logging handler created.
19/12/04 18:21:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:33177
19/12/04 18:21:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 18:21:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 18:21:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 18:21:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575483714.08', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58617', 'job_port': u'0'}
19/12/04 18:21:58 INFO statecache.__init__: Creating state cache with size 0
19/12/04 18:21:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34933.
19/12/04 18:21:58 INFO sdk_worker.__init__: Control channel established.
19/12/04 18:21:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 18:21:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39035.
19/12/04 18:21:58 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 18:21:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:42339
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 18:21:58 INFO sdk_worker.run: No more requests from control plane
19/12/04 18:21:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 18:21:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 18:21:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 18:21:58 INFO sdk_worker.run: Done consuming work.
19/12/04 18:21:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 18:21:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 18:21:59 INFO sdk_worker_main.main: Logging handler created.
19/12/04 18:21:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:41455
19/12/04 18:21:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 18:21:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 18:21:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 18:21:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575483714.08', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58617', 'job_port': u'0'}
19/12/04 18:21:59 INFO statecache.__init__: Creating state cache with size 0
19/12/04 18:21:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34365.
19/12/04 18:21:59 INFO sdk_worker.__init__: Control channel established.
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 18:21:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 18:21:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44115.
19/12/04 18:21:59 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 18:21:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:37205
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 18:21:59 INFO sdk_worker.run: No more requests from control plane
19/12/04 18:21:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 18:21:59 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 18:21:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 18:21:59 INFO sdk_worker.run: Done consuming work.
19/12/04 18:21:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 18:21:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 18:21:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 18:21:59 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575483714.08_63100ec5-fdc3-4adf-bd50-6e11b275fb44 finished.
19/12/04 18:21:59 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 18:21:59 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9efbd784-b92d-4c0a-92d5-23fe331890cc","basePath":"/tmp/sparktesthIEktb"}: {}
java.io.FileNotFoundException: /tmp/sparktesthIEktb/job_9efbd784-b92d-4c0a-92d5-23fe331890cc/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139944758126336)>
# Thread: <Thread(Thread-118, started daemon 139944749733632)>
# Thread: <_MainThread(MainThread, started 139945546258176)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139944127362816)>
# Thread: <Thread(Thread-124, started daemon 139944740292352)>
# Thread: <_MainThread(MainThread, started 139945546258176)>
# Thread: <Thread(Thread-118, started daemon 139944749733632)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575483704.65_6c1f8825-ec69-4452-a1a3-4756f4e7c47f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 323.691s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 36s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/znwea4wcnnbi2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1691

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1691/display/redirect?page=changes>

Changes:

[crites] Adds translation support for TestStream to Dataflow Java runner.

[crites] Formatting cleanup using gradlew spotnessApply.


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 17:25:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:33789
19/12/04 17:25:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 17:25:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 17:25:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 17:25:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575480314.42', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33645', 'job_port': u'0'}
19/12/04 17:25:16 INFO statecache.__init__: Creating state cache with size 0
19/12/04 17:25:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34211.
19/12/04 17:25:16 INFO sdk_worker.__init__: Control channel established.
19/12/04 17:25:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 17:25:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 17:25:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34839.
19/12/04 17:25:16 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 17:25:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:41425
19/12/04 17:25:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 17:25:17 INFO sdk_worker.run: No more requests from control plane
19/12/04 17:25:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 17:25:17 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 17:25:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 17:25:17 INFO sdk_worker.run: Done consuming work.
19/12/04 17:25:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 17:25:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 17:25:17 INFO sdk_worker_main.main: Logging handler created.
19/12/04 17:25:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:36555
19/12/04 17:25:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 17:25:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 17:25:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 17:25:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575480314.42', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33645', 'job_port': u'0'}
19/12/04 17:25:17 INFO statecache.__init__: Creating state cache with size 0
19/12/04 17:25:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33543.
19/12/04 17:25:17 INFO sdk_worker.__init__: Control channel established.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 17:25:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 17:25:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44123.
19/12/04 17:25:17 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 17:25:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:40993
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 17:25:17 INFO sdk_worker.run: No more requests from control plane
19/12/04 17:25:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 17:25:17 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 17:25:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 17:25:17 INFO sdk_worker.run: Done consuming work.
19/12/04 17:25:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 17:25:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 17:25:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 17:25:18 INFO sdk_worker_main.main: Logging handler created.
19/12/04 17:25:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:35015
19/12/04 17:25:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 17:25:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 17:25:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 17:25:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575480314.42', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33645', 'job_port': u'0'}
19/12/04 17:25:18 INFO statecache.__init__: Creating state cache with size 0
19/12/04 17:25:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44905.
19/12/04 17:25:18 INFO sdk_worker.__init__: Control channel established.
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 17:25:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 17:25:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33209.
19/12/04 17:25:18 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 17:25:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:39857
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 17:25:18 INFO sdk_worker.run: No more requests from control plane
19/12/04 17:25:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 17:25:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:18 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 17:25:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 17:25:18 INFO sdk_worker.run: Done consuming work.
19/12/04 17:25:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 17:25:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 17:25:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 17:25:19 INFO sdk_worker_main.main: Logging handler created.
19/12/04 17:25:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:45771
19/12/04 17:25:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 17:25:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 17:25:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 17:25:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575480314.42', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33645', 'job_port': u'0'}
19/12/04 17:25:19 INFO statecache.__init__: Creating state cache with size 0
19/12/04 17:25:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34843.
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 17:25:19 INFO sdk_worker.__init__: Control channel established.
19/12/04 17:25:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 17:25:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33845.
19/12/04 17:25:19 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 17:25:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:42749
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 17:25:19 INFO sdk_worker.run: No more requests from control plane
19/12/04 17:25:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 17:25:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:19 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 17:25:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 17:25:19 INFO sdk_worker.run: Done consuming work.
19/12/04 17:25:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 17:25:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 17:25:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 17:25:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575480314.42_1b58c62c-ffc5-4f99-82c3-bc99c25b4784 finished.
19/12/04 17:25:19 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 17:25:19 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_0b0fe65a-ecc9-40fb-8b44-e93e64c64324","basePath":"/tmp/sparktestZ9A4JM"}: {}
java.io.FileNotFoundException: /tmp/sparktestZ9A4JM/job_0b0fe65a-ecc9-40fb-8b44-e93e64c64324/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139705756677888)>

# Thread: <Thread(Thread-119, started daemon 139706107401984)>

# Thread: <_MainThread(MainThread, started 139706886641408)>
==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575480305.03_bdef18ec-859e-483e-850b-73a697fe50a2 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 139705739892480)>

# Thread: <Thread(Thread-125, started daemon 139705731499776)>

# Thread: <Thread(Thread-119, started daemon 139706107401984)>

# Thread: <_MainThread(MainThread, started 139706886641408)>

----------------------------------------------------------------------
Ran 38 tests in 315.803s

FAILED (errors=3, skipped=9)

# Thread: <Thread(wait_until_finish_read, started daemon 139705756677888)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 39s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/5zmfquzecif5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1690

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1690/display/redirect?page=changes>

Changes:

[suztomo] Hadoop client 2.8

[suztomo] Elasticsearch-hadoop's use of commons-httpclient

[suztomo] Hardcoding dfs.nameservices

[suztomo] Updated comment

[suztomo] Fixed unused import


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 16:52:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:33491
19/12/04 16:52:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 16:52:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 16:52:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
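The "Discarding unparseable args" warning above is benign: pipeline options are parsed leniently, and flags the parser does not recognize are set aside rather than treated as errors. A small sketch of that behavior using argparse.parse_known_args (illustrative flags only, not Beam's real option registry):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--job_endpoint')
parser.add_argument('--sdk_worker_parallelism', type=int)

# Recognized flags are parsed; the rest are returned untouched
# (or, as in the log above, reported as discarded).
args, unknown = parser.parse_known_args([
    '--job_endpoint=localhost:59393',
    '--sdk_worker_parallelism=1',
    '--spark_master=local',          # not declared here -> set aside
    '--enable_spark_metric_sinks',   # not declared here -> set aside
])
print('Discarding unparseable args: %s' % unknown)
```

The set-aside flags are typically runner-side options that the Python SDK harness simply does not know about, which is why the same list repeats on every worker start.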
19/12/04 16:52:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575478347.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59393', 'job_port': u'0'}
19/12/04 16:52:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 16:52:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40291.
19/12/04 16:52:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 16:52:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 16:52:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40911.
19/12/04 16:52:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 16:52:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:41867
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 16:52:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 16:52:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 16:52:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 16:52:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 16:52:30 INFO sdk_worker.run: Done consuming work.
19/12/04 16:52:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 16:52:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 16:52:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 16:52:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:39673
19/12/04 16:52:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 16:52:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 16:52:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 16:52:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575478347.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59393', 'job_port': u'0'}
19/12/04 16:52:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 16:52:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33659.
19/12/04 16:52:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 16:52:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 16:52:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34731.
19/12/04 16:52:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 16:52:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:33181
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 16:52:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 16:52:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 16:52:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 16:52:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 16:52:31 INFO sdk_worker.run: Done consuming work.
19/12/04 16:52:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 16:52:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 16:52:32 INFO sdk_worker_main.main: Logging handler created.
19/12/04 16:52:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:40707
19/12/04 16:52:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 16:52:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 16:52:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 16:52:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575478347.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59393', 'job_port': u'0'}
19/12/04 16:52:32 INFO statecache.__init__: Creating state cache with size 0
19/12/04 16:52:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39851.
19/12/04 16:52:32 INFO sdk_worker.__init__: Control channel established.
19/12/04 16:52:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 16:52:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42809.
19/12/04 16:52:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 16:52:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:40723
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 16:52:32 INFO sdk_worker.run: No more requests from control plane
19/12/04 16:52:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 16:52:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 16:52:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 16:52:32 INFO sdk_worker.run: Done consuming work.
19/12/04 16:52:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 16:52:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:32 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 16:52:33 INFO sdk_worker_main.main: Logging handler created.
19/12/04 16:52:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:41565
19/12/04 16:52:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 16:52:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 16:52:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 16:52:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575478347.63', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59393', 'job_port': u'0'}
19/12/04 16:52:33 INFO statecache.__init__: Creating state cache with size 0
19/12/04 16:52:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36277.
19/12/04 16:52:33 INFO sdk_worker.__init__: Control channel established.
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 16:52:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 16:52:33 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34781.
19/12/04 16:52:33 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 16:52:33 INFO data_plane.create_data_channel: Creating client data channel for localhost:42649
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 16:52:33 INFO sdk_worker.run: No more requests from control plane
19/12/04 16:52:33 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 16:52:33 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 16:52:33 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 16:52:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:33 INFO sdk_worker.run: Done consuming work.
19/12/04 16:52:33 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 16:52:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 16:52:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 16:52:33 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575478347.63_14c488d0-88bc-4fc7-a9af-471a15b9838e finished.
19/12/04 16:52:33 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 16:52:33 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_1c607c90-51cf-4a3e-a5b9-d13f7871e5e2","basePath":"/tmp/sparktestqcNM7G"}: {}
java.io.FileNotFoundException: /tmp/sparktestqcNM7G/job_1c607c90-51cf-4a3e-a5b9-d13f7871e5e2/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
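The FileNotFoundException above is a cleanup race: the staging directory's MANIFEST is already gone by the time removeArtifacts tries to read it, so the job still finishes DONE and the warning is cosmetic. A hypothetical Python sketch of making such cleanup idempotent (not the Java code Beam actually uses; names are illustrative):

```python
import errno
import os
import shutil

def remove_staging_dir(base_path, session_id):
    """Best-effort removal of a job's artifact staging directory.

    A directory (or MANIFEST) that is already gone counts as success,
    so a concurrent or repeated cleanup does not log a spurious error."""
    job_dir = os.path.join(base_path, session_id)
    try:
        shutil.rmtree(job_dir)
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise  # Surface real failures (permissions, I/O), not races.
```

Treating "already removed" as success is the usual design choice for teardown paths that several callbacks may reach independently.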
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139865125283584)>

# Thread: <Thread(Thread-117, started daemon 139865108498176)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575478337.44_133dbe3e-7959-4758-b7d2-051857ce95ed failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <_MainThread(MainThread, started 139865905022720)>

----------------------------------------------------------------------
Ran 38 tests in 338.990s

FAILED (errors=3, skipped=9)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139865090926336)>

# Thread: <Thread(Thread-123, started daemon 139865099319040)>

# Thread: <Thread(Thread-117, started daemon 139865108498176)>

# Thread: <_MainThread(MainThread, started 139865905022720)>

# Thread: <Thread(wait_until_finish_read, started daemon 139865125283584)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 55s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/7g2xypz3sucxg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1689

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1689/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 12:10:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:35519
19/12/04 12:10:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 12:10:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 12:10:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 12:10:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575461426.87', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45413', 'job_port': u'0'}
19/12/04 12:10:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 12:10:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38141.
19/12/04 12:10:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 12:10:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 12:10:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34925.
19/12/04 12:10:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 12:10:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:44557
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 12:10:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 12:10:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 12:10:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 12:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 12:10:29 INFO sdk_worker.run: Done consuming work.
19/12/04 12:10:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 12:10:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 12:10:30 INFO sdk_worker_main.main: Logging handler created.
19/12/04 12:10:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:40119
19/12/04 12:10:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 12:10:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 12:10:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 12:10:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575461426.87', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45413', 'job_port': u'0'}
19/12/04 12:10:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 12:10:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39315.
19/12/04 12:10:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 12:10:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 12:10:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33531.
19/12/04 12:10:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 12:10:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:45907
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 12:10:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 12:10:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 12:10:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 12:10:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 12:10:30 INFO sdk_worker.run: Done consuming work.
19/12/04 12:10:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 12:10:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 12:10:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 12:10:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:34501
19/12/04 12:10:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 12:10:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 12:10:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 12:10:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575461426.87', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45413', 'job_port': u'0'}
19/12/04 12:10:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 12:10:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44337.
19/12/04 12:10:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 12:10:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 12:10:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34801.
19/12/04 12:10:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 12:10:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:42231
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 12:10:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 12:10:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 12:10:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 12:10:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 12:10:31 INFO sdk_worker.run: Done consuming work.
19/12/04 12:10:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 12:10:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 12:10:32 INFO sdk_worker_main.main: Logging handler created.
19/12/04 12:10:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:46801
19/12/04 12:10:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 12:10:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 12:10:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 12:10:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575461426.87', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45413', 'job_port': u'0'}
19/12/04 12:10:32 INFO statecache.__init__: Creating state cache with size 0
19/12/04 12:10:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42001.
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 12:10:32 INFO sdk_worker.__init__: Control channel established.
19/12/04 12:10:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 12:10:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44249.
19/12/04 12:10:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 12:10:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:37121
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 12:10:32 INFO sdk_worker.run: No more requests from control plane
19/12/04 12:10:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 12:10:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 12:10:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 12:10:32 INFO sdk_worker.run: Done consuming work.
19/12/04 12:10:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 12:10:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 12:10:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 12:10:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575461426.87_aaf1a1fa-b749-4e92-a54b-20f9463a5cb4 finished.
19/12/04 12:10:32 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 12:10:32 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d3c627bb-b8a0-41dd-89b2-3d9a35210dd0","basePath":"/tmp/sparktestV7ikWu"}: {}
java.io.FileNotFoundException: /tmp/sparktestV7ikWu/job_d3c627bb-b8a0-41dd-89b2-3d9a35210dd0/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139848725260032)>

# Thread: <Thread(Thread-120, started daemon 139848716867328)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 139849504999168)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139848220403456)>

# Thread: <Thread(Thread-126, started daemon 139848228796160)>

# Thread: <_MainThread(MainThread, started 139849504999168)>

# Thread: <Thread(Thread-120, started daemon 139848716867328)>

# Thread: <Thread(wait_until_finish_read, started daemon 139848725260032)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575461416.93_cf58e32e-8b51-46e2-b0cf-60b7a99bbcfe failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 319.111s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 57s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/hjjisjtwhc74u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1688

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1688/display/redirect?page=changes>

Changes:

[sambvfx] [BEAM-8836] Make ExternalTransform unique_name unique

[sambvfx] add simple unique_name test; remove all uses of

[sambvfx] fixup: pylint fix


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 11:48:14 INFO sdk_worker_main.start: Status HTTP server running at localhost:46113
19/12/04 11:48:14 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:48:14 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:48:14 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:48:14 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575460091.98', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36225', 'job_port': u'0'}
19/12/04 11:48:14 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:48:14 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37995.
19/12/04 11:48:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 11:48:14 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:48:14 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:48:14 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37483.
19/12/04 11:48:14 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:48:14 INFO data_plane.create_data_channel: Creating client data channel for localhost:36253
19/12/04 11:48:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:48:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 11:48:14 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:48:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:48:14 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:48:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:48:14 INFO sdk_worker.run: Done consuming work.
19/12/04 11:48:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:48:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:48:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:48:15 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:48:15 INFO sdk_worker_main.start: Status HTTP server running at localhost:36903
19/12/04 11:48:15 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:48:15 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:48:15 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:48:15 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575460091.98', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36225', 'job_port': u'0'}
19/12/04 11:48:15 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:48:15 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43227.
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 11:48:15 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:48:15 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:48:15 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43895.
19/12/04 11:48:15 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:48:15 INFO data_plane.create_data_channel: Creating client data channel for localhost:34779
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 11:48:15 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:48:15 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:48:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:15 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:48:15 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:48:15 INFO sdk_worker.run: Done consuming work.
19/12/04 11:48:15 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:48:15 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:48:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:48:16 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:48:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:33577
19/12/04 11:48:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:48:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:48:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:48:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575460091.98', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36225', 'job_port': u'0'}
19/12/04 11:48:16 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:48:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38715.
19/12/04 11:48:16 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:48:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 11:48:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33415.
19/12/04 11:48:16 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:48:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:38723
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/04 11:48:16 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:48:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:48:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:16 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:48:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:48:16 INFO sdk_worker.run: Done consuming work.
19/12/04 11:48:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:48:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:48:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:48:17 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:48:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:44317
19/12/04 11:48:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:48:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:48:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:48:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575460091.98', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36225', 'job_port': u'0'}
19/12/04 11:48:17 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:48:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38457.
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 11:48:17 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:48:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:48:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38275.
19/12/04 11:48:17 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:48:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:42971
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 11:48:17 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:48:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:48:17 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:48:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:48:17 INFO sdk_worker.run: Done consuming work.
19/12/04 11:48:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:48:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:48:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:48:18 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575460091.98_e565b90c-7d30-49fe-82ab-72eaf5d26b41 finished.
19/12/04 11:48:18 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 11:48:18 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_1b31baff-346a-4108-8033-b47e51230def","basePath":"/tmp/sparktestgqYk9O"}: {}
java.io.FileNotFoundException: /tmp/sparktestgqYk9O/job_1b31baff-346a-4108-8033-b47e51230def/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
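
(Editor's note: the FileNotFoundException above is the job service's cleanup step reading a MANIFEST that was never written, because the test staged no artifacts — see the earlier "Manifest ... has 0 artifact locations" lines. A minimal sketch of the defensive pattern, in Python; `remove_artifacts` is a hypothetical helper, not Beam's actual BeamFileSystemArtifactStagingService.removeArtifacts.)

```python
import os
import shutil


def remove_artifacts(staging_dir):
    """Best-effort cleanup of a job staging directory.

    Skips quietly when no MANIFEST was ever written (i.e. nothing was
    staged), instead of failing with a FileNotFoundException as in the
    stack trace above. Hypothetical helper for illustration only.
    """
    manifest = os.path.join(staging_dir, "MANIFEST")
    if not os.path.exists(manifest):
        return False  # nothing staged; nothing to clean up
    shutil.rmtree(staging_dir)
    return True
```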
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
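
(Editor's note: the traceback above ends in the test suite's watchdog — `handler` at portable_runner_test.py line 75 raises BaseException from a SIGALRM handler so the timeout escapes any `except Exception` blocks in the code under test. A minimal sketch of that pattern; `install_timeout`/`cancel_timeout` are hypothetical names, and signal.alarm is Unix-only.)

```python
import signal


def install_timeout(seconds):
    """Install a SIGALRM handler that aborts a hung test after `seconds`.

    BaseException (not Exception) is deliberate: it propagates through
    broad `except Exception` clauses in the code under test.
    """
    msg = "Timed out after %d seconds." % seconds

    def handler(signum, frame):
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)


def cancel_timeout():
    signal.alarm(0)  # clear any pending alarm
```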

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

(Concurrent timeout/thread-dump output, originally interleaved with the traceback above:)

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139858077013760)>

# Thread: <Thread(Thread-117, started daemon 139858085406464)>

# Thread: <_MainThread(MainThread, started 139859215615744)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139858060228352)>

# Thread: <Thread(Thread-123, started daemon 139858068621056)>

# Thread: <_MainThread(MainThread, started 139859215615744)>

# Thread: <Thread(Thread-117, started daemon 139858085406464)>

# Thread: <Thread(wait_until_finish_read, started daemon 139858077013760)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575460080.94_2aadb51c-66d6-48a7-ad23-a04f4fbab691 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
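
(Editor's note: test_sdf_with_watermark_tracking fails because the splittable DoFn requests a bundle checkpoint and the Spark portable runner never registers a checkpoint handler on the bundle. A minimal Python analogue of that guard, hypothetical and for illustration only — the real ActiveBundle is Java and lives in the fn-execution control layer.)

```python
class ActiveBundle(object):
    """Sketch of a bundle that may or may not support checkpointing.

    Runners that do not implement splittable-DoFn checkpointing simply
    register no handler, and any checkpoint request fails with the same
    message seen in the error above.
    """

    def __init__(self, checkpoint_handler=None):
        self._checkpoint_handler = checkpoint_handler

    def checkpoint(self, residual):
        if self._checkpoint_handler is None:
            raise NotImplementedError(
                "The ActiveBundle does not have a registered "
                "bundle checkpoint handler.")
        self._checkpoint_handler(residual)
```

A runner that supports SDF checkpointing would pass a real handler; one that does not (the Spark portable runner, as of this build) leaves it unset.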

----------------------------------------------------------------------
Ran 38 tests in 336.104s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 15s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/ryigf7fbujw3o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1687

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1687/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-8869] Exclude system metrics test from legacy runner test suite


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 11:32:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:37931
19/12/04 11:32:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:32:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:32:34 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:32:34 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575459151.68', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53997', 'job_port': u'0'}
19/12/04 11:32:34 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:32:34 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42157.
19/12/04 11:32:34 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 11:32:34 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:32:34 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:32:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40603.
19/12/04 11:32:34 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:32:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:38579
19/12/04 11:32:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:32:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 11:32:34 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:32:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:32:35 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:32:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:32:35 INFO sdk_worker.run: Done consuming work.
19/12/04 11:32:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:32:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:32:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:32:36 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:32:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:46191
19/12/04 11:32:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:32:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:32:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:32:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575459151.68', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53997', 'job_port': u'0'}
19/12/04 11:32:36 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:32:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41695.
19/12/04 11:32:36 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 11:32:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:32:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44641.
19/12/04 11:32:36 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:32:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:33221
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 11:32:36 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:32:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:32:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:36 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:32:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:32:36 INFO sdk_worker.run: Done consuming work.
19/12/04 11:32:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:32:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:32:37 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:32:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:34587
19/12/04 11:32:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:32:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:32:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:32:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575459151.68', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53997', 'job_port': u'0'}
19/12/04 11:32:37 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:32:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38809.
19/12/04 11:32:37 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:32:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 11:32:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45645.
19/12/04 11:32:37 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:32:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:43051
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 11:32:37 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:32:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:32:37 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:32:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:32:37 INFO sdk_worker.run: Done consuming work.
19/12/04 11:32:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:32:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 11:32:38 INFO sdk_worker_main.main: Logging handler created.
19/12/04 11:32:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:32871
19/12/04 11:32:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 11:32:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 11:32:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 11:32:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575459151.68', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53997', 'job_port': u'0'}
19/12/04 11:32:38 INFO statecache.__init__: Creating state cache with size 0
19/12/04 11:32:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40209.
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 11:32:38 INFO sdk_worker.__init__: Control channel established.
19/12/04 11:32:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 11:32:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34555.
19/12/04 11:32:38 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 11:32:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:42019
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 11:32:38 INFO sdk_worker.run: No more requests from control plane
19/12/04 11:32:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 11:32:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:38 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 11:32:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 11:32:38 INFO sdk_worker.run: Done consuming work.
19/12/04 11:32:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 11:32:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 11:32:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 11:32:38 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575459151.68_54d713d7-34cd-4b79-aea8-b202a6506e79 finished.
19/12/04 11:32:38 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 11:32:38 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f099d220-a764-45ad-afe6-50a69ed4ffcb","basePath":"/tmp/sparktestMabO58"}: {}
java.io.FileNotFoundException: /tmp/sparktestMabO58/job_f099d220-a764-45ad-afe6-50a69ed4ffcb/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

(Concurrent timeout/thread-dump output, originally interleaved with the traceback above:)

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140525282371328)>

# Thread: <Thread(Thread-119, started daemon 140525273978624)>

# Thread: <_MainThread(MainThread, started 140526069442304)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140525255096064)>

# Thread: <Thread(Thread-125, started daemon 140525263750912)>

# Thread: <Thread(Thread-119, started daemon 140525273978624)>

# Thread: <Thread(wait_until_finish_read, started daemon 140525282371328)>

# Thread: <_MainThread(MainThread, started 140526069442304)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575459140.28_8bdce242-9f94-4309-bd88-c90a9ab91b2a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 370.820s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 20s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/gsnibvhy74fqo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1686

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1686/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8883] downgrade 'Failed to remove job staging directory' log level


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 09:29:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:41711
19/12/04 09:29:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 09:29:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 09:29:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 09:29:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575451777.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60061', 'job_port': u'0'}
19/12/04 09:29:40 INFO statecache.__init__: Creating state cache with size 0
19/12/04 09:29:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39997.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 09:29:40 INFO sdk_worker.__init__: Control channel established.
19/12/04 09:29:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 09:29:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36025.
19/12/04 09:29:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 09:29:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:43165
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 09:29:40 INFO sdk_worker.run: No more requests from control plane
19/12/04 09:29:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 09:29:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 09:29:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 09:29:40 INFO sdk_worker.run: Done consuming work.
19/12/04 09:29:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 09:29:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 09:29:40 INFO sdk_worker_main.main: Logging handler created.
19/12/04 09:29:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:41625
19/12/04 09:29:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 09:29:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 09:29:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 09:29:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575451777.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60061', 'job_port': u'0'}
19/12/04 09:29:40 INFO statecache.__init__: Creating state cache with size 0
19/12/04 09:29:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40019.
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 09:29:40 INFO sdk_worker.__init__: Control channel established.
19/12/04 09:29:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 09:29:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36075.
19/12/04 09:29:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 09:29:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:44647
19/12/04 09:29:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 09:29:41 INFO sdk_worker.run: No more requests from control plane
19/12/04 09:29:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 09:29:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 09:29:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 09:29:41 INFO sdk_worker.run: Done consuming work.
19/12/04 09:29:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 09:29:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 09:29:41 INFO sdk_worker_main.main: Logging handler created.
19/12/04 09:29:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:33333
19/12/04 09:29:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 09:29:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 09:29:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 09:29:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575451777.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60061', 'job_port': u'0'}
19/12/04 09:29:41 INFO statecache.__init__: Creating state cache with size 0
19/12/04 09:29:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34073.
19/12/04 09:29:41 INFO sdk_worker.__init__: Control channel established.
19/12/04 09:29:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 09:29:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36519.
19/12/04 09:29:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 09:29:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:42851
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 09:29:41 INFO sdk_worker.run: No more requests from control plane
19/12/04 09:29:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 09:29:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 09:29:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 09:29:41 INFO sdk_worker.run: Done consuming work.
19/12/04 09:29:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 09:29:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 09:29:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 09:29:42 INFO sdk_worker_main.main: Logging handler created.
19/12/04 09:29:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:41501
19/12/04 09:29:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 09:29:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 09:29:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 09:29:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575451777.47', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60061', 'job_port': u'0'}
19/12/04 09:29:42 INFO statecache.__init__: Creating state cache with size 0
19/12/04 09:29:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40855.
19/12/04 09:29:42 INFO sdk_worker.__init__: Control channel established.
19/12/04 09:29:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 09:29:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33813.
19/12/04 09:29:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 09:29:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:40913
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 09:29:42 INFO sdk_worker.run: No more requests from control plane
19/12/04 09:29:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 09:29:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 09:29:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 09:29:42 INFO sdk_worker.run: Done consuming work.
19/12/04 09:29:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 09:29:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 09:29:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 09:29:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575451777.47_e6b34f60-9214-4951-9c9a-b6fc3b11ffca finished.
19/12/04 09:29:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 09:29:42 WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_8785425c-a7e2-4aed-91a8-25eafc25ba56","basePath":"/tmp/sparktestiHzXMo"}: {}
java.io.FileNotFoundException: /tmp/sparktestiHzXMo/job_8785425c-a7e2-4aed-91a8-25eafc25ba56/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(wait_until_finish_read, started daemon 140666172253952)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(Thread-119, started daemon 140666163861248)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
# Thread: <_MainThread(MainThread, started 140666951993088)>
==================== Timed out after 60 seconds. ====================

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 140665673475840)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <Thread(Thread-125, started daemon 140665665083136)>

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
# Thread: <_MainThread(MainThread, started 140666951993088)>

BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-119, started daemon 140666163861248)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(wait_until_finish_read, started daemon 140666172253952)>
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575451767.93_cf7097d7-eda4-4a7b-b4f8-cb30c67f4d03 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 322.151s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 58s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/gmgru5yqv2me4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1685

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1685/display/redirect>

Changes:


------------------------------------------
[...truncated 1.33 MB...]

19/12/04 06:52:25 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 06:52:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:25 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 06:52:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 06:52:26 INFO sdk_worker_main.main: Logging handler created.
19/12/04 06:52:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:41731
19/12/04 06:52:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 06:52:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 06:52:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 06:52:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575442343.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40217', 'job_port': u'0'}
19/12/04 06:52:26 INFO statecache.__init__: Creating state cache with size 0
19/12/04 06:52:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45521.
19/12/04 06:52:26 INFO sdk_worker.__init__: Control channel established.
19/12/04 06:52:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 06:52:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45179.
19/12/04 06:52:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 06:52:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:40911
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 06:52:26 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 06:52:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:26 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 06:52:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 06:52:27 INFO sdk_worker_main.main: Logging handler created.
19/12/04 06:52:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:39183
19/12/04 06:52:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 06:52:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 06:52:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 06:52:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575442343.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40217', 'job_port': u'0'}
19/12/04 06:52:27 INFO statecache.__init__: Creating state cache with size 0
19/12/04 06:52:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37041.
19/12/04 06:52:27 INFO sdk_worker.__init__: Control channel established.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 06:52:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 06:52:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39987.
19/12/04 06:52:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 06:52:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:44103
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 06:52:27 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 06:52:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:27 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 06:52:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 06:52:28 INFO sdk_worker_main.main: Logging handler created.
19/12/04 06:52:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:46015
19/12/04 06:52:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 06:52:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 06:52:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 06:52:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575442343.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40217', 'job_port': u'0'}
19/12/04 06:52:28 INFO statecache.__init__: Creating state cache with size 0
19/12/04 06:52:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39091.
19/12/04 06:52:28 INFO sdk_worker.__init__: Control channel established.
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 06:52:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 06:52:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41299.
19/12/04 06:52:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 06:52:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:46159
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 06:52:28 INFO sdk_worker.run: No more requests from control plane
19/12/04 06:52:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 06:52:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 06:52:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 06:52:28 INFO sdk_worker.run: Done consuming work.
19/12/04 06:52:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 06:52:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 06:52:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 06:52:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575442343.27_79e7d77d-f27a-4083-ab3c-d43dbbe2327d finished.
19/12/04 06:52:28 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 06:52:28 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_bb58eeaa-3c93-4deb-9b1f-b6b968d0f22d","basePath":"/tmp/sparktestDp1ByL"}: {}
java.io.FileNotFoundException: /tmp/sparktestDp1ByL/job_bb58eeaa-3c93-4deb-9b1f-b6b968d0f22d/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139752301029120)>

# Thread: <Thread(Thread-119, started daemon 139752292636416)>

# Thread: <_MainThread(MainThread, started 139753088661248)>

==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575442334.21_81cdc114-5806-4fc0-b59d-23b2199121a0 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 139752274278144)>

# Thread: <Thread(Thread-125, started daemon 139752282932992)>

# Thread: <_MainThread(MainThread, started 139753088661248)>

# Thread: <Thread(Thread-119, started daemon 139752292636416)>

# Thread: <Thread(wait_until_finish_read, started daemon 139752301029120)>
----------------------------------------------------------------------
Ran 38 tests in 318.360s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 5s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: a1a7bad1-415c-4610-862c-59b434734332
Response status code: 502
Response content type: text/html; charset=UTF-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1684

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1684/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 04:58:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:33121
19/12/04 04:58:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 04:58:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 04:58:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 04:58:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575435506.04', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54053', 'job_port': u'0'}
19/12/04 04:58:28 INFO statecache.__init__: Creating state cache with size 0
19/12/04 04:58:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35433.
19/12/04 04:58:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 04:58:28 INFO sdk_worker.__init__: Control channel established.
19/12/04 04:58:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 04:58:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34081.
19/12/04 04:58:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 04:58:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:45463
19/12/04 04:58:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 04:58:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 04:58:28 INFO sdk_worker.run: No more requests from control plane
19/12/04 04:58:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 04:58:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 04:58:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 04:58:28 INFO sdk_worker.run: Done consuming work.
19/12/04 04:58:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 04:58:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 04:58:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 04:58:29 INFO sdk_worker_main.main: Logging handler created.
19/12/04 04:58:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:34309
19/12/04 04:58:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 04:58:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 04:58:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 04:58:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575435506.04', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54053', 'job_port': u'0'}
19/12/04 04:58:29 INFO statecache.__init__: Creating state cache with size 0
19/12/04 04:58:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46679.
19/12/04 04:58:29 INFO sdk_worker.__init__: Control channel established.
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 04:58:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 04:58:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46499.
19/12/04 04:58:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 04:58:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:45663
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 04:58:29 INFO sdk_worker.run: No more requests from control plane
19/12/04 04:58:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 04:58:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 04:58:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 04:58:29 INFO sdk_worker.run: Done consuming work.
19/12/04 04:58:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 04:58:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 04:58:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 04:58:30 INFO sdk_worker_main.main: Logging handler created.
19/12/04 04:58:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:34819
19/12/04 04:58:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 04:58:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 04:58:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 04:58:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575435506.04', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54053', 'job_port': u'0'}
19/12/04 04:58:30 INFO statecache.__init__: Creating state cache with size 0
19/12/04 04:58:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35257.
19/12/04 04:58:30 INFO sdk_worker.__init__: Control channel established.
19/12/04 04:58:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 04:58:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37191.
19/12/04 04:58:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 04:58:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:36143
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 04:58:30 INFO sdk_worker.run: No more requests from control plane
19/12/04 04:58:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 04:58:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 04:58:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 04:58:30 INFO sdk_worker.run: Done consuming work.
19/12/04 04:58:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 04:58:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 04:58:31 INFO sdk_worker_main.main: Logging handler created.
19/12/04 04:58:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:43261
19/12/04 04:58:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 04:58:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 04:58:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 04:58:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575435506.04', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54053', 'job_port': u'0'}
19/12/04 04:58:31 INFO statecache.__init__: Creating state cache with size 0
19/12/04 04:58:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36175.
19/12/04 04:58:31 INFO sdk_worker.__init__: Control channel established.
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 04:58:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 04:58:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36539.
19/12/04 04:58:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 04:58:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:46839
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/04 04:58:31 INFO sdk_worker.run: No more requests from control plane
19/12/04 04:58:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 04:58:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 04:58:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 04:58:31 INFO sdk_worker.run: Done consuming work.
19/12/04 04:58:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 04:58:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 04:58:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 04:58:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575435506.04_1a77ed46-84e8-4661-8b1d-2474e24a29cf finished.
19/12/04 04:58:31 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 04:58:31 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d8a7a09d-4a6c-40b2-a70c-0af5cf17f268","basePath":"/tmp/sparktestdgvhvE"}: {}
java.io.FileNotFoundException: /tmp/sparktestdgvhvE/job_d8a7a09d-4a6c-40b2-a70c-0af5cf17f268/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
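The `BaseException: Timed out after 60 seconds.` above is raised by a watchdog `handler` in `portable_runner_test.py` so that a hung gRPC state stream cannot stall the suite indefinitely. A minimal sketch of the pattern, assuming a Unix signal-based watchdog (`TestTimeout` is a hypothetical name, not the test's actual helper); it raises `BaseException` rather than `Exception` so that broad `except Exception:` blocks in the code under test cannot swallow it:

```python
import signal

class TestTimeout(object):
    """Context manager that aborts the enclosed block after `seconds`."""

    def __init__(self, seconds):
        self.seconds = seconds

    def _handler(self, signum, frame):
        # BaseException deliberately bypasses `except Exception:` handlers.
        raise BaseException('Timed out after %d seconds.' % self.seconds)

    def __enter__(self):
        signal.signal(signal.SIGALRM, self._handler)
        signal.alarm(self.seconds)  # schedule SIGALRM
        return self

    def __exit__(self, *exc):
        signal.alarm(0)  # cancel the pending alarm on normal exit
        return False
```

`signal.alarm` is Unix-only and only works in the main thread, which matches how such a watchdog is typically installed in a test runner.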

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140241098508032)>
# Thread: <Thread(Thread-119, started daemon 140241106900736)>
# Thread: <_MainThread(MainThread, started 140241895032576)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140240614323968)>
# Thread: <Thread(Thread-125, started daemon 140240605931264)>
# Thread: <Thread(Thread-119, started daemon 140241106900736)>
# Thread: <_MainThread(MainThread, started 140241895032576)>
# Thread: <Thread(wait_until_finish_read, started daemon 140241098508032)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575435496.4_09206508-b8cf-48f8-9ca0-59bc5bda2f98 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
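The `# Thread: <...>` lines mixed into the tracebacks are a dump of all live threads, emitted when the timeout watchdog fires so the hang site can be located. A sketch of how such a dump can be produced (hypothetical helper, not the test suite's actual code):

```python
import sys
import threading
import traceback

def dump_threads(out=sys.stderr):
    """Write every live thread, and its current stack if available,
    in the same '# Thread: <...>' style seen in the log above."""
    frames = sys._current_frames()  # maps thread id -> topmost frame
    for t in threading.enumerate():
        out.write('# Thread: %r\n' % t)
        frame = frames.get(t.ident)
        if frame is not None:
            traceback.print_stack(frame, file=out)
```

Interleaving like the above happens because the dump is written concurrently with the traceback being printed by the failing test.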

----------------------------------------------------------------------
Ran 38 tests in 313.066s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 2s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/eymadeyilpyms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1683

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1683/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8489] Filter: don't use callable's output type

[lostluck] [GoSDK] Handle data write errors & stream recreate

[github] [BEAM-8835] Disable Flink Uber Jar by default. (#10270)

[lostluck] [GoSDK] Cancel stream context on dataWriter error

[github] [BEAM-8651] [BEAM-8874] Change pickle_lock to be a reentrant lock, and

[lostluck] [GoSDK] Don't panic if debug symbols are striped

[lcwik] [BEAM-8523] Regenerate Go protos with respect to changes in #9959


------------------------------------------
[...truncated 1.32 MB...]
19/12/04 01:40:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:35573
19/12/04 01:40:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 01:40:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 01:40:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 01:40:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575423603.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47731', 'job_port': u'0'}
19/12/04 01:40:08 INFO statecache.__init__: Creating state cache with size 0
19/12/04 01:40:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39001.
19/12/04 01:40:08 INFO sdk_worker.__init__: Control channel established.
19/12/04 01:40:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/04 01:40:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38877.
19/12/04 01:40:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 01:40:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:40891
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 01:40:08 INFO sdk_worker.run: No more requests from control plane
19/12/04 01:40:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 01:40:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 01:40:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 01:40:08 INFO sdk_worker.run: Done consuming work.
19/12/04 01:40:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 01:40:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 01:40:09 INFO sdk_worker_main.main: Logging handler created.
19/12/04 01:40:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:46009
19/12/04 01:40:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 01:40:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 01:40:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 01:40:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575423603.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47731', 'job_port': u'0'}
19/12/04 01:40:09 INFO statecache.__init__: Creating state cache with size 0
19/12/04 01:40:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33879.
19/12/04 01:40:09 INFO sdk_worker.__init__: Control channel established.
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/04 01:40:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 01:40:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34221.
19/12/04 01:40:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 01:40:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:35113
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 01:40:09 INFO sdk_worker.run: No more requests from control plane
19/12/04 01:40:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 01:40:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 01:40:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 01:40:09 INFO sdk_worker.run: Done consuming work.
19/12/04 01:40:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 01:40:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 01:40:10 INFO sdk_worker_main.main: Logging handler created.
19/12/04 01:40:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:32899
19/12/04 01:40:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 01:40:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 01:40:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 01:40:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575423603.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47731', 'job_port': u'0'}
19/12/04 01:40:10 INFO statecache.__init__: Creating state cache with size 0
19/12/04 01:40:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39949.
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/04 01:40:10 INFO sdk_worker.__init__: Control channel established.
19/12/04 01:40:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 01:40:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40095.
19/12/04 01:40:10 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 01:40:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:35075
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 01:40:10 INFO sdk_worker.run: No more requests from control plane
19/12/04 01:40:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 01:40:10 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 01:40:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 01:40:10 INFO sdk_worker.run: Done consuming work.
19/12/04 01:40:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 01:40:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/04 01:40:11 INFO sdk_worker_main.main: Logging handler created.
19/12/04 01:40:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:35985
19/12/04 01:40:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/04 01:40:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/04 01:40:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/04 01:40:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575423603.25', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47731', 'job_port': u'0'}
19/12/04 01:40:11 INFO statecache.__init__: Creating state cache with size 0
19/12/04 01:40:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39537.
19/12/04 01:40:11 INFO sdk_worker.__init__: Control channel established.
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/04 01:40:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/04 01:40:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35983.
19/12/04 01:40:11 INFO sdk_worker.create_state_handler: State channel established.
19/12/04 01:40:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:45017
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/04 01:40:11 INFO sdk_worker.run: No more requests from control plane
19/12/04 01:40:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/04 01:40:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:11 INFO data_plane.close: Closing all cached grpc data channels.
19/12/04 01:40:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/04 01:40:11 INFO sdk_worker.run: Done consuming work.
19/12/04 01:40:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/04 01:40:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/04 01:40:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/04 01:40:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575423603.25_e9c31a43-4921-421e-b4c0-3413b1f8578c finished.
19/12/04 01:40:11 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/04 01:40:11 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_c80dd2ea-4386-48b6-8a25-4f701311bb19","basePath":"/tmp/sparktestm1A3Zb"}: {}
java.io.FileNotFoundException: /tmp/sparktestm1A3Zb/job_c80dd2ea-4386-48b6-8a25-4f701311bb19/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140031056205568)>
# Thread: <Thread(Thread-120, started daemon 140031047812864)>
# Thread: <_MainThread(MainThread, started 140031835944704)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140030949447424)>
# Thread: <Thread(Thread-126, started daemon 140030957840128)>
# Thread: <Thread(Thread-120, started daemon 140031047812864)>
# Thread: <_MainThread(MainThread, started 140031835944704)>
# Thread: <Thread(wait_until_finish_read, started daemon 140031056205568)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575423581.94_481a58c0-479e-49d5-ad31-857ba4bf39f8 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 358.403s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 39s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/usp7qwxak4whw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1682

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1682/display/redirect?page=changes>

Changes:

[rohde.samuel] change definition of has_unbounded_sources in PIN to a pre-determined

[rohde.samuel] typo

[rohde.samuel] lint

[rohde.samuel] remove BigQueryReader from list

[rohde.samuel] lint

[rohde.samuel] remove external

[rohde.samuel] remove external

[github] Merge pull request #10248: [BEAM-7274] Add type conversions factory

[chamikara] Merge pull request #10262: [BEAM-8575] Revert validates runner test tag


------------------------------------------
[...truncated 1.31 MB...]
19/12/03 21:51:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6 on Spark master local
19/12/03 21:51:57 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/03 21:51:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6: Pipeline translated successfully. Computing outputs
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:51:58 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:51:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:42641
19/12/03 21:51:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:51:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:51:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:51:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:51:58 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:51:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36467.
19/12/03 21:51:58 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/03 21:51:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:51:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41631.
19/12/03 21:51:58 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:51:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:43143
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 21:51:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:51:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:51:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:51:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:51:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:51:58 INFO sdk_worker.run: Done consuming work.
19/12/03 21:51:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:51:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:51:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:51:59 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:51:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:45779
19/12/03 21:51:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:51:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:51:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:51:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:51:59 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:51:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34965.
19/12/03 21:51:59 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 21:51:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:51:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38899.
19/12/03 21:51:59 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:51:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:46697
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 21:51:59 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:51:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:51:59 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:51:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:51:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:51:59 INFO sdk_worker.run: Done consuming work.
19/12/03 21:51:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:51:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:51:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:52:00 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:52:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:44921
19/12/03 21:52:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:52:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:52:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:52:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:52:00 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:52:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43533.
19/12/03 21:52:00 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 21:52:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:52:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46109.
19/12/03 21:52:00 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:52:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:34391
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 21:52:00 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:52:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:52:00 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:52:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:52:00 INFO sdk_worker.run: Done consuming work.
19/12/03 21:52:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:52:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:52:01 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:52:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:34535
19/12/03 21:52:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:52:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:52:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:52:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:52:01 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:52:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38479.
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 21:52:01 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:52:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:52:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44381.
19/12/03 21:52:01 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:52:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:35977
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 21:52:01 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:52:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:52:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:01 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:52:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:52:01 INFO sdk_worker.run: Done consuming work.
19/12/03 21:52:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:52:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 21:52:02 INFO sdk_worker_main.main: Logging handler created.
19/12/03 21:52:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:37107
19/12/03 21:52:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 21:52:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 21:52:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 21:52:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575409916.95', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53383', 'job_port': u'0'}
19/12/03 21:52:02 INFO statecache.__init__: Creating state cache with size 0
19/12/03 21:52:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41631.
19/12/03 21:52:02 INFO sdk_worker.__init__: Control channel established.
19/12/03 21:52:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 21:52:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45533.
19/12/03 21:52:02 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 21:52:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:36265
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 21:52:02 INFO sdk_worker.run: No more requests from control plane
19/12/03 21:52:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 21:52:02 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 21:52:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 21:52:02 INFO sdk_worker.run: Done consuming work.
19/12/03 21:52:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 21:52:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 21:52:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 21:52:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575409916.95_17d55ea6-f15e-4fb4-921d-c3ebbaeb60c6 finished.
19/12/03 21:52:02 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 21:52:02 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_5b6fcb0f-ac5f-4898-b636-f5b177eeabc5","basePath":"/tmp/sparktest7B3eMY"}: {}
java.io.FileNotFoundException: /tmp/sparktest7B3eMY/job_5b6fcb0f-ac5f-4898-b636-f5b177eeabc5/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140239455958784)>
# Thread: <Thread(Thread-120, started daemon 140239447566080)>
# Thread: <_MainThread(MainThread, started 140240235407104)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575409907.9_370c1377-80e9-4954-b0b6-c5e573567446 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.134s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 38s
60 actionable tasks: 56 executed, 4 from cache

Publishing build scan...
https://scans.gradle.com/s/5mivsj5vsyqac

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1681

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1681/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8251] plumb worker_(region|zone) to Environment proto

[kcweaver] Add null checks for worker region/zone options


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 19:01:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:41195
19/12/03 19:01:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 19:01:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 19:01:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 19:01:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575399675.95', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50261', 'job_port': u'0'}
19/12/03 19:01:18 INFO statecache.__init__: Creating state cache with size 0
19/12/03 19:01:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40043.
19/12/03 19:01:18 INFO sdk_worker.__init__: Control channel established.
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 19:01:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 19:01:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36633.
19/12/03 19:01:18 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 19:01:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:43169
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 19:01:18 INFO sdk_worker.run: No more requests from control plane
19/12/03 19:01:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 19:01:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:18 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 19:01:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 19:01:18 INFO sdk_worker.run: Done consuming work.
19/12/03 19:01:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 19:01:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 19:01:19 INFO sdk_worker_main.main: Logging handler created.
19/12/03 19:01:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:33569
19/12/03 19:01:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 19:01:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 19:01:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 19:01:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575399675.95', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50261', 'job_port': u'0'}
19/12/03 19:01:19 INFO statecache.__init__: Creating state cache with size 0
19/12/03 19:01:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36449.
19/12/03 19:01:19 INFO sdk_worker.__init__: Control channel established.
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 19:01:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 19:01:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45139.
19/12/03 19:01:19 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 19:01:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:35613
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 19:01:19 INFO sdk_worker.run: No more requests from control plane
19/12/03 19:01:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 19:01:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:19 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 19:01:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 19:01:19 INFO sdk_worker.run: Done consuming work.
19/12/03 19:01:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 19:01:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 19:01:20 INFO sdk_worker_main.main: Logging handler created.
19/12/03 19:01:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:40891
19/12/03 19:01:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 19:01:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 19:01:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 19:01:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575399675.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50261', 'job_port': u'0'}
19/12/03 19:01:20 INFO statecache.__init__: Creating state cache with size 0
19/12/03 19:01:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43941.
19/12/03 19:01:20 INFO sdk_worker.__init__: Control channel established.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 19:01:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 19:01:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44587.
19/12/03 19:01:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 19:01:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:35165
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 19:01:20 INFO sdk_worker.run: No more requests from control plane
19/12/03 19:01:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 19:01:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:20 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 19:01:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 19:01:20 INFO sdk_worker.run: Done consuming work.
19/12/03 19:01:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 19:01:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 19:01:20 INFO sdk_worker_main.main: Logging handler created.
19/12/03 19:01:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:33915
19/12/03 19:01:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 19:01:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 19:01:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 19:01:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575399675.95', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50261', 'job_port': u'0'}
19/12/03 19:01:20 INFO statecache.__init__: Creating state cache with size 0
19/12/03 19:01:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36813.
19/12/03 19:01:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 19:01:20 INFO sdk_worker.__init__: Control channel established.
19/12/03 19:01:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 19:01:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44343.
19/12/03 19:01:21 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 19:01:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:43425
19/12/03 19:01:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 19:01:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 19:01:21 INFO sdk_worker.run: No more requests from control plane
19/12/03 19:01:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 19:01:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:21 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 19:01:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 19:01:21 INFO sdk_worker.run: Done consuming work.
19/12/03 19:01:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 19:01:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 19:01:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 19:01:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575399675.95_649faa8c-3a04-4926-a03b-27dd2c3e5f7c finished.
19/12/03 19:01:21 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 19:01:21 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_63263033-cbbc-4326-9f78-f6fd0030a82c","basePath":"/tmp/sparktestcPqWqB"}: {}
java.io.FileNotFoundException: /tmp/sparktestcPqWqB/job_63263033-cbbc-4326-9f78-f6fd0030a82c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 358, in wait
    delay = min(delay * 2, remaining, .05)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140172163282688)>

# Thread: <Thread(Thread-119, started daemon 140172654675712)>

# Thread: <_MainThread(MainThread, started 140173434124032)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140172146497280)>

# Thread: <Thread(Thread-125, started daemon 140172154889984)>

# Thread: <_MainThread(MainThread, started 140173434124032)>

# Thread: <Thread(Thread-119, started daemon 140172654675712)>

# Thread: <Thread(wait_until_finish_read, started daemon 140172163282688)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575399667.25_2f1240de-f72f-4d6b-84c4-b217ac653cea failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.126s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 15s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/hostbjuotr4ii

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1680

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1680/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 18:22:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:41071
19/12/03 18:22:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 18:22:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 18:22:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 18:22:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575397373.58', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35285', 'job_port': u'0'}
19/12/03 18:22:56 INFO statecache.__init__: Creating state cache with size 0
19/12/03 18:22:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46753.
19/12/03 18:22:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 18:22:56 INFO sdk_worker.__init__: Control channel established.
19/12/03 18:22:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 18:22:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34779.
19/12/03 18:22:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 18:22:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:46315
19/12/03 18:22:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 18:22:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 18:22:56 INFO sdk_worker.run: No more requests from control plane
19/12/03 18:22:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 18:22:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:56 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 18:22:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 18:22:56 INFO sdk_worker.run: Done consuming work.
19/12/03 18:22:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 18:22:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 18:22:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 18:22:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 18:22:57 INFO sdk_worker_main.main: Logging handler created.
19/12/03 18:22:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:44997
19/12/03 18:22:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 18:22:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 18:22:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 18:22:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575397373.58', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35285', 'job_port': u'0'}
19/12/03 18:22:57 INFO statecache.__init__: Creating state cache with size 0
19/12/03 18:22:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41191.
19/12/03 18:22:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 18:22:57 INFO sdk_worker.__init__: Control channel established.
19/12/03 18:22:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 18:22:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43971.
19/12/03 18:22:57 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 18:22:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:46091
19/12/03 18:22:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 18:22:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 18:22:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 18:22:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 18:22:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 18:22:58 INFO sdk_worker.run: Done consuming work.
19/12/03 18:22:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 18:22:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 18:22:58 INFO sdk_worker_main.main: Logging handler created.
19/12/03 18:22:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:37463
19/12/03 18:22:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 18:22:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 18:22:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 18:22:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575397373.58', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35285', 'job_port': u'0'}
19/12/03 18:22:58 INFO statecache.__init__: Creating state cache with size 0
19/12/03 18:22:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:32847.
19/12/03 18:22:58 INFO sdk_worker.__init__: Control channel established.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 18:22:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 18:22:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44681.
19/12/03 18:22:58 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 18:22:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:41993
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 18:22:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 18:22:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 18:22:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 18:22:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 18:22:58 INFO sdk_worker.run: Done consuming work.
19/12/03 18:22:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 18:22:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 18:22:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 18:22:59 INFO sdk_worker_main.main: Logging handler created.
19/12/03 18:22:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:42453
19/12/03 18:22:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 18:22:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 18:22:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 18:22:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575397373.58', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35285', 'job_port': u'0'}
19/12/03 18:22:59 INFO statecache.__init__: Creating state cache with size 0
19/12/03 18:22:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36699.
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 18:22:59 INFO sdk_worker.__init__: Control channel established.
19/12/03 18:22:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 18:22:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37289.
19/12/03 18:22:59 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 18:22:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:37039
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 18:22:59 INFO sdk_worker.run: No more requests from control plane
19/12/03 18:22:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 18:22:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:22:59 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 18:22:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 18:22:59 INFO sdk_worker.run: Done consuming work.
19/12/03 18:22:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 18:22:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 18:23:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 18:23:00 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575397373.58_ad87b30f-be4f-4393-94ac-6f006cce3e50 finished.
19/12/03 18:23:00 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 18:23:00 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6eee6b44-5e40-4cd6-baf0-e6a8e9ddc131","basePath":"/tmp/sparktestBjUqOr"}: {}
java.io.FileNotFoundException: /tmp/sparktestBjUqOr/job_6eee6b44-5e40-4cd6-baf0-e6a8e9ddc131/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140419824506624)>

# Thread: <Thread(Thread-120, started daemon 140419832899328)>

# Thread: <_MainThread(MainThread, started 140420621031168)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140419807196928)>

# Thread: <Thread(Thread-126, started daemon 140419815851776)>

# Thread: <_MainThread(MainThread, started 140420621031168)>

# Thread: <Thread(Thread-120, started daemon 140419832899328)>

# Thread: <Thread(wait_until_finish_read, started daemon 140419824506624)>
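The `# Thread:` entries above come from the test harness's watchdog, which on timeout prints a stack entry for every live thread before raising `BaseException` (a `BaseException` so that `except Exception` clauses in pipeline code cannot swallow it). A hedged sketch of such a watchdog, with a hypothetical helper name (the real handler lives in `portable_runner_test.py`):

```python
import signal
import sys
import threading
import traceback

def install_test_timeout(seconds):
    # Hypothetical sketch of a SIGALRM-based test watchdog (Unix only):
    # on timeout, print one "# Thread:" entry per live thread with its
    # current stack, then abort the test with BaseException.
    def handler(signum, frame):
        frames = sys._current_frames()
        for thread in threading.enumerate():
            print('# Thread: %r' % thread)
            stack = frames.get(thread.ident)
            if stack is not None:
                traceback.print_stack(stack)
        raise BaseException('Timed out after %d seconds.' % seconds)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)
```

Because the handler runs in the main thread while stdout is shared with the still-running test, its output can interleave with the traceback being printed, which is why the dumps appear spliced into the log.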

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575397362.88_94fd96ea-8d14-48fa-a799-469c421cbc90 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 341.460s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 33s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/my7zbyazlh7ms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1679

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1679/display/redirect?page=changes>

Changes:

[kamil.wasilewski] Fixed a bug where the output PCollection was assigned to self.result


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 15:50:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:43901
19/12/03 15:50:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 15:50:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 15:50:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 15:50:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575388242.88', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39685', 'job_port': u'0'}
19/12/03 15:50:45 INFO statecache.__init__: Creating state cache with size 0
19/12/03 15:50:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38113.
19/12/03 15:50:45 INFO sdk_worker.__init__: Control channel established.
19/12/03 15:50:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 15:50:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34311.
19/12/03 15:50:45 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 15:50:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:42601
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 15:50:45 INFO sdk_worker.run: No more requests from control plane
19/12/03 15:50:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 15:50:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:45 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 15:50:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 15:50:45 INFO sdk_worker.run: Done consuming work.
19/12/03 15:50:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 15:50:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 15:50:46 INFO sdk_worker_main.main: Logging handler created.
19/12/03 15:50:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:34583
19/12/03 15:50:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 15:50:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 15:50:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 15:50:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575388242.88', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39685', 'job_port': u'0'}
19/12/03 15:50:46 INFO statecache.__init__: Creating state cache with size 0
19/12/03 15:50:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44497.
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 15:50:46 INFO sdk_worker.__init__: Control channel established.
19/12/03 15:50:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 15:50:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36797.
19/12/03 15:50:46 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 15:50:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:45661
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 15:50:46 INFO sdk_worker.run: No more requests from control plane
19/12/03 15:50:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 15:50:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:46 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 15:50:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 15:50:46 INFO sdk_worker.run: Done consuming work.
19/12/03 15:50:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 15:50:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 15:50:47 INFO sdk_worker_main.main: Logging handler created.
19/12/03 15:50:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:39605
19/12/03 15:50:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 15:50:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 15:50:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 15:50:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575388242.88', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39685', 'job_port': u'0'}
19/12/03 15:50:47 INFO statecache.__init__: Creating state cache with size 0
19/12/03 15:50:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45307.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 15:50:47 INFO sdk_worker.__init__: Control channel established.
19/12/03 15:50:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 15:50:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40885.
19/12/03 15:50:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 15:50:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:37659
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 15:50:47 INFO sdk_worker.run: No more requests from control plane
19/12/03 15:50:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 15:50:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 15:50:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 15:50:47 INFO sdk_worker.run: Done consuming work.
19/12/03 15:50:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 15:50:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 15:50:47 INFO sdk_worker_main.main: Logging handler created.
19/12/03 15:50:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:32891
19/12/03 15:50:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 15:50:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 15:50:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 15:50:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575388242.88', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39685', 'job_port': u'0'}
19/12/03 15:50:47 INFO statecache.__init__: Creating state cache with size 0
19/12/03 15:50:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34941.
19/12/03 15:50:47 INFO sdk_worker.__init__: Control channel established.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 15:50:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 15:50:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42221.
19/12/03 15:50:47 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 15:50:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:41727
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 15:50:47 INFO sdk_worker.run: No more requests from control plane
19/12/03 15:50:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 15:50:47 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 15:50:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 15:50:47 INFO sdk_worker.run: Done consuming work.
19/12/03 15:50:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 15:50:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 15:50:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 15:50:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575388242.88_c278ce29-8768-480a-a1af-1a7aa80c9420 finished.
19/12/03 15:50:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 15:50:48 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d33d6692-0e6d-47b2-9393-7958c1829329","basePath":"/tmp/sparktestZzRAEU"}: {}
java.io.FileNotFoundException: /tmp/sparktestZzRAEU/job_d33d6692-0e6d-47b2-9393-7958c1829329/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140533544515328)>

# Thread: <Thread(Thread-119, started daemon 140533894498048)>

# Thread: <_MainThread(MainThread, started 140534673737472)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140533519337216)>

# Thread: <Thread(Thread-123, started daemon 140533527729920)>

# Thread: <_MainThread(MainThread, started 140534673737472)>

# Thread: <Thread(Thread-119, started daemon 140533894498048)>

# Thread: <Thread(wait_until_finish_read, started daemon 140533544515328)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575388234.1_95b07355-6f9d-40fc-a6a2-eeb51d7a9e26 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 296.757s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 29s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/zab3womupjjgc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1678

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1678/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/03 12:09:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5 on Spark master local
19/12/03 12:09:39 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/03 12:09:39 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5: Pipeline translated successfully. Computing outputs
19/12/03 12:09:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:40 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:36999
19/12/03 12:09:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:40 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41905.
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/03 12:09:40 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41869.
19/12/03 12:09:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:32883
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 12:09:40 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:40 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:41 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:41361
19/12/03 12:09:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:41 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42955.
19/12/03 12:09:41 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 12:09:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42199.
19/12/03 12:09:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:41633
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 12:09:41 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:41 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:42 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:34061
19/12/03 12:09:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:42 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44723.
19/12/03 12:09:42 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 12:09:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33791.
19/12/03 12:09:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:46877
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 12:09:42 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:42 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:43 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:40899
19/12/03 12:09:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:43 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40241.
19/12/03 12:09:43 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 12:09:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46373.
19/12/03 12:09:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:33663
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 12:09:43 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:43 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 12:09:43 INFO sdk_worker_main.main: Logging handler created.
19/12/03 12:09:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:34321
19/12/03 12:09:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 12:09:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 12:09:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 12:09:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575374977.56', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36381', 'job_port': u'0'}
19/12/03 12:09:43 INFO statecache.__init__: Creating state cache with size 0
19/12/03 12:09:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42173.
19/12/03 12:09:43 INFO sdk_worker.__init__: Control channel established.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 12:09:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 12:09:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46621.
19/12/03 12:09:43 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 12:09:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:34433
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 12:09:43 INFO sdk_worker.run: No more requests from control plane
19/12/03 12:09:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 12:09:43 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 12:09:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 12:09:43 INFO sdk_worker.run: Done consuming work.
19/12/03 12:09:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 12:09:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 12:09:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 12:09:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575374977.56_6084a658-2be8-4d03-8ebc-83558061f5a5 finished.
19/12/03 12:09:44 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 12:09:44 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_40b98026-1784-4613-a2d3-b3ccbd2e8483","basePath":"/tmp/sparktestrOXqAf"}: {}
java.io.FileNotFoundException: /tmp/sparktestrOXqAf/job_40b98026-1784-4613-a2d3-b3ccbd2e8483/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139640728188672)>
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.


# Thread: <Thread(Thread-119, started daemon 139640719795968)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <_MainThread(MainThread, started 139641864398592)>
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575374968.28_eb6dfdbb-5ea0-497d-b043-186ee848ef7b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 275.228s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 58s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/blpavca62heqy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1677

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1677/display/redirect?page=changes>

Changes:

[chadrik] [BEAM-8523] JobAPI: Give access to timestamped state change history

[chadrik] Rename GetJobStateResponse to JobStateEvent

[chadrik] Move state history utilities to AbstractBeamJob

[chadrik] Small bugfix to FlinkBeamJob job state mapping

[chadrik] Fix existing bugs in AbstractJobServiceServicer

[chadrik] Use timestamp.Timestamp instead of float


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 11:35:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:41335
19/12/03 11:35:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 11:35:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 11:35:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 11:35:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575372899.81', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36757', 'job_port': u'0'}
19/12/03 11:35:02 INFO statecache.__init__: Creating state cache with size 0
19/12/03 11:35:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45955.
19/12/03 11:35:02 INFO sdk_worker.__init__: Control channel established.
19/12/03 11:35:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 11:35:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 11:35:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41491.
19/12/03 11:35:02 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 11:35:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:42757
19/12/03 11:35:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 11:35:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 11:35:02 INFO sdk_worker.run: No more requests from control plane
19/12/03 11:35:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 11:35:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:02 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 11:35:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 11:35:02 INFO sdk_worker.run: Done consuming work.
19/12/03 11:35:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 11:35:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 11:35:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 11:35:03 INFO sdk_worker_main.main: Logging handler created.
19/12/03 11:35:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:36193
19/12/03 11:35:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 11:35:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 11:35:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 11:35:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575372899.81', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36757', 'job_port': u'0'}
19/12/03 11:35:03 INFO statecache.__init__: Creating state cache with size 0
19/12/03 11:35:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42431.
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 11:35:03 INFO sdk_worker.__init__: Control channel established.
19/12/03 11:35:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 11:35:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39075.
19/12/03 11:35:03 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 11:35:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:36375
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 11:35:03 INFO sdk_worker.run: No more requests from control plane
19/12/03 11:35:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 11:35:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:03 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 11:35:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 11:35:03 INFO sdk_worker.run: Done consuming work.
19/12/03 11:35:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 11:35:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 11:35:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 11:35:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 11:35:04 INFO sdk_worker_main.main: Logging handler created.
19/12/03 11:35:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:45601
19/12/03 11:35:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 11:35:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 11:35:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 11:35:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575372899.81', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36757', 'job_port': u'0'}
19/12/03 11:35:04 INFO statecache.__init__: Creating state cache with size 0
19/12/03 11:35:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34499.
19/12/03 11:35:04 INFO sdk_worker.__init__: Control channel established.
19/12/03 11:35:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 11:35:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 11:35:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42113.
19/12/03 11:35:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 11:35:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:45129
19/12/03 11:35:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 11:35:05 INFO sdk_worker.run: No more requests from control plane
19/12/03 11:35:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 11:35:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 11:35:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 11:35:05 INFO sdk_worker.run: Done consuming work.
19/12/03 11:35:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 11:35:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 11:35:05 INFO sdk_worker_main.main: Logging handler created.
19/12/03 11:35:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:44193
19/12/03 11:35:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 11:35:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 11:35:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 11:35:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575372899.81', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36757', 'job_port': u'0'}
19/12/03 11:35:05 INFO statecache.__init__: Creating state cache with size 0
19/12/03 11:35:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44999.
19/12/03 11:35:05 INFO sdk_worker.__init__: Control channel established.
19/12/03 11:35:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 11:35:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33289.
19/12/03 11:35:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 11:35:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:32783
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 11:35:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 11:35:06 INFO sdk_worker.run: No more requests from control plane
19/12/03 11:35:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 11:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 11:35:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 11:35:06 INFO sdk_worker.run: Done consuming work.
19/12/03 11:35:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 11:35:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 11:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 11:35:06 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575372899.81_4d5c66d7-175f-482a-966c-31a058f1590f finished.
19/12/03 11:35:06 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 11:35:06 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_3608f4ff-9bdf-4f8d-b70f-a14a36d94e4c","basePath":"/tmp/sparktestIlqvH6"}: {}
java.io.FileNotFoundException: /tmp/sparktestIlqvH6/job_3608f4ff-9bdf-4f8d-b70f-a14a36d94e4c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
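Editor's note: the FileNotFoundException above is a cleanup-path symptom rather than the test failure itself. The job staged no artifacts (see the `GetManifest for __no_artifacts_staged__` lines), so no MANIFEST file was ever written, yet the staging-directory cleanup still tries to open it. A minimal sketch of the defensive check, using hypothetical helper names rather than Beam's actual implementation:

```python
import os
import tempfile

# Hypothetical sketch: tolerate a missing MANIFEST instead of raising,
# since a job that staged no artifacts never writes one.
def remove_artifacts(staging_dir):
    manifest = os.path.join(staging_dir, "MANIFEST")
    if not os.path.exists(manifest):
        return False  # nothing was staged; nothing to remove
    os.remove(manifest)
    return True

staging = tempfile.mkdtemp(prefix="sparktest")
removed = remove_artifacts(staging)  # MANIFEST absent -> False, no exception
print(removed)  # prints "False"
```

As the WARN-level handling elsewhere in the log suggests, a missing manifest during teardown is benign and is best reported, not raised.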
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
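Editor's note: the recurring `BaseException: Timed out after 60 seconds.` comes from a watchdog in portable_runner_test.py (the `handler` frame at line 75 in the traceback above). A minimal sketch of that pattern, assuming a SIGALRM-based watchdog and using an illustrative 1-second limit in place of the suite's 60:

```python
import signal
import time

TIMEOUT_SECS = 1  # the suite uses 60; shortened here for demonstration

def handler(signum, frame):
    # Raise BaseException rather than Exception so broad
    # `except Exception` clauses in the waiting code cannot swallow it.
    raise BaseException("Timed out after %d seconds." % TIMEOUT_SECS)

signal.signal(signal.SIGALRM, handler)
signal.alarm(TIMEOUT_SECS)
try:
    time.sleep(5)  # stands in for a hung wait_until_finish()
except BaseException as exc:
    signal.alarm(0)  # cancel the watchdog
    caught = str(exc)
    print(caught)  # prints "Timed out after 1 seconds."
```

Because the alarm fires on the main thread, the exception surfaces wherever the test happens to be blocked, which is why these tracebacks bottom out inside gRPC's wait loops rather than in the test body.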

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140527394232064)>

# Thread: <Thread(Thread-116, started daemon 140527385839360)>

# Thread: <_MainThread(MainThread, started 140528173471488)>

==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140527286605568)>

# Thread: <Thread(Thread-122, started daemon 140527294998272)>

# Thread: <_MainThread(MainThread, started 140528173471488)>

# Thread: <Thread(Thread-116, started daemon 140527385839360)>

# Thread: <Thread(wait_until_finish_read, started daemon 140527394232064)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575372888.95_dadbb2de-233f-4879-aec2-1bc4dd6caa0c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 377.069s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 22s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/qppimngrpirts

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1676

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1676/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-8470] Update capability matrix: add Spark Structured Streaming

[echauchot] [BEAM-8470] Update Spark runner page: add Spark Structured Streaming


------------------------------------------
[...truncated 1.44 MB...]
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 139703307187968)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703919560448)>

# Thread: <Thread(wait_until_finish_read, started daemon 139704294000384)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703323973376)>

# Thread: <Thread(Thread-132, started daemon 139703332366080)>

# Thread: <Thread(Thread-116, started daemon 139703944738560)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703340758784)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703902775040)>

# Thread: <Thread(Thread-120, started daemon 139703927953152)>

# Thread: <Thread(Thread-128, started daemon 139703894382336)>

# Thread: <Thread(Thread-136, started daemon 139703315580672)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139702795495168)>

# Thread: <Thread(Thread-144, started daemon 139702803887872)>

# Thread: <_MainThread(MainThread, started 139705073448704)>

# Thread: <Thread(Thread-124, started daemon 139703911167744)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703307187968)>

# Thread: <Thread(Thread-136, started daemon 139703315580672)>

# Thread: <Thread(Thread-128, started daemon 139703894382336)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703340758784)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703902775040)>

# Thread: <Thread(wait_until_finish_read, started daemon 139703290402560)>

# Thread: <Thread(Thread-140, started daemon 139703298795264)>
======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
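Editor's note: every one of these hung tests bottoms out in the same `grpc/_common.py` frames. A hedged sketch of that wait-loop pattern: rather than a single unbounded `Condition.wait()`, the waiter loops over short bounded waits so Python 2 signal handlers (such as the suite's timeout watchdog) get a chance to run between slices. The names mirror the traceback; the bodies here are illustrative only, not gRPC's actual code.

```python
import threading

MAXIMUM_WAIT_TIMEOUT = 0.1  # short slice, as in the traceback's _wait_once call

def wait(wait_fn, until_done):
    # Poll in bounded slices instead of blocking forever, so the process
    # stays responsive to signals while waiting on the condition.
    while not until_done():
        wait_fn(timeout=MAXIMUM_WAIT_TIMEOUT)

cond = threading.Condition()
done = []

def finish():
    with cond:
        done.append(True)
        cond.notify()

threading.Timer(0.3, finish).start()  # wake the waiter from another thread
with cond:
    wait(cond.wait, lambda: bool(done))
print(done)  # prints "[True]"
```

When the watchdog's BaseException fires, it is raised inside one of these short `wait_fn` slices, which is exactly the frame the tracebacks above show.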

======================================================================
ERROR: test_pardo_windowed_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 181, in test_pardo_windowed_side_inputs
    label='windowed')
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_read (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 578, in test_read
    equal_to(['a', 'b', 'c']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_reshuffle (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 548, in test_reshuffle
    equal_to([1, 2, 3]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_check_done_failed (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 470, in test_sdf_with_check_done_failed
    | beam.ParDo(ExpandingStringsDoFn()))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py">, line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py">, line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

----------------------------------------------------------------------
Ran 38 tests in 692.383s

FAILED (errors=8, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 4s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/6gsu4h7uagqdy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1675

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1675/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 06:11:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:38311
19/12/03 06:11:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 06:11:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 06:11:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 06:11:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575353513.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43039', 'job_port': u'0'}
19/12/03 06:11:56 INFO statecache.__init__: Creating state cache with size 0
19/12/03 06:11:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43883.
19/12/03 06:11:56 INFO sdk_worker.__init__: Control channel established.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 06:11:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 06:11:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45281.
19/12/03 06:11:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 06:11:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:41371
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 06:11:56 INFO sdk_worker.run: No more requests from control plane
19/12/03 06:11:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 06:11:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:56 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 06:11:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 06:11:56 INFO sdk_worker.run: Done consuming work.
19/12/03 06:11:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 06:11:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 06:11:56 INFO sdk_worker_main.main: Logging handler created.
19/12/03 06:11:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:44881
19/12/03 06:11:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 06:11:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 06:11:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 06:11:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575353513.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43039', 'job_port': u'0'}
19/12/03 06:11:56 INFO statecache.__init__: Creating state cache with size 0
19/12/03 06:11:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38645.
19/12/03 06:11:56 INFO sdk_worker.__init__: Control channel established.
19/12/03 06:11:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 06:11:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35489.
19/12/03 06:11:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 06:11:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:39659
19/12/03 06:11:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 06:11:57 INFO sdk_worker.run: No more requests from control plane
19/12/03 06:11:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 06:11:57 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 06:11:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 06:11:57 INFO sdk_worker.run: Done consuming work.
19/12/03 06:11:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 06:11:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 06:11:57 INFO sdk_worker_main.main: Logging handler created.
19/12/03 06:11:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:43481
19/12/03 06:11:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 06:11:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 06:11:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 06:11:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575353513.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43039', 'job_port': u'0'}
19/12/03 06:11:57 INFO statecache.__init__: Creating state cache with size 0
19/12/03 06:11:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35193.
19/12/03 06:11:57 INFO sdk_worker.__init__: Control channel established.
19/12/03 06:11:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 06:11:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38835.
19/12/03 06:11:57 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 06:11:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:39335
19/12/03 06:11:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 06:11:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 06:11:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 06:11:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 06:11:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 06:11:58 INFO sdk_worker.run: Done consuming work.
19/12/03 06:11:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 06:11:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 06:11:58 INFO sdk_worker_main.main: Logging handler created.
19/12/03 06:11:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:35341
19/12/03 06:11:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 06:11:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 06:11:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 06:11:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575353513.64', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43039', 'job_port': u'0'}
19/12/03 06:11:58 INFO statecache.__init__: Creating state cache with size 0
19/12/03 06:11:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33683.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 06:11:58 INFO sdk_worker.__init__: Control channel established.
19/12/03 06:11:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 06:11:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45869.
19/12/03 06:11:58 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 06:11:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:34973
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 06:11:58 INFO sdk_worker.run: No more requests from control plane
19/12/03 06:11:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 06:11:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:58 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 06:11:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 06:11:58 INFO sdk_worker.run: Done consuming work.
19/12/03 06:11:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 06:11:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 06:11:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 06:11:58 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575353513.64_92d0cbe6-a7e1-4253-a1f6-6202c20bcd88 finished.
19/12/03 06:11:58 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 06:11:58 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_3783fbdc-5bf4-491e-9318-5c47b864da7a","basePath":"/tmp/sparktestuuUmOk"}: {}
java.io.FileNotFoundException: /tmp/sparktestuuUmOk/job_3783fbdc-5bf4-491e-9318-5c47b864da7a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140260562347776)>

# Thread: <Thread(Thread-120, started daemon 140260478547712)>

# Thread: <_MainThread(MainThread, started 140261349848832)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140260461762304)>

# Thread: <Thread(Thread-126, started daemon 140260470155008)>

# Thread: <_MainThread(MainThread, started 140261349848832)>

# Thread: <Thread(Thread-120, started daemon 140260478547712)>

# Thread: <Thread(wait_until_finish_read, started daemon 140260562347776)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575353504.56_adff07a2-9b27-40e5-8a87-ee0a5c63ad35 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 307.245s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 48s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ohuuxjzlyffj4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1674

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1674/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-7594] Fix flaky filename generation

[ehudm] [BEAM-8842] Disable the correct test


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 04:56:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 04:56:03 INFO sdk_worker_main.main: Logging handler created.
19/12/03 04:56:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:43485
19/12/03 04:56:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 04:56:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 04:56:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 04:56:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575348961.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34033', 'job_port': u'0'}
19/12/03 04:56:03 INFO statecache.__init__: Creating state cache with size 0
19/12/03 04:56:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46327.
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 04:56:03 INFO sdk_worker.__init__: Control channel established.
19/12/03 04:56:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 04:56:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46579.
19/12/03 04:56:03 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 04:56:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:41801
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 04:56:03 INFO sdk_worker.run: No more requests from control plane
19/12/03 04:56:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 04:56:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:03 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 04:56:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 04:56:03 INFO sdk_worker.run: Done consuming work.
19/12/03 04:56:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 04:56:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 04:56:04 INFO sdk_worker_main.main: Logging handler created.
19/12/03 04:56:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:44809
19/12/03 04:56:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 04:56:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 04:56:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 04:56:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575348961.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34033', 'job_port': u'0'}
19/12/03 04:56:04 INFO statecache.__init__: Creating state cache with size 0
19/12/03 04:56:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45041.
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 04:56:04 INFO sdk_worker.__init__: Control channel established.
19/12/03 04:56:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 04:56:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33217.
19/12/03 04:56:04 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 04:56:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:37173
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 04:56:04 INFO sdk_worker.run: No more requests from control plane
19/12/03 04:56:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 04:56:04 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 04:56:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 04:56:04 INFO sdk_worker.run: Done consuming work.
19/12/03 04:56:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 04:56:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 04:56:05 INFO sdk_worker_main.main: Logging handler created.
19/12/03 04:56:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:37807
19/12/03 04:56:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 04:56:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 04:56:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 04:56:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575348961.03', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34033', 'job_port': u'0'}
19/12/03 04:56:05 INFO statecache.__init__: Creating state cache with size 0
19/12/03 04:56:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40929.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 04:56:05 INFO sdk_worker.__init__: Control channel established.
19/12/03 04:56:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 04:56:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46855.
19/12/03 04:56:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 04:56:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:42829
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 04:56:05 INFO sdk_worker.run: No more requests from control plane
19/12/03 04:56:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 04:56:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 04:56:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 04:56:05 INFO sdk_worker.run: Done consuming work.
19/12/03 04:56:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 04:56:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 04:56:05 INFO sdk_worker_main.main: Logging handler created.
19/12/03 04:56:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:32973
19/12/03 04:56:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 04:56:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 04:56:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 04:56:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575348961.03', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34033', 'job_port': u'0'}
19/12/03 04:56:05 INFO statecache.__init__: Creating state cache with size 0
19/12/03 04:56:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40649.
19/12/03 04:56:05 INFO sdk_worker.__init__: Control channel established.
19/12/03 04:56:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 04:56:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37059.
19/12/03 04:56:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 04:56:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:33219
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 04:56:05 INFO sdk_worker.run: No more requests from control plane
19/12/03 04:56:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 04:56:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:05 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 04:56:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 04:56:05 INFO sdk_worker.run: Done consuming work.
19/12/03 04:56:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 04:56:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 04:56:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 04:56:06 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575348961.03_2a3020b5-f7d9-4a2c-90a3-99f314ad5869 finished.
19/12/03 04:56:06 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 04:56:06 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_eb85b03b-44c5-4090-861a-da8cb6305635","basePath":"/tmp/sparktestgSq5aU"}: {}
java.io.FileNotFoundException: /tmp/sparktestgSq5aU/job_eb85b03b-44c5-4090-861a-da8cb6305635/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140457482700544)>
# Thread: <Thread(Thread-118, started daemon 140457474307840)>
# Thread: <_MainThread(MainThread, started 140458261939968)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140456839083776)>
# Thread: <Thread(Thread-124, started daemon 140457456998144)>
# Thread: <_MainThread(MainThread, started 140458261939968)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575348952.38_328c2416-8400-4f35-8e58-8cea5540977e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 308.213s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 32s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/3m5z6hcuwf3pa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1673

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1673/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/03 03:38:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0 on Spark master local
19/12/03 03:38:28 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/03 03:38:28 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0: Pipeline translated successfully. Computing outputs
19/12/03 03:38:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:29 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:35273
19/12/03 03:38:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:29 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40597.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/03 03:38:29 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41231.
19/12/03 03:38:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:42959
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 03:38:29 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:29 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:29 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:44077
19/12/03 03:38:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:29 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44601.
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 03:38:29 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40855.
19/12/03 03:38:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:33653
19/12/03 03:38:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 03:38:30 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:30 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:30 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:41775
19/12/03 03:38:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:30 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46037.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 03:38:30 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45403.
19/12/03 03:38:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:44897
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 03:38:30 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:30 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:31 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:36857
19/12/03 03:38:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:31 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42177.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 03:38:31 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37347.
19/12/03 03:38:31 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:42025
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 03:38:31 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:31 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:31 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:38:32 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:38:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:36345
19/12/03 03:38:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:38:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:38:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:38:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575344307.31', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53999', 'job_port': u'0'}
19/12/03 03:38:32 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:38:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33699.
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 03:38:32 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:38:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:38:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35675.
19/12/03 03:38:32 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:38:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:33887
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 03:38:32 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:38:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:38:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:32 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:38:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:38:32 INFO sdk_worker.run: Done consuming work.
19/12/03 03:38:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:38:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:38:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:38:32 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575344307.31_0cc2673a-9ca2-4ff7-af16-607d06812be0 finished.
19/12/03 03:38:32 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 03:38:32 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_a9e1fbaa-8200-4174-9cc9-bc8efd84af98","basePath":"/tmp/sparktestKpO9GN"}: {}
java.io.FileNotFoundException: /tmp/sparktestKpO9GN/job_a9e1fbaa-8200-4174-9cc9-bc8efd84af98/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140097127180032)>

# Thread: <Thread(Thread-115, started daemon 140097110394624)>

# Thread: <_MainThread(MainThread, started 140097981904640)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575344298.04_d3c5337f-5af4-4db2-9cd2-9b62c0e077c1 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 288.880s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 20s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/slo47yhmrgdyk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1672

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1672/display/redirect?page=changes>

Changes:

[altay] Increase overhead budget for test_sampler_transition_overhead

[aaltay] [BEAM-8814] Changed no_auth option from bool to store_true (#10202)


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 03:26:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:26:20 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:26:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:40473
19/12/03 03:26:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:26:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:26:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:26:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575343578.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42045', 'job_port': u'0'}
19/12/03 03:26:20 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:26:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43083.
19/12/03 03:26:20 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 03:26:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:26:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40017.
19/12/03 03:26:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:26:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:44987
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:26:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:26:21 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:26:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:26:21 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:26:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:26:21 INFO sdk_worker.run: Done consuming work.
19/12/03 03:26:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:26:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:26:21 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:26:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:36621
19/12/03 03:26:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:26:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:26:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:26:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575343578.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42045', 'job_port': u'0'}
19/12/03 03:26:21 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:26:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41435.
19/12/03 03:26:21 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:26:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 03:26:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34969.
19/12/03 03:26:21 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:26:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:34289
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:26:21 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:26:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:26:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:21 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:26:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:26:21 INFO sdk_worker.run: Done consuming work.
19/12/03 03:26:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:26:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:26:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:26:22 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:26:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:44889
19/12/03 03:26:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:26:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:26:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:26:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575343578.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42045', 'job_port': u'0'}
19/12/03 03:26:22 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:26:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38269.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 03:26:22 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:26:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:26:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42941.
19/12/03 03:26:22 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:26:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:35953
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:26:22 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:26:22 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:26:22 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:26:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:22 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:26:22 INFO sdk_worker.run: Done consuming work.
19/12/03 03:26:22 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:26:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:26:23 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:26:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:37481
19/12/03 03:26:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:26:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:26:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:26:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575343578.26', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:42045', 'job_port': u'0'}
19/12/03 03:26:23 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:26:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44515.
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 03:26:23 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:26:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:26:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34535.
19/12/03 03:26:23 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:26:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:36537
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/03 03:26:23 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:26:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:26:23 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:26:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:26:23 INFO sdk_worker.run: Done consuming work.
19/12/03 03:26:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:26:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:26:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:26:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575343578.26_87a06f0d-ad4f-4dc3-8519-e4c9a40ef0ef finished.
19/12/03 03:26:23 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 03:26:23 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_884a85e9-a9a2-4444-a033-ac3d45acd999","basePath":"/tmp/sparktestji0bvb"}: {}
java.io.FileNotFoundException: /tmp/sparktestji0bvb/job_884a85e9-a9a2-4444-a033-ac3d45acd999/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140416640874240)>

# Thread: <Thread(Thread-119, started daemon 140416988854016)>

# Thread: <_MainThread(MainThread, started 140417768093440)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140416624088832)>

# Thread: <Thread(Thread-125, started daemon 140416615696128)>

# Thread: <_MainThread(MainThread, started 140417768093440)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575343568.19_6cdcb876-5d7e-4311-9d5e-3b811cbea544 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 297.314s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 47s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/bgkbtbm5mdjic

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1671

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1671/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8863] experiment=beam_fn_api in runtime/environments page


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 03:09:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:09:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:09:27 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:09:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:35947
19/12/03 03:09:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:09:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:09:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:09:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575342565.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37941', 'job_port': u'0'}
19/12/03 03:09:27 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:09:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34729.
19/12/03 03:09:27 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:09:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 03:09:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:09:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45231.
19/12/03 03:09:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:09:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:38925
19/12/03 03:09:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 03:09:28 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:09:28 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:09:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:28 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:09:28 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:09:28 INFO sdk_worker.run: Done consuming work.
19/12/03 03:09:28 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:09:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:09:28 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:09:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:35263
19/12/03 03:09:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:09:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:09:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:09:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575342565.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37941', 'job_port': u'0'}
19/12/03 03:09:28 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:09:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42421.
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 03:09:28 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:09:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:09:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34723.
19/12/03 03:09:28 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:09:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:46359
19/12/03 03:09:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 03:09:29 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:09:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:09:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:09:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:09:29 INFO sdk_worker.run: Done consuming work.
19/12/03 03:09:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:09:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:09:29 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:09:29 INFO sdk_worker_main.start: Status HTTP server running at localhost:45717
19/12/03 03:09:29 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:09:29 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:09:29 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:09:29 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575342565.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37941', 'job_port': u'0'}
19/12/03 03:09:29 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:09:29 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39369.
19/12/03 03:09:29 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:09:29 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 03:09:29 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45855.
19/12/03 03:09:29 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:09:29 INFO data_plane.create_data_channel: Creating client data channel for localhost:42483
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 03:09:29 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:09:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:09:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:29 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:09:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:09:29 INFO sdk_worker.run: Done consuming work.
19/12/03 03:09:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:09:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:09:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 03:09:30 INFO sdk_worker_main.main: Logging handler created.
19/12/03 03:09:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:33103
19/12/03 03:09:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 03:09:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 03:09:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 03:09:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575342565.27', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37941', 'job_port': u'0'}
19/12/03 03:09:30 INFO statecache.__init__: Creating state cache with size 0
19/12/03 03:09:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46477.
19/12/03 03:09:30 INFO sdk_worker.__init__: Control channel established.
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 03:09:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 03:09:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36689.
19/12/03 03:09:30 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 03:09:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:42155
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 03:09:30 INFO sdk_worker.run: No more requests from control plane
19/12/03 03:09:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 03:09:30 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 03:09:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 03:09:30 INFO sdk_worker.run: Done consuming work.
19/12/03 03:09:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 03:09:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 03:09:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 03:09:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575342565.27_f8ea50c7-0c3f-40e5-8dc8-ceb78d4d89d7 finished.
19/12/03 03:09:31 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 03:09:31 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_982fecfb-14df-4eb0-b936-52f71a96ac17","basePath":"/tmp/sparktestZwXDPf"}: {}
java.io.FileNotFoundException: /tmp/sparktestZwXDPf/job_982fecfb-14df-4eb0-b936-52f71a96ac17/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_assert_that (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 101, in test_assert_that
    assert_that(p | beam.Create(['a', 'b']), equal_to(['a']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140664983860992)>
# Thread: <Thread(Thread-5, started daemon 140664992253696)>
# Thread: <_MainThread(MainThread, started 140665773664000)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140664983860992)>
# Thread: <Thread(Thread-115, started daemon 140664967075584)>
# Thread: <_MainThread(MainThread, started 140665773664000)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575342553.69_6b0bf8de-7c5e-4a82-96ea-7a2097f23dff failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 385.950s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ph6yewtsraoay

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1670

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1670/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-2929] Ensure that the Beam Java SDK sends the property

[lcwik] [BEAM-2929] Ensure that the Beam Go SDK sends the property

[lcwik] [BEAM-2929] Ensure that the Beam Python SDK sends the property

[lostluck] [BEAM-2929] Fix go code format for


------------------------------------------
[...truncated 1.32 MB...]
19/12/03 00:11:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:33599
19/12/03 00:11:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 00:11:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 00:11:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 00:11:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575331864.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43113', 'job_port': u'0'}
19/12/03 00:11:06 INFO statecache.__init__: Creating state cache with size 0
19/12/03 00:11:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37981.
19/12/03 00:11:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/03 00:11:06 INFO sdk_worker.__init__: Control channel established.
19/12/03 00:11:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 00:11:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46153.
19/12/03 00:11:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 00:11:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:44163
19/12/03 00:11:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 00:11:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 00:11:06 INFO sdk_worker.run: No more requests from control plane
19/12/03 00:11:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 00:11:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 00:11:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 00:11:06 INFO sdk_worker.run: Done consuming work.
19/12/03 00:11:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 00:11:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 00:11:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 00:11:07 INFO sdk_worker_main.main: Logging handler created.
19/12/03 00:11:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:35101
19/12/03 00:11:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 00:11:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 00:11:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 00:11:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575331864.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43113', 'job_port': u'0'}
19/12/03 00:11:07 INFO statecache.__init__: Creating state cache with size 0
19/12/03 00:11:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37429.
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/03 00:11:07 INFO sdk_worker.__init__: Control channel established.
19/12/03 00:11:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 00:11:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43551.
19/12/03 00:11:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 00:11:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:34943
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/03 00:11:07 INFO sdk_worker.run: No more requests from control plane
19/12/03 00:11:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 00:11:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 00:11:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 00:11:07 INFO sdk_worker.run: Done consuming work.
19/12/03 00:11:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 00:11:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 00:11:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 00:11:08 INFO sdk_worker_main.main: Logging handler created.
19/12/03 00:11:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:38241
19/12/03 00:11:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 00:11:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 00:11:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 00:11:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575331864.06', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43113', 'job_port': u'0'}
19/12/03 00:11:08 INFO statecache.__init__: Creating state cache with size 0
19/12/03 00:11:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43349.
19/12/03 00:11:08 INFO sdk_worker.__init__: Control channel established.
19/12/03 00:11:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/03 00:11:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40377.
19/12/03 00:11:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 00:11:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:37203
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 00:11:08 INFO sdk_worker.run: No more requests from control plane
19/12/03 00:11:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 00:11:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 00:11:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 00:11:08 INFO sdk_worker.run: Done consuming work.
19/12/03 00:11:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 00:11:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 00:11:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/03 00:11:09 INFO sdk_worker_main.main: Logging handler created.
19/12/03 00:11:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:43493
19/12/03 00:11:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/03 00:11:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/03 00:11:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/03 00:11:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575331864.06', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43113', 'job_port': u'0'}
19/12/03 00:11:09 INFO statecache.__init__: Creating state cache with size 0
19/12/03 00:11:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46075.
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/03 00:11:09 INFO sdk_worker.__init__: Control channel established.
19/12/03 00:11:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/03 00:11:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37617.
19/12/03 00:11:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/03 00:11:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:41875
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/03 00:11:09 INFO sdk_worker.run: No more requests from control plane
19/12/03 00:11:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/03 00:11:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/03 00:11:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/03 00:11:09 INFO sdk_worker.run: Done consuming work.
19/12/03 00:11:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/03 00:11:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/03 00:11:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/03 00:11:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575331864.06_31c3dda5-0f98-4a2f-a9c6-8be47e52c740 finished.
19/12/03 00:11:09 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/03 00:11:09 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_27aee76c-7c13-49eb-9c1f-4c2dde96b503","basePath":"/tmp/sparktestmA8Uhd"}: {}
java.io.FileNotFoundException: /tmp/sparktestmA8Uhd/job_27aee76c-7c13-49eb-9c1f-4c2dde96b503/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140191893288704)>

# Thread: <Thread(Thread-119, started daemon 140191978215168)>

# Thread: <_MainThread(MainThread, started 140192757323520)>
==================== Timed out after 60 seconds. ====================

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 140191876241152)>
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)

# Thread: <Thread(Thread-125, started daemon 140191884896000)>

# Thread: <Thread(Thread-119, started daemon 140191978215168)>

# Thread: <_MainThread(MainThread, started 140192757323520)>

# Thread: <Thread(wait_until_finish_read, started daemon 140191893288704)>
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575331854.27_c7750d1d-f9f2-4853-8d7e-0a77302ac586 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 320.375s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 1s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/fq5drf6ofabrq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
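Editor's note on the repeated "Timed out after 60 seconds." failures above: the traceback ends in `portable_runner_test.py`, line 75, where a handler raises `BaseException(msg)`, and the `# Thread: ...` lines are live-thread dumps printed at timeout. A minimal sketch of that signal-alarm watchdog pattern, with hypothetical helper names (not the actual apache_beam test helpers), looks like this:

```python
import signal
import threading

def install_watchdog(timeout=60):
    """Arm a SIGALRM-based watchdog (hypothetical helper, Unix main thread only)."""
    def handler(signum, frame):
        # Dump every live thread, as in the log's "# Thread:" lines.
        for t in threading.enumerate():
            print('# Thread: %r' % t)
        # BaseException (not Exception) so broad `except Exception` blocks
        # in pipeline code cannot swallow the timeout.
        raise BaseException('Timed out after %d seconds.' % timeout)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(timeout)

def cancel_watchdog():
    """Disarm the pending alarm (hypothetical helper)."""
    signal.alarm(0)
```

Under this pattern a hung gRPC wait such as `wait_until_finish` is interrupted after the deadline instead of blocking the test run indefinitely.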


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1669

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1669/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-4132] Set multi-output PCollections types to Any

[migryz] Bump Release Build Timeout

[migryz] fix syntax

[github] Bump time to 5 hours.

[robertwb] [BEAM-8645] A test case for TimestampCombiner. (#10081)


------------------------------------------
[...truncated 1.32 MB...]
19/12/02 23:07:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 23:07:39 INFO sdk_worker_main.main: Logging handler created.
19/12/02 23:07:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:35671
19/12/02 23:07:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 23:07:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 23:07:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 23:07:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575328056.3', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52653', 'job_port': u'0'}
19/12/02 23:07:39 INFO statecache.__init__: Creating state cache with size 0
19/12/02 23:07:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34507.
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 23:07:39 INFO sdk_worker.__init__: Control channel established.
19/12/02 23:07:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 23:07:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45819.
19/12/02 23:07:39 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 23:07:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:45147
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 23:07:39 INFO sdk_worker.run: No more requests from control plane
19/12/02 23:07:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 23:07:39 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 23:07:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 23:07:39 INFO sdk_worker.run: Done consuming work.
19/12/02 23:07:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 23:07:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 23:07:40 INFO sdk_worker_main.main: Logging handler created.
19/12/02 23:07:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:37879
19/12/02 23:07:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 23:07:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 23:07:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 23:07:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575328056.3', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52653', 'job_port': u'0'}
19/12/02 23:07:40 INFO statecache.__init__: Creating state cache with size 0
19/12/02 23:07:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42563.
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 23:07:40 INFO sdk_worker.__init__: Control channel established.
19/12/02 23:07:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 23:07:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40577.
19/12/02 23:07:40 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 23:07:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:36769
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 23:07:40 INFO sdk_worker.run: No more requests from control plane
19/12/02 23:07:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 23:07:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:40 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 23:07:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 23:07:40 INFO sdk_worker.run: Done consuming work.
19/12/02 23:07:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 23:07:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 23:07:41 INFO sdk_worker_main.main: Logging handler created.
19/12/02 23:07:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:45151
19/12/02 23:07:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 23:07:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 23:07:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 23:07:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575328056.3', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52653', 'job_port': u'0'}
19/12/02 23:07:41 INFO statecache.__init__: Creating state cache with size 0
19/12/02 23:07:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38761.
19/12/02 23:07:41 INFO sdk_worker.__init__: Control channel established.
19/12/02 23:07:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 23:07:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40951.
19/12/02 23:07:41 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 23:07:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:39553
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 23:07:41 INFO sdk_worker.run: No more requests from control plane
19/12/02 23:07:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 23:07:41 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 23:07:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 23:07:41 INFO sdk_worker.run: Done consuming work.
19/12/02 23:07:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 23:07:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 23:07:42 INFO sdk_worker_main.main: Logging handler created.
19/12/02 23:07:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:43565
19/12/02 23:07:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 23:07:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 23:07:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 23:07:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575328056.3', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52653', 'job_port': u'0'}
19/12/02 23:07:42 INFO statecache.__init__: Creating state cache with size 0
19/12/02 23:07:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33313.
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 23:07:42 INFO sdk_worker.__init__: Control channel established.
19/12/02 23:07:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 23:07:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37805.
19/12/02 23:07:42 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 23:07:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:44857
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 23:07:42 INFO sdk_worker.run: No more requests from control plane
19/12/02 23:07:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 23:07:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:42 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 23:07:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 23:07:42 INFO sdk_worker.run: Done consuming work.
19/12/02 23:07:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 23:07:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 23:07:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 23:07:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575328056.3_afcf8989-ddf5-44c2-9134-a79fd6e0b154 finished.
19/12/02 23:07:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 23:07:42 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_4244f732-b366-4b56-bb49-1a8d2662e9b2","basePath":"/tmp/sparktestBf4Pjn"}: {}
java.io.FileNotFoundException: /tmp/sparktestBf4Pjn/job_4244f732-b366-4b56-bb49-1a8d2662e9b2/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140069438170880)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(Thread-118, started daemon 140069352503040)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
# Thread: <_MainThread(MainThread, started 140070217279232)>
==================== Timed out after 60 seconds. ====================

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 140069344110336)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-124, started daemon 140069335717632)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <_MainThread(MainThread, started 140070217279232)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575328045.52_bcf643e3-9a9c-4eaa-a471-f4b6e7dc7b11 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
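The `UnsupportedOperationException` above means a splittable DoFn tried to checkpoint mid-bundle, but the Spark runner never registered a handler for the residual work. A toy model of that guard — class and method names here are hypothetical Python stand-ins, not Beam's Java API:

```python
class ActiveBundle(object):
    """Toy model of a running bundle that may receive a checkpoint
    (residual restriction) from a splittable DoFn."""

    def __init__(self, checkpoint_handler=None):
        self._checkpoint_handler = checkpoint_handler

    def on_checkpoint(self, residual):
        # Mirrors the failure in the log: without a registered handler,
        # a mid-bundle checkpoint has nowhere to hand its residual work.
        if self._checkpoint_handler is None:
            raise NotImplementedError(
                "The ActiveBundle does not have a registered bundle "
                "checkpoint handler.")
        self._checkpoint_handler(residual)
```

A runner that supports SDF checkpointing would construct the bundle with a handler that schedules the residual for later processing; without one, tests exercising watermark tracking fail exactly as shown.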

----------------------------------------------------------------------
Ran 38 tests in 305.006s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 17s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/24s5jlkvigxvk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1668

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1668/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-8733]  Handle the registration request synchronously in the Python


------------------------------------------
[...truncated 1.32 MB...]
19/12/02 19:52:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:44005
19/12/02 19:52:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 19:52:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 19:52:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 19:52:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575316323.07', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46905', 'job_port': u'0'}
19/12/02 19:52:06 INFO statecache.__init__: Creating state cache with size 0
19/12/02 19:52:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40443.
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 19:52:06 INFO sdk_worker.__init__: Control channel established.
19/12/02 19:52:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 19:52:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39951.
19/12/02 19:52:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 19:52:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:42195
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 19:52:06 INFO sdk_worker.run: No more requests from control plane
19/12/02 19:52:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 19:52:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 19:52:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 19:52:06 INFO sdk_worker.run: Done consuming work.
19/12/02 19:52:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 19:52:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 19:52:07 INFO sdk_worker_main.main: Logging handler created.
19/12/02 19:52:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:43457
19/12/02 19:52:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 19:52:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 19:52:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 19:52:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575316323.07', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46905', 'job_port': u'0'}
19/12/02 19:52:07 INFO statecache.__init__: Creating state cache with size 0
19/12/02 19:52:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41305.
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 19:52:07 INFO sdk_worker.__init__: Control channel established.
19/12/02 19:52:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 19:52:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41275.
19/12/02 19:52:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 19:52:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:33861
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 19:52:07 INFO sdk_worker.run: No more requests from control plane
19/12/02 19:52:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 19:52:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 19:52:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 19:52:07 INFO sdk_worker.run: Done consuming work.
19/12/02 19:52:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 19:52:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 19:52:08 INFO sdk_worker_main.main: Logging handler created.
19/12/02 19:52:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:37131
19/12/02 19:52:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 19:52:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 19:52:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 19:52:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575316323.07', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46905', 'job_port': u'0'}
19/12/02 19:52:08 INFO statecache.__init__: Creating state cache with size 0
19/12/02 19:52:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46441.
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 19:52:08 INFO sdk_worker.__init__: Control channel established.
19/12/02 19:52:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 19:52:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36415.
19/12/02 19:52:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 19:52:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:39377
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 19:52:08 INFO sdk_worker.run: No more requests from control plane
19/12/02 19:52:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 19:52:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 19:52:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 19:52:08 INFO sdk_worker.run: Done consuming work.
19/12/02 19:52:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 19:52:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 19:52:09 INFO sdk_worker_main.main: Logging handler created.
19/12/02 19:52:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:44675
19/12/02 19:52:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 19:52:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 19:52:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 19:52:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575316323.07', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46905', 'job_port': u'0'}
19/12/02 19:52:09 INFO statecache.__init__: Creating state cache with size 0
19/12/02 19:52:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43365.
19/12/02 19:52:09 INFO sdk_worker.__init__: Control channel established.
19/12/02 19:52:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 19:52:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32889.
19/12/02 19:52:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 19:52:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:45293
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 19:52:09 INFO sdk_worker.run: No more requests from control plane
19/12/02 19:52:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 19:52:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 19:52:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 19:52:09 INFO sdk_worker.run: Done consuming work.
19/12/02 19:52:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 19:52:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 19:52:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 19:52:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575316323.07_6046d00c-fff4-4db8-b431-caf957a07f07 finished.
19/12/02 19:52:09 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 19:52:09 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_18efc3f0-0527-47af-bcfd-e4553086af64","basePath":"/tmp/sparktestzSz0RX"}: {}
java.io.FileNotFoundException: /tmp/sparktestzSz0RX/job_18efc3f0-0527-47af-bcfd-e4553086af64/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(wait_until_finish_read, started daemon 140353306609408)>
# Thread: <Thread(Thread-116, started daemon 140353323394816)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
# Thread: <_MainThread(MainThread, started 140354102503168)>
==================== Timed out after 60 seconds. ====================

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 140352811951872)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-122, started daemon 140352820344576)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <_MainThread(MainThread, started 140354102503168)>

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
# Thread: <Thread(wait_until_finish_read, started daemon 140353306609408)>

BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-116, started daemon 140353323394816)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575316310.32_8df681ce-4d54-4250-90ec-a1e9e39e5651 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 351.412s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/iwdvlnwg6euhk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1667

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1667/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/02 18:14:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41 on Spark master local
19/12/02 18:14:05 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/02 18:14:05 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41: Pipeline translated successfully. Computing outputs
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:05 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:43165
19/12/02 18:14:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:05 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35021.
19/12/02 18:14:05 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/02 18:14:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42059.
19/12/02 18:14:05 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:46473
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 18:14:06 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:06 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:06 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:41317
19/12/02 18:14:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:06 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38915.
19/12/02 18:14:06 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 18:14:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38099.
19/12/02 18:14:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:45031
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 18:14:06 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:06 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:07 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:44829
19/12/02 18:14:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:07 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36647.
19/12/02 18:14:07 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 18:14:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43441.
19/12/02 18:14:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:43043
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 18:14:07 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:07 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:08 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:35317
19/12/02 18:14:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:08 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42263.
19/12/02 18:14:08 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 18:14:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40303.
19/12/02 18:14:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:43231
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 18:14:08 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:08 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 18:14:09 INFO sdk_worker_main.main: Logging handler created.
19/12/02 18:14:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:36029
19/12/02 18:14:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 18:14:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 18:14:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 18:14:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575310444.26', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:53973', 'job_port': u'0'}
19/12/02 18:14:09 INFO statecache.__init__: Creating state cache with size 0
19/12/02 18:14:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41519.
19/12/02 18:14:09 INFO sdk_worker.__init__: Control channel established.
19/12/02 18:14:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 18:14:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38267.
19/12/02 18:14:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 18:14:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:46233
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 18:14:09 INFO sdk_worker.run: No more requests from control plane
19/12/02 18:14:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 18:14:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 18:14:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 18:14:09 INFO sdk_worker.run: Done consuming work.
19/12/02 18:14:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 18:14:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 18:14:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 18:14:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575310444.26_4dc2b39c-59c1-44dd-8e8c-c812a503ff41 finished.
19/12/02 18:14:09 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 18:14:09 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_252fe288-178c-490a-beb1-a694f350c984","basePath":"/tmp/sparktest6W8AMZ"}: {}
java.io.FileNotFoundException: /tmp/sparktest6W8AMZ/job_252fe288-178c-490a-beb1-a694f350c984/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
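The FileNotFoundException above is a benign cleanup race: the job staged no artifacts (`__no_artifacts_staged__`), so no MANIFEST was ever written, yet `removeArtifacts` unconditionally tries to open it. A minimal Python sketch of a more tolerant cleanup (the function name and directory layout are illustrative, not Beam's actual API):

```python
import os
import shutil


def remove_staging_dir(base_path, session_id):
    """Remove a job's staging directory, tolerating a missing MANIFEST."""
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, 'MANIFEST')
    if os.path.exists(manifest):
        # A real implementation would first delete every artifact
        # location listed in the manifest before removing the directory.
        pass
    # Remove the directory whether or not anything was staged;
    # ignore_errors makes repeated or racing cleanup calls harmless.
    shutil.rmtree(job_dir, ignore_errors=True)
```

With a guard like this, a job that staged nothing would log no ERROR on teardown.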
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140077219415808)>
# Thread: <Thread(Thread-119, started daemon 140076734478080)>
# Thread: <_MainThread(MainThread, started 140078014588672)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575310435.37_3fb6ed47-5828-4d8b-beee-03b3d2f9939b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 287.483s

FAILED (errors=2, skipped=9)
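For reference, the "Timed out after 60 seconds" failure above is produced by a watchdog in portable_runner_test.py (the `handler` frame in the traceback) that fires on an alarm signal, dumps live threads, and raises BaseException so the hung gRPC read unwinds. A rough self-contained sketch of that pattern (timeout shortened; function names are illustrative; POSIX main thread only):

```python
import signal
import threading
import time

TIMEOUT_SECS = 0.2  # the real suite uses 60 seconds


def handler(signum, frame):
    # Announce the timeout and dump live threads, as seen in the log,
    # then raise so the blocked call unwinds with a traceback.
    msg = 'Timed out after %s seconds.' % TIMEOUT_SECS
    print('=' * 20, msg, '=' * 20)
    for t in threading.enumerate():
        print('# Thread: %s' % t)
    raise BaseException(msg)


def run_with_watchdog(blocking_call):
    signal.signal(signal.SIGALRM, handler)
    signal.setitimer(signal.ITIMER_REAL, TIMEOUT_SECS)
    try:
        blocking_call()  # stands in for iterating the gRPC state stream
    finally:
        signal.setitimer(signal.ITIMER_REAL, 0)  # cancel the alarm
```

BaseException (rather than Exception) is used so nothing in the pipeline code can swallow the timeout.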

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 16s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/jfecorx56tfr6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1666

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1666/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/02 12:19:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 12:19:07 INFO sdk_worker_main.main: Logging handler created.
19/12/02 12:19:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:39717
19/12/02 12:19:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 12:19:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 12:19:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 12:19:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575289144.78', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46617', 'job_port': u'0'}
19/12/02 12:19:07 INFO statecache.__init__: Creating state cache with size 0
19/12/02 12:19:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39003.
19/12/02 12:19:07 INFO sdk_worker.__init__: Control channel established.
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 12:19:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 12:19:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35451.
19/12/02 12:19:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 12:19:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:41007
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 12:19:07 INFO sdk_worker.run: No more requests from control plane
19/12/02 12:19:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 12:19:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 12:19:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 12:19:07 INFO sdk_worker.run: Done consuming work.
19/12/02 12:19:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 12:19:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 12:19:08 INFO sdk_worker_main.main: Logging handler created.
19/12/02 12:19:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:34331
19/12/02 12:19:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 12:19:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 12:19:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 12:19:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575289144.78', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46617', 'job_port': u'0'}
19/12/02 12:19:08 INFO statecache.__init__: Creating state cache with size 0
19/12/02 12:19:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46437.
19/12/02 12:19:08 INFO sdk_worker.__init__: Control channel established.
19/12/02 12:19:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 12:19:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34887.
19/12/02 12:19:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 12:19:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:34529
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 12:19:08 INFO sdk_worker.run: No more requests from control plane
19/12/02 12:19:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 12:19:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 12:19:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 12:19:08 INFO sdk_worker.run: Done consuming work.
19/12/02 12:19:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 12:19:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 12:19:09 INFO sdk_worker_main.main: Logging handler created.
19/12/02 12:19:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:35705
19/12/02 12:19:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 12:19:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 12:19:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 12:19:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575289144.78', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46617', 'job_port': u'0'}
19/12/02 12:19:09 INFO statecache.__init__: Creating state cache with size 0
19/12/02 12:19:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43033.
19/12/02 12:19:09 INFO sdk_worker.__init__: Control channel established.
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 12:19:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 12:19:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35445.
19/12/02 12:19:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 12:19:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:34099
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/02 12:19:09 INFO sdk_worker.run: No more requests from control plane
19/12/02 12:19:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 12:19:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 12:19:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 12:19:09 INFO sdk_worker.run: Done consuming work.
19/12/02 12:19:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 12:19:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 12:19:10 INFO sdk_worker_main.main: Logging handler created.
19/12/02 12:19:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:36901
19/12/02 12:19:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 12:19:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 12:19:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 12:19:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575289144.78', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46617', 'job_port': u'0'}
19/12/02 12:19:10 INFO statecache.__init__: Creating state cache with size 0
19/12/02 12:19:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42677.
19/12/02 12:19:10 INFO sdk_worker.__init__: Control channel established.
19/12/02 12:19:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 12:19:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36879.
19/12/02 12:19:10 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 12:19:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:46811
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/02 12:19:10 INFO sdk_worker.run: No more requests from control plane
19/12/02 12:19:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 12:19:10 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 12:19:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 12:19:10 INFO sdk_worker.run: Done consuming work.
19/12/02 12:19:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 12:19:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 12:19:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 12:19:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575289144.78_5e824f44-7a7a-4970-bc99-2d727a3a3c55 finished.
19/12/02 12:19:10 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 12:19:10 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f8f93c73-88df-4615-9bad-5b744ab55167","basePath":"/tmp/sparktestmQZUfn"}: {}
java.io.FileNotFoundException: /tmp/sparktestmQZUfn/job_f8f93c73-88df-4615-9bad-5b744ab55167/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140444742702848)>
# Thread: <Thread(Thread-120, started daemon 140444759488256)>
# Thread: <_MainThread(MainThread, started 140445881960192)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140444725917440)>
# Thread: <Thread(Thread-124, started daemon 140444734310144)>
# Thread: <_MainThread(MainThread, started 140445881960192)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575289135.1_d1b63aed-344e-494c-8d90-17e859a01314 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 306.917s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 39s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/anjtpjlrjvdzy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1665

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1665/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/02 06:17:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:41719
19/12/02 06:17:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 06:17:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 06:17:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 06:17:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575267442.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58661', 'job_port': u'0'}
19/12/02 06:17:24 INFO statecache.__init__: Creating state cache with size 0
19/12/02 06:17:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38767.
19/12/02 06:17:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 06:17:24 INFO sdk_worker.__init__: Control channel established.
19/12/02 06:17:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 06:17:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42597.
19/12/02 06:17:24 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 06:17:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:34629
19/12/02 06:17:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 06:17:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/02 06:17:24 INFO sdk_worker.run: No more requests from control plane
19/12/02 06:17:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 06:17:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:24 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 06:17:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 06:17:24 INFO sdk_worker.run: Done consuming work.
19/12/02 06:17:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 06:17:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 06:17:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 06:17:25 INFO sdk_worker_main.main: Logging handler created.
19/12/02 06:17:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:43851
19/12/02 06:17:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 06:17:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 06:17:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 06:17:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575267442.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58661', 'job_port': u'0'}
19/12/02 06:17:25 INFO statecache.__init__: Creating state cache with size 0
19/12/02 06:17:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42927.
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 06:17:25 INFO sdk_worker.__init__: Control channel established.
19/12/02 06:17:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 06:17:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41097.
19/12/02 06:17:25 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 06:17:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:43859
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/02 06:17:25 INFO sdk_worker.run: No more requests from control plane
19/12/02 06:17:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 06:17:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:25 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 06:17:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 06:17:25 INFO sdk_worker.run: Done consuming work.
19/12/02 06:17:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 06:17:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 06:17:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 06:17:26 INFO sdk_worker_main.main: Logging handler created.
19/12/02 06:17:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:36345
19/12/02 06:17:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 06:17:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 06:17:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 06:17:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575267442.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58661', 'job_port': u'0'}
19/12/02 06:17:26 INFO statecache.__init__: Creating state cache with size 0
19/12/02 06:17:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35421.
19/12/02 06:17:26 INFO sdk_worker.__init__: Control channel established.
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 06:17:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 06:17:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40135.
19/12/02 06:17:26 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 06:17:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:46839
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/02 06:17:26 INFO sdk_worker.run: No more requests from control plane
19/12/02 06:17:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 06:17:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:26 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 06:17:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 06:17:26 INFO sdk_worker.run: Done consuming work.
19/12/02 06:17:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 06:17:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 06:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 06:17:27 INFO sdk_worker_main.main: Logging handler created.
19/12/02 06:17:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:40109
19/12/02 06:17:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 06:17:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 06:17:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 06:17:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575267442.11', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58661', 'job_port': u'0'}
19/12/02 06:17:27 INFO statecache.__init__: Creating state cache with size 0
19/12/02 06:17:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34953.
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 06:17:27 INFO sdk_worker.__init__: Control channel established.
19/12/02 06:17:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 06:17:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42439.
19/12/02 06:17:27 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 06:17:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:38581
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/02 06:17:27 INFO sdk_worker.run: No more requests from control plane
19/12/02 06:17:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 06:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:27 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 06:17:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 06:17:27 INFO sdk_worker.run: Done consuming work.
19/12/02 06:17:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 06:17:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 06:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 06:17:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575267442.11_a5cf41f0-f4da-4292-b4f4-f25c8fe90247 finished.
19/12/02 06:17:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 06:17:27 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6e65d1a0-ce03-4b48-a0c3-305442a24622","basePath":"/tmp/sparktest3OMUrJ"}: {}
java.io.FileNotFoundException: /tmp/sparktest3OMUrJ/job_6e65d1a0-ce03-4b48-a0c3-305442a24622/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140651100137216)>
# Thread: <Thread(Thread-120, started daemon 140651108529920)>
# Thread: <_MainThread(MainThread, started 140651887638272)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140651073910528)>
# Thread: <Thread(Thread-126, started daemon 140651082565376)>
# Thread: <_MainThread(MainThread, started 140651887638272)>
# Thread: <Thread(Thread-120, started daemon 140651108529920)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
# Thread: <Thread(wait_until_finish_read, started daemon 140651100137216)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575267431.96_c05e4f36-efb4-49d7-ac00-e4c0dbcc17e4 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 319.185s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 16s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://scans.gradle.com/s/z33tgmiq5duhc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1664

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1664/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/02 00:13:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934 on Spark master local
19/12/02 00:13:10 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/02 00:13:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934: Pipeline translated successfully. Computing outputs
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:10 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:45553
19/12/02 00:13:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:10 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34887.
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/02 00:13:10 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43647.
19/12/02 00:13:10 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:43265
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 00:13:10 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:10 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:10 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:11 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:36639
19/12/02 00:13:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:11 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39109.
19/12/02 00:13:11 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/02 00:13:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39153.
19/12/02 00:13:11 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:35023
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 00:13:11 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:11 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:11 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:12 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:12 INFO sdk_worker_main.start: Status HTTP server running at localhost:43743
19/12/02 00:13:12 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:12 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:12 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:12 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:12 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:12 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39141.
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/02 00:13:12 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:12 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:12 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40089.
19/12/02 00:13:12 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:12 INFO data_plane.create_data_channel: Creating client data channel for localhost:44481
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 00:13:12 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:12 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:12 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:12 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:12 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:12 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:13 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:13 INFO sdk_worker_main.start: Status HTTP server running at localhost:41215
19/12/02 00:13:13 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:13 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41823.
19/12/02 00:13:13 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/02 00:13:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38499.
19/12/02 00:13:13 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:45021
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 00:13:13 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:13 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:13 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:13 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:13 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:13 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:13 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/02 00:13:14 INFO sdk_worker_main.main: Logging handler created.
19/12/02 00:13:14 INFO sdk_worker_main.start: Status HTTP server running at localhost:42905
19/12/02 00:13:14 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/02 00:13:14 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/02 00:13:14 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/02 00:13:14 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575245589.18', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:39877', 'job_port': u'0'}
19/12/02 00:13:14 INFO statecache.__init__: Creating state cache with size 0
19/12/02 00:13:14 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38583.
19/12/02 00:13:14 INFO sdk_worker.__init__: Control channel established.
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/02 00:13:14 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/02 00:13:14 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43555.
19/12/02 00:13:14 INFO sdk_worker.create_state_handler: State channel established.
19/12/02 00:13:14 INFO data_plane.create_data_channel: Creating client data channel for localhost:38667
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/02 00:13:14 INFO sdk_worker.run: No more requests from control plane
19/12/02 00:13:14 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/02 00:13:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:14 INFO data_plane.close: Closing all cached grpc data channels.
19/12/02 00:13:14 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/02 00:13:14 INFO sdk_worker.run: Done consuming work.
19/12/02 00:13:14 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/02 00:13:14 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/02 00:13:14 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/02 00:13:14 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575245589.18_a19de7f5-6f55-4ac5-a608-4c5045401934 finished.
19/12/02 00:13:14 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/02 00:13:14 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_523424ac-49a3-409c-89d4-dc39c8986a5c","basePath":"/tmp/sparktestmn1d0_"}: {}
java.io.FileNotFoundException: /tmp/sparktestmn1d0_/job_523424ac-49a3-409c-89d4-dc39c8986a5c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140645320652544)>
# Thread: <Thread(Thread-118, started daemon 140645337437952)>
# Thread: <_MainThread(MainThread, started 140646116546304)>

----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575245580.14_7d10de80-86a8-4622-b70e-338ea53bb9cd failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 293.346s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 41s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/psiroieuwxsuy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1663

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1663/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/01 18:14:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:46665
19/12/01 18:14:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 18:14:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 18:14:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 18:14:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575224047.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52543', 'job_port': u'0'}
19/12/01 18:14:09 INFO statecache.__init__: Creating state cache with size 0
19/12/01 18:14:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36243.
19/12/01 18:14:09 INFO sdk_worker.__init__: Control channel established.
19/12/01 18:14:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 18:14:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/01 18:14:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44173.
19/12/01 18:14:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 18:14:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:44623
19/12/01 18:14:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 18:14:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/01 18:14:09 INFO sdk_worker.run: No more requests from control plane
19/12/01 18:14:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 18:14:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 18:14:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 18:14:09 INFO sdk_worker.run: Done consuming work.
19/12/01 18:14:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 18:14:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 18:14:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 18:14:11 INFO sdk_worker_main.main: Logging handler created.
19/12/01 18:14:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:41255
19/12/01 18:14:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 18:14:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 18:14:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 18:14:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575224047.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52543', 'job_port': u'0'}
19/12/01 18:14:11 INFO statecache.__init__: Creating state cache with size 0
19/12/01 18:14:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40929.
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/01 18:14:11 INFO sdk_worker.__init__: Control channel established.
19/12/01 18:14:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 18:14:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37819.
19/12/01 18:14:11 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 18:14:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:41279
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/01 18:14:11 INFO sdk_worker.run: No more requests from control plane
19/12/01 18:14:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 18:14:11 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 18:14:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 18:14:11 INFO sdk_worker.run: Done consuming work.
19/12/01 18:14:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 18:14:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 18:14:12 INFO sdk_worker_main.main: Logging handler created.
19/12/01 18:14:12 INFO sdk_worker_main.start: Status HTTP server running at localhost:44021
19/12/01 18:14:12 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 18:14:12 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 18:14:12 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 18:14:12 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575224047.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52543', 'job_port': u'0'}
19/12/01 18:14:12 INFO statecache.__init__: Creating state cache with size 0
19/12/01 18:14:12 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42507.
19/12/01 18:14:12 INFO sdk_worker.__init__: Control channel established.
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/01 18:14:12 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 18:14:12 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43667.
19/12/01 18:14:12 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 18:14:12 INFO data_plane.create_data_channel: Creating client data channel for localhost:33717
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/01 18:14:12 INFO sdk_worker.run: No more requests from control plane
19/12/01 18:14:12 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 18:14:12 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 18:14:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:12 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 18:14:12 INFO sdk_worker.run: Done consuming work.
19/12/01 18:14:12 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 18:14:12 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:12 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 18:14:13 INFO sdk_worker_main.main: Logging handler created.
19/12/01 18:14:13 INFO sdk_worker_main.start: Status HTTP server running at localhost:41751
19/12/01 18:14:13 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 18:14:13 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 18:14:13 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 18:14:13 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575224047.07', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52543', 'job_port': u'0'}
19/12/01 18:14:13 INFO statecache.__init__: Creating state cache with size 0
19/12/01 18:14:13 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44363.
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/01 18:14:13 INFO sdk_worker.__init__: Control channel established.
19/12/01 18:14:13 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 18:14:13 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45565.
19/12/01 18:14:13 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 18:14:13 INFO data_plane.create_data_channel: Creating client data channel for localhost:44899
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/01 18:14:13 INFO sdk_worker.run: No more requests from control plane
19/12/01 18:14:13 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 18:14:13 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 18:14:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:13 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 18:14:13 INFO sdk_worker.run: Done consuming work.
19/12/01 18:14:13 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 18:14:13 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 18:14:13 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 18:14:13 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575224047.07_d03acb74-235e-422c-924c-deda0bd67ed6 finished.
19/12/01 18:14:13 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/01 18:14:13 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9119c505-116b-413e-a604-0e42f7bffe15","basePath":"/tmp/sparktestrisL3D"}: {}
java.io.FileNotFoundException: /tmp/sparktestrisL3D/job_9119c505-116b-413e-a604-0e42f7bffe15/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140272040601344)>
# Thread: <Thread(Thread-118, started daemon 140271551510272)>
# Thread: <_MainThread(MainThread, started 140272821880576)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140271534724864)>
# Thread: <Thread(Thread-124, started daemon 140271543117568)>
# Thread: <Thread(Thread-118, started daemon 140271551510272)>
# Thread: <_MainThread(MainThread, started 140272821880576)>
# Thread: <Thread(wait_until_finish_read, started daemon 140272040601344)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575224036.85_df39dfc2-1351-44aa-b4cc-3152bef3a1ae failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 363.548s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 52s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/5mw34r72dozze

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1662

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1662/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/01 12:13:53 INFO sdk_worker_main.start: Status HTTP server running at localhost:46707
19/12/01 12:13:53 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 12:13:53 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 12:13:53 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 12:13:53 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575202430.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57751', 'job_port': u'0'}
19/12/01 12:13:53 INFO statecache.__init__: Creating state cache with size 0
19/12/01 12:13:53 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36091.
19/12/01 12:13:53 INFO sdk_worker.__init__: Control channel established.
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/01 12:13:53 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 12:13:53 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36939.
19/12/01 12:13:53 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 12:13:53 INFO data_plane.create_data_channel: Creating client data channel for localhost:40289
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/01 12:13:53 INFO sdk_worker.run: No more requests from control plane
19/12/01 12:13:53 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:53 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 12:13:53 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 12:13:53 INFO sdk_worker.run: Done consuming work.
19/12/01 12:13:53 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 12:13:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 12:13:54 INFO sdk_worker_main.main: Logging handler created.
19/12/01 12:13:54 INFO sdk_worker_main.start: Status HTTP server running at localhost:37659
19/12/01 12:13:54 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 12:13:54 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 12:13:54 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 12:13:54 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575202430.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57751', 'job_port': u'0'}
19/12/01 12:13:54 INFO statecache.__init__: Creating state cache with size 0
19/12/01 12:13:54 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33469.
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/01 12:13:54 INFO sdk_worker.__init__: Control channel established.
19/12/01 12:13:54 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 12:13:54 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38017.
19/12/01 12:13:54 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 12:13:54 INFO data_plane.create_data_channel: Creating client data channel for localhost:42779
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/12/01 12:13:54 INFO sdk_worker.run: No more requests from control plane
19/12/01 12:13:54 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:54 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 12:13:54 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 12:13:54 INFO sdk_worker.run: Done consuming work.
19/12/01 12:13:54 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 12:13:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 12:13:55 INFO sdk_worker_main.main: Logging handler created.
19/12/01 12:13:55 INFO sdk_worker_main.start: Status HTTP server running at localhost:45869
19/12/01 12:13:55 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 12:13:55 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 12:13:55 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 12:13:55 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575202430.51', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57751', 'job_port': u'0'}
19/12/01 12:13:55 INFO statecache.__init__: Creating state cache with size 0
19/12/01 12:13:55 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44459.
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/01 12:13:55 INFO sdk_worker.__init__: Control channel established.
19/12/01 12:13:55 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 12:13:55 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46813.
19/12/01 12:13:55 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 12:13:55 INFO data_plane.create_data_channel: Creating client data channel for localhost:40735
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/01 12:13:55 INFO sdk_worker.run: No more requests from control plane
19/12/01 12:13:55 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 12:13:55 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 12:13:55 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 12:13:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:55 INFO sdk_worker.run: Done consuming work.
19/12/01 12:13:55 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 12:13:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 12:13:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 12:13:56 INFO sdk_worker_main.main: Logging handler created.
19/12/01 12:13:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:38903
19/12/01 12:13:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 12:13:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 12:13:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 12:13:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575202430.51', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:57751', 'job_port': u'0'}
19/12/01 12:13:56 INFO statecache.__init__: Creating state cache with size 0
19/12/01 12:13:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43969.
19/12/01 12:13:56 INFO sdk_worker.__init__: Control channel established.
19/12/01 12:13:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/01 12:13:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44521.
19/12/01 12:13:56 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 12:13:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:40269
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/01 12:13:56 INFO sdk_worker.run: No more requests from control plane
19/12/01 12:13:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 12:13:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:56 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 12:13:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 12:13:56 INFO sdk_worker.run: Done consuming work.
19/12/01 12:13:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 12:13:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 12:13:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 12:13:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575202430.51_0a95af0d-f1e1-424a-82ce-af0c1978c806 finished.
19/12/01 12:13:57 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/01 12:13:57 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_ec3d552e-2f53-4045-ac2f-8ced9250d7d5","basePath":"/tmp/sparktestH370Sx"}: {}
java.io.FileNotFoundException: /tmp/sparktestH370Sx/job_ec3d552e-2f53-4045-ac2f-8ced9250d7d5/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140613191739136)>

# Thread: <Thread(Thread-119, started daemon 140613183346432)>

# Thread: <_MainThread(MainThread, started 140613970847488)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140613157119744)>

# Thread: <Thread(Thread-125, started daemon 140613165774592)>

# Thread: <_MainThread(MainThread, started 140613970847488)>

# Thread: <Thread(Thread-119, started daemon 140613183346432)>

# Thread: <Thread(wait_until_finish_read, started daemon 140613191739136)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575202417.02_c2504de6-3ed7-449a-9321-a85a8cb98407 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 359.553s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 0s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/widgowtoxhy5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1661

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1661/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/12/01 06:15:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:44161
19/12/01 06:15:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 06:15:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 06:15:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 06:15:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575180903.43', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41149', 'job_port': u'0'}
19/12/01 06:15:06 INFO statecache.__init__: Creating state cache with size 0
19/12/01 06:15:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45393.
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/01 06:15:06 INFO sdk_worker.__init__: Control channel established.
19/12/01 06:15:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 06:15:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45171.
19/12/01 06:15:06 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 06:15:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:41081
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/01 06:15:06 INFO sdk_worker.run: No more requests from control plane
19/12/01 06:15:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 06:15:06 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 06:15:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 06:15:06 INFO sdk_worker.run: Done consuming work.
19/12/01 06:15:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 06:15:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 06:15:07 INFO sdk_worker_main.main: Logging handler created.
19/12/01 06:15:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:42365
19/12/01 06:15:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 06:15:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 06:15:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 06:15:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575180903.43', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41149', 'job_port': u'0'}
19/12/01 06:15:07 INFO statecache.__init__: Creating state cache with size 0
19/12/01 06:15:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42973.
19/12/01 06:15:07 INFO sdk_worker.__init__: Control channel established.
19/12/01 06:15:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/01 06:15:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34961.
19/12/01 06:15:07 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 06:15:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:45213
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/01 06:15:07 INFO sdk_worker.run: No more requests from control plane
19/12/01 06:15:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:07 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 06:15:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 06:15:07 INFO sdk_worker.run: Done consuming work.
19/12/01 06:15:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 06:15:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 06:15:08 INFO sdk_worker_main.main: Logging handler created.
19/12/01 06:15:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:33693
19/12/01 06:15:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 06:15:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 06:15:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 06:15:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575180903.43', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41149', 'job_port': u'0'}
19/12/01 06:15:08 INFO statecache.__init__: Creating state cache with size 0
19/12/01 06:15:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43601.
19/12/01 06:15:08 INFO sdk_worker.__init__: Control channel established.
19/12/01 06:15:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/01 06:15:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36295.
19/12/01 06:15:08 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 06:15:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:35417
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/01 06:15:08 INFO sdk_worker.run: No more requests from control plane
19/12/01 06:15:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 06:15:08 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 06:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 06:15:08 INFO sdk_worker.run: Done consuming work.
19/12/01 06:15:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 06:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 06:15:09 INFO sdk_worker_main.main: Logging handler created.
19/12/01 06:15:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:38179
19/12/01 06:15:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 06:15:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 06:15:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 06:15:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575180903.43', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41149', 'job_port': u'0'}
19/12/01 06:15:09 INFO statecache.__init__: Creating state cache with size 0
19/12/01 06:15:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40855.
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/01 06:15:09 INFO sdk_worker.__init__: Control channel established.
19/12/01 06:15:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 06:15:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46451.
19/12/01 06:15:09 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 06:15:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:33645
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/12/01 06:15:09 INFO sdk_worker.run: No more requests from control plane
19/12/01 06:15:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 06:15:09 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 06:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 06:15:09 INFO sdk_worker.run: Done consuming work.
19/12/01 06:15:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 06:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 06:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 06:15:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575180903.43_39ed56dd-132f-4ddb-8fa2-877a640281ba finished.
19/12/01 06:15:09 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/01 06:15:09 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f422cdc0-36ca-4a3f-b490-a9f35c3af59a","basePath":"/tmp/sparktestLcRVOs"}: {}
java.io.FileNotFoundException: /tmp/sparktestLcRVOs/job_f422cdc0-36ca-4a3f-b490-a9f35c3af59a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140464365328128)>
# Thread: <Thread(Thread-118, started daemon 140464373720832)>
# Thread: <_MainThread(MainThread, started 140465152829184)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140464271382272)>
# Thread: <Thread(Thread-124, started daemon 140464279774976)>
# Thread: <Thread(Thread-118, started daemon 140464373720832)>
# Thread: <_MainThread(MainThread, started 140465152829184)>
# Thread: <Thread(wait_until_finish_read, started daemon 140464365328128)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575180892.6_b1ddb27a-2a34-40db-a13a-f5d4d66ec74a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 338.481s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 39s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/fsgxyjrls2aia

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1660

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1660/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/12/01 00:16:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8 on Spark master local
19/12/01 00:16:17 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/12/01 00:16:17 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8: Pipeline translated successfully. Computing outputs
19/12/01 00:16:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:18 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:45899
19/12/01 00:16:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:18 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46547.
19/12/01 00:16:18 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/12/01 00:16:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36457.
19/12/01 00:16:18 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:37683
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:18 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:18 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:18 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:19 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:40521
19/12/01 00:16:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:19 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43705.
19/12/01 00:16:19 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/12/01 00:16:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38251.
19/12/01 00:16:19 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:34325
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:19 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:19 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:19 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:20 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:38959
19/12/01 00:16:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:20 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36723.
19/12/01 00:16:20 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/12/01 00:16:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34523.
19/12/01 00:16:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:43515
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:20 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:20 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:20 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:20 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:34003
19/12/01 00:16:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:20 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37411.
19/12/01 00:16:20 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/12/01 00:16:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40947.
19/12/01 00:16:20 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:39605
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:20 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:20 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:20 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/12/01 00:16:21 INFO sdk_worker_main.main: Logging handler created.
19/12/01 00:16:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:38603
19/12/01 00:16:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/12/01 00:16:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/12/01 00:16:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/12/01 00:16:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575159376.84', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44733', 'job_port': u'0'}
19/12/01 00:16:21 INFO statecache.__init__: Creating state cache with size 0
19/12/01 00:16:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39541.
19/12/01 00:16:21 INFO sdk_worker.__init__: Control channel established.
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/12/01 00:16:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/12/01 00:16:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42945.
19/12/01 00:16:21 INFO sdk_worker.create_state_handler: State channel established.
19/12/01 00:16:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:39745
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/12/01 00:16:21 INFO sdk_worker.run: No more requests from control plane
19/12/01 00:16:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/12/01 00:16:21 INFO data_plane.close: Closing all cached grpc data channels.
19/12/01 00:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/12/01 00:16:21 INFO sdk_worker.run: Done consuming work.
19/12/01 00:16:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/12/01 00:16:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/12/01 00:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/12/01 00:16:21 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575159376.84_4487da24-21a5-4203-9bc0-ea21c5cfa0e8 finished.
19/12/01 00:16:21 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/12/01 00:16:21 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_b4f3d69f-4a39-4144-b371-18770b2f92f7","basePath":"/tmp/sparktestqn2xiM"}: {}
java.io.FileNotFoundException: /tmp/sparktestqn2xiM/job_b4f3d69f-4a39-4144-b371-18770b2f92f7/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575159368.3_d83954d4-a21b-4bbf-9dbd-2fe0eae8d139 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140119541540608)>
# Thread: <Thread(Thread-118, started daemon 140119533147904)>
# Thread: <_MainThread(MainThread, started 140120391169792)>

----------------------------------------------------------------------
Ran 38 tests in 266.073s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/jrbtxjbv53o34

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1659

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1659/display/redirect>

Changes:


------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1f64ba3aeb093c77c4d931fb6791b8b239be3f85 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1f64ba3aeb093c77c4d931fb6791b8b239be3f85
Commit message: "Merge pull request #10105: [BEAM-4776] Add metrics support to Java PortableRunner"
 > git rev-list --no-walk 1f64ba3aeb093c77c4d931fb6791b8b239be3f85 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :sdks:python:test-suites:portable:py2:sparkValidatesRunner
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 12933
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-12933.out.log
----- Last  20 lines from daemon log file - daemon-12933.out.log -----
18:05:13.801 [DEBUG] [org.gradle.launcher.daemon.server.DefaultDaemonConnection] thread 114: Received non-IO message from client: Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src}>
18:05:13.802 [INFO] [org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler] Received command: Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src}.>
18:05:13.802 [DEBUG] [org.gradle.launcher.daemon.server.DefaultIncomingConnectionHandler] Starting executing command: Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src}> with connection: socket connection from /0:0:0:0:0:0:0:1:46033 to /0:0:0:0:0:0:0:1:54266.
18:05:13.802 [ERROR] [org.gradle.launcher.daemon.server.DaemonStateCoordinator] Command execution: started DaemonCommandExecution[command = Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src},> connection = DefaultDaemonConnection: socket connection from /0:0:0:0:0:0:0:1:46033 to /0:0:0:0:0:0:0:1:54266] after 0.0 minutes of idle
18:05:13.802 [INFO] [org.gradle.launcher.daemon.server.DaemonRegistryUpdater] Marking the daemon as busy, address: [724d7790-1aa3-4ed1-8189-3232e24e40ba port:46033, addresses:[/0:0:0:0:0:0:0:1%lo, /127.0.0.1]]
18:05:13.802 [DEBUG] [org.gradle.launcher.daemon.registry.PersistentDaemonRegistry] Marking busy by address: [724d7790-1aa3-4ed1-8189-3232e24e40ba port:46033, addresses:[/0:0:0:0:0:0:0:1%lo, /127.0.0.1]]
18:05:13.802 [DEBUG] [org.gradle.cache.internal.DefaultFileLockManager] Waiting to acquire exclusive lock on daemon addresses registry.
18:05:13.803 [DEBUG] [org.gradle.cache.internal.DefaultFileLockManager] Lock acquired on daemon addresses registry.
18:05:13.803 [DEBUG] [org.gradle.cache.internal.DefaultFileLockManager] Releasing lock on daemon addresses registry.
18:05:13.803 [DEBUG] [org.gradle.launcher.daemon.server.DaemonStateCoordinator] resetting idle timer
18:05:13.803 [DEBUG] [org.gradle.launcher.daemon.server.DaemonStateCoordinator] daemon is running. Sleeping until state changes.
18:05:13.803 [INFO] [org.gradle.launcher.daemon.server.exec.StartBuildOrRespondWithBusy] Daemon is about to start building Build{id=a797e98f-ad75-422a-a42c-f30b1842b92f, currentDir=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src}.> Dispatching build started information...
18:05:13.803 [DEBUG] [org.gradle.launcher.daemon.server.SynchronizedDispatchConnection] thread 26: dispatching class org.gradle.launcher.daemon.protocol.BuildStarted
18:05:13.804 [DEBUG] [org.gradle.launcher.daemon.server.exec.EstablishBuildEnvironment] Configuring env variables: {PATH=/home/jenkins/tools/java/latest1.8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games, RUN_DISPLAY_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1659/display/redirect, HUDSON_HOME=/x1/jenkins/jenkins-home, RUN_CHANGES_DISPLAY_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1659/display/redirect?page=changes, JOB_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/, HUDSON_COOKIE=1c398942-becf-455b-9e59-da903e7e45f9, NIX_LABEL=ubuntu, SLACK_WEBHOOK_URL=****, SUDO_USER=yifanzou, MAIL=/var/mail/jenkins, WIN_LABEL=Windows, JENKINS_SERVER_COOKIE=f4ebd1e6b0d976e8, USERNAME=root, LOGNAME=jenkins, PWD=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src,> JENKINS_URL=https://builds.apache.org/, SHELL=/bin/bash, BUILD_TAG=jenkins-beam_PostCommit_Python_VR_Spark-1659, GIT_AUTHOR_EMAIL=builds@apache.org, LESSOPEN=| /usr/bin/lesspipe %s, ROOT_BUILD_CAUSE=TIMERTRIGGER, BUILD_CAUSE_TIMERTRIGGER=true, GIT_AUTHOR_NAME=jenkins, OLDPWD=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src,> JENKINS_HOME=/x1/jenkins/jenkins-home, sha1=master, NODE_NAME=apache-beam-jenkins-7, BUILD_DISPLAY_NAME=#1659, JOB_DISPLAY_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/display/redirect, GIT_BRANCH=origin/master, 
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:, SHLVL=1, GIT_PREVIOUS_COMMIT=1f64ba3aeb093c77c4d931fb6791b8b239be3f85, LESSCLOSE=/usr/bin/lesspipe %s %s, JAVA_HOME=/home/jenkins/tools/java/latest1.8, TERM=xterm-256color, BUILD_ID=1659, LANG=en_US.UTF-8, JOB_NAME=beam_PostCommit_Python_VR_Spark, SPARK_LOCAL_IP=127.0.0.1, BUILD_CAUSE=TIMERTRIGGER, SUDO_GID=1014, GIT_PREVIOUS_SUCCESSFUL_COMMIT=fc77c31fe7fa99425862de688251a09ded57bf54, NODE_LABELS=apache-beam-jenkins-7 beam, HUDSON_URL=https://builds.apache.org/, WORKSPACE=<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/,> ROOT_BUILD_CAUSE_TIMERTRIGGER=true, 
SUDO_UID=1013, _=/usr/bin/nohup, GIT_COMMIT=1f64ba3aeb093c77c4d931fb6791b8b239be3f85, COVERALLS_REPO_TOKEN=****, EXECUTOR_NUMBER=1, HUDSON_SERVER_COOKIE=f4ebd1e6b0d976e8, GIT_COMMITTER_NAME=jenkins, JOB_BASE_NAME=beam_PostCommit_Python_VR_Spark, USER=jenkins, SUDO_COMMAND=/bin/su jenkins, BUILD_NUMBER=1659, BUILD_URL=https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1659/, GIT_COMMITTER_EMAIL=builds@apache.org, GIT_URL=https://github.com/apache/beam.git, HOME=/home/jenkins}
18:05:13.805 [DEBUG] [org.gradle.launcher.daemon.server.exec.LogToClient] About to start relaying all logs to the client via the connection.
18:05:13.805 [INFO] [org.gradle.launcher.daemon.server.exec.LogToClient] The client will now receive all logging from the daemon (pid: 12933). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-12933.out.log
18:05:13.805 [INFO] [org.gradle.launcher.daemon.server.exec.LogAndCheckHealth] Starting 2nd build in daemon [uptime: 4 mins 55.395 secs, performance: 100%]
18:05:13.806 [DEBUG] [org.gradle.launcher.daemon.server.exec.ExecuteBuild] The daemon has started executing the build.
18:05:13.806 [DEBUG] [org.gradle.launcher.daemon.server.exec.ExecuteBuild] Executing build with daemon context: DefaultDaemonContext[uid=ff25c547-99d5-42b9-9f17-59fef341331e,javaHome=/usr/lib/jvm/java-8-openjdk-amd64,daemonRegistryDir=/home/jenkins/.gradle/daemon,pid=12933,idleTimeout=10800000,priority=NORMAL,daemonOpts=-Xmx4g,-Dfile.encoding=UTF-8,-Duser.country=US,-Duser.language=en,-Duser.variant]
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
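One detail worth noting from the launch command recorded above: it passes `-Dorg.gradle.jvmargs` twice (`-Xms2g`, then `-Xmx4g`). JVM-style launchers typically apply `-D` definitions in order, so a later definition of the same system property overrides the earlier one and the `-Xms2g` setting is silently lost. A minimal sketch of that last-wins behavior (helper name hypothetical, not part of Gradle):

```python
def system_properties(argv):
    """Collect -Dkey=value definitions the way a JVM launcher typically does:
    a later definition of the same key overrides an earlier one."""
    props = {}
    for arg in argv:
        if arg.startswith("-D") and "=" in arg:
            key, value = arg[2:].split("=", 1)
            props[key] = value  # last -D for a given key wins
    return props

argv = ["-Dorg.gradle.jvmargs=-Xms2g", "-Dorg.gradle.jvmargs=-Xmx4g"]
print(system_properties(argv)["org.gradle.jvmargs"])  # -Xmx4g; -Xms2g is dropped
```

Combining both flags into a single property (e.g. `-Dorg.gradle.jvmargs="-Xms2g -Xmx4g"`) would keep both settings.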



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1658

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1658/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/30 12:14:32 INFO sdk_worker_main.start: Status HTTP server running at localhost:35437
19/11/30 12:14:32 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 12:14:32 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 12:14:32 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 12:14:32 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575116069.86', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52465', 'job_port': u'0'}
19/11/30 12:14:32 INFO statecache.__init__: Creating state cache with size 0
19/11/30 12:14:32 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36037.
19/11/30 12:14:32 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/30 12:14:32 INFO sdk_worker.__init__: Control channel established.
19/11/30 12:14:32 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 12:14:32 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43925.
19/11/30 12:14:32 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 12:14:32 INFO data_plane.create_data_channel: Creating client data channel for localhost:42373
19/11/30 12:14:32 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 12:14:32 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 12:14:32 INFO sdk_worker.run: No more requests from control plane
19/11/30 12:14:32 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 12:14:32 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 12:14:32 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:32 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 12:14:32 INFO sdk_worker.run: Done consuming work.
19/11/30 12:14:32 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 12:14:32 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 12:14:33 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:33 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 12:14:33 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 12:14:33 INFO sdk_worker_main.main: Logging handler created.
19/11/30 12:14:33 INFO sdk_worker_main.start: Status HTTP server running at localhost:39827
19/11/30 12:14:33 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 12:14:33 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 12:14:33 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 12:14:33 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575116069.86', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52465', 'job_port': u'0'}
19/11/30 12:14:33 INFO statecache.__init__: Creating state cache with size 0
19/11/30 12:14:33 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35191.
19/11/30 12:14:33 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/30 12:14:33 INFO sdk_worker.__init__: Control channel established.
19/11/30 12:14:33 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 12:14:34 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32903.
19/11/30 12:14:34 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 12:14:34 INFO data_plane.create_data_channel: Creating client data channel for localhost:36261
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 12:14:34 INFO sdk_worker.run: No more requests from control plane
19/11/30 12:14:34 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 12:14:34 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 12:14:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:34 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 12:14:34 INFO sdk_worker.run: Done consuming work.
19/11/30 12:14:34 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 12:14:34 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 12:14:34 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 12:14:34 INFO sdk_worker_main.main: Logging handler created.
19/11/30 12:14:34 INFO sdk_worker_main.start: Status HTTP server running at localhost:41055
19/11/30 12:14:34 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 12:14:34 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 12:14:35 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 12:14:35 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575116069.86', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52465', 'job_port': u'0'}
19/11/30 12:14:35 INFO statecache.__init__: Creating state cache with size 0
19/11/30 12:14:35 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33361.
19/11/30 12:14:35 INFO sdk_worker.__init__: Control channel established.
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/30 12:14:35 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 12:14:35 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44789.
19/11/30 12:14:35 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 12:14:35 INFO data_plane.create_data_channel: Creating client data channel for localhost:33401
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 12:14:35 INFO sdk_worker.run: No more requests from control plane
19/11/30 12:14:35 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 12:14:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:35 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 12:14:35 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 12:14:35 INFO sdk_worker.run: Done consuming work.
19/11/30 12:14:35 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 12:14:35 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 12:14:36 INFO sdk_worker_main.main: Logging handler created.
19/11/30 12:14:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:37463
19/11/30 12:14:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 12:14:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 12:14:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 12:14:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575116069.86', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52465', 'job_port': u'0'}
19/11/30 12:14:36 INFO statecache.__init__: Creating state cache with size 0
19/11/30 12:14:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39681.
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/30 12:14:36 INFO sdk_worker.__init__: Control channel established.
19/11/30 12:14:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 12:14:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38379.
19/11/30 12:14:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 12:14:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:36309
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/30 12:14:36 INFO sdk_worker.run: No more requests from control plane
19/11/30 12:14:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 12:14:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 12:14:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 12:14:36 INFO sdk_worker.run: Done consuming work.
19/11/30 12:14:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 12:14:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 12:14:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 12:14:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575116069.86_3cd1cbfa-7915-4c83-aa7b-2c6f436fef16 finished.
19/11/30 12:14:36 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/30 12:14:36 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9221b2c7-6b45-4164-b4db-a41bc751aa71","basePath":"/tmp/sparktestBOKZU6"}: {}
java.io.FileNotFoundException: /tmp/sparktestBOKZU6/job_9221b2c7-6b45-4164-b4db-a41bc751aa71/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
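The stack trace above is benign: this job staged no artifacts (the retrieval service logs `GetManifest for __no_artifacts_staged__`), so no MANIFEST file was ever written, and the cleanup path fails when it tries to open one. A hedged sketch of the defensive check that would sidestep the error (function name and directory layout hypothetical, not Beam's actual code):

```python
import os

def remove_staging_dir(base_path, session_id):
    """Best-effort cleanup of a job's artifact staging directory.

    A missing MANIFEST means no artifacts were staged for this job;
    that is a normal case, not an error.
    """
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, "MANIFEST")
    if not os.path.exists(manifest):
        return False  # nothing was staged; skip quietly
    os.remove(manifest)
    # Remove any remaining staged files, then the directory itself.
    for name in os.listdir(job_dir):
        os.remove(os.path.join(job_dir, name))
    os.rmdir(job_dir)
    return True
```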
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
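The `handler` frame at the bottom of the traceback is the test harness's watchdog raising `BaseException` once a pipeline hangs past its deadline. A minimal sketch of that signal-based pattern (class name and details hypothetical; `SIGALRM` works only on the main thread of a Unix process):

```python
import signal

class Watchdog:
    """Raise BaseException if the wrapped block runs longer than `seconds`."""

    def __init__(self, seconds):
        self.seconds = seconds

    def _handler(self, signum, frame):
        raise BaseException("Timed out after %d seconds." % self.seconds)

    def __enter__(self):
        signal.signal(signal.SIGALRM, self._handler)
        signal.alarm(self.seconds)  # schedule SIGALRM after `seconds`
        return self

    def __exit__(self, *exc_info):
        signal.alarm(0)  # cancel the alarm if the block finished in time
        return False

# Usage: bound a blocking wait that might otherwise hang forever, e.g.
# with Watchdog(60):
#     pipeline.run().wait_until_finish()
```

`BaseException` (rather than `Exception`) is deliberate in this pattern: it escapes ordinary `except Exception` handlers inside the code under test, so a hung pipeline cannot swallow the timeout.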

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
==================== Timed out after 60 seconds. ====================

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(wait_until_finish_read, started daemon 139711570622208)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-117, started daemon 139711553836800)>

# Thread: <_MainThread(MainThread, started 139712357693184)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139711537051392)>

# Thread: <Thread(Thread-123, started daemon 139711545444096)>

# Thread: <_MainThread(MainThread, started 139712357693184)>

# Thread: <Thread(Thread-117, started daemon 139711553836800)>

# Thread: <Thread(wait_until_finish_read, started daemon 139711570622208)>
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575116057.53_71ca086a-5c44-40c8-a56f-f6dc227b021c failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 390.704s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 53s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/tlkgrr3zr3c3o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1657

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1657/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/11/30 06:13:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96 on Spark master local
19/11/30 06:13:44 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/30 06:13:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96: Pipeline translated successfully. Computing outputs
19/11/30 06:13:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:44 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:44379
19/11/30 06:13:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:44 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:32967.
19/11/30 06:13:44 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/30 06:13:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35435.
19/11/30 06:13:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:36853
19/11/30 06:13:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/30 06:13:45 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:45 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:45 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:35519
19/11/30 06:13:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:45 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35925.
19/11/30 06:13:45 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/30 06:13:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43781.
19/11/30 06:13:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:43017
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/30 06:13:45 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:45 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:46 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:45115
19/11/30 06:13:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:46 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46439.
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/30 06:13:46 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33719.
19/11/30 06:13:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:35047
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/30 06:13:46 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:46 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:47 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:46371
19/11/30 06:13:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:47 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43573.
19/11/30 06:13:47 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/30 06:13:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35163.
19/11/30 06:13:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:40869
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/30 06:13:47 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:47 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 06:13:48 INFO sdk_worker_main.main: Logging handler created.
19/11/30 06:13:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:46235
19/11/30 06:13:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 06:13:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 06:13:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 06:13:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575094423.21', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55161', 'job_port': u'0'}
19/11/30 06:13:48 INFO statecache.__init__: Creating state cache with size 0
19/11/30 06:13:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40823.
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/30 06:13:48 INFO sdk_worker.__init__: Control channel established.
19/11/30 06:13:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 06:13:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33777.
19/11/30 06:13:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 06:13:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:46709
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/30 06:13:48 INFO sdk_worker.run: No more requests from control plane
19/11/30 06:13:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 06:13:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 06:13:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 06:13:48 INFO sdk_worker.run: Done consuming work.
19/11/30 06:13:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 06:13:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 06:13:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 06:13:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575094423.21_b7416250-51ef-463d-ab15-359481c8eb96 finished.
19/11/30 06:13:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/30 06:13:48 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_bb19bb58-dba2-4d67-a447-c69e8cc3bab0","basePath":"/tmp/sparktest5nkpoo"}: {}
java.io.FileNotFoundException: /tmp/sparktest5nkpoo/job_bb19bb58-dba2-4d67-a447-c69e8cc3bab0/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575094414.53_7de8c860-0f3b-4ceb-b8c1-efcb0017e56f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139990183896832)>
# Thread: <Thread(Thread-119, started daemon 139990175504128)>
# Thread: <_MainThread(MainThread, started 139990979360512)>

----------------------------------------------------------------------
Ran 38 tests in 290.919s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 27s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ldwkadgw26evy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1656

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1656/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/30 00:14:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:36521
19/11/30 00:14:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 00:14:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 00:14:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 00:14:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575072885.57', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37285', 'job_port': u'0'}
19/11/30 00:14:48 INFO statecache.__init__: Creating state cache with size 0
19/11/30 00:14:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36389.
19/11/30 00:14:48 INFO sdk_worker.__init__: Control channel established.
19/11/30 00:14:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/30 00:14:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39833.
19/11/30 00:14:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 00:14:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:46531
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/30 00:14:48 INFO sdk_worker.run: No more requests from control plane
19/11/30 00:14:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 00:14:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 00:14:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 00:14:48 INFO sdk_worker.run: Done consuming work.
19/11/30 00:14:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 00:14:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 00:14:49 INFO sdk_worker_main.main: Logging handler created.
19/11/30 00:14:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:44247
19/11/30 00:14:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 00:14:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 00:14:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 00:14:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575072885.57', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37285', 'job_port': u'0'}
19/11/30 00:14:49 INFO statecache.__init__: Creating state cache with size 0
19/11/30 00:14:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35123.
19/11/30 00:14:49 INFO sdk_worker.__init__: Control channel established.
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/30 00:14:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 00:14:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44495.
19/11/30 00:14:49 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 00:14:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:39499
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/30 00:14:49 INFO sdk_worker.run: No more requests from control plane
19/11/30 00:14:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 00:14:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 00:14:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 00:14:49 INFO sdk_worker.run: Done consuming work.
19/11/30 00:14:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 00:14:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 00:14:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 00:14:50 INFO sdk_worker_main.main: Logging handler created.
19/11/30 00:14:50 INFO sdk_worker_main.start: Status HTTP server running at localhost:39669
19/11/30 00:14:50 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 00:14:50 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 00:14:50 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 00:14:50 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575072885.57', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37285', 'job_port': u'0'}
19/11/30 00:14:50 INFO statecache.__init__: Creating state cache with size 0
19/11/30 00:14:50 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40087.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/30 00:14:50 INFO sdk_worker.__init__: Control channel established.
19/11/30 00:14:50 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 00:14:50 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40339.
19/11/30 00:14:50 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 00:14:50 INFO data_plane.create_data_channel: Creating client data channel for localhost:42267
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/30 00:14:50 INFO sdk_worker.run: No more requests from control plane
19/11/30 00:14:50 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 00:14:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:50 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 00:14:50 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 00:14:50 INFO sdk_worker.run: Done consuming work.
19/11/30 00:14:50 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 00:14:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/30 00:14:51 INFO sdk_worker_main.main: Logging handler created.
19/11/30 00:14:51 INFO sdk_worker_main.start: Status HTTP server running at localhost:34937
19/11/30 00:14:51 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/30 00:14:51 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/30 00:14:51 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/30 00:14:51 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575072885.57', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:37285', 'job_port': u'0'}
19/11/30 00:14:51 INFO statecache.__init__: Creating state cache with size 0
19/11/30 00:14:51 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44903.
19/11/30 00:14:51 INFO sdk_worker.__init__: Control channel established.
19/11/30 00:14:51 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/30 00:14:51 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44183.
19/11/30 00:14:51 INFO sdk_worker.create_state_handler: State channel established.
19/11/30 00:14:51 INFO data_plane.create_data_channel: Creating client data channel for localhost:36255
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/30 00:14:51 INFO sdk_worker.run: No more requests from control plane
19/11/30 00:14:51 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/30 00:14:51 INFO data_plane.close: Closing all cached grpc data channels.
19/11/30 00:14:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:51 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/30 00:14:51 INFO sdk_worker.run: Done consuming work.
19/11/30 00:14:51 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/30 00:14:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/30 00:14:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/30 00:14:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575072885.57_58a7d6ba-405a-4636-94bf-9bb8da9f0a3e finished.
19/11/30 00:14:51 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/30 00:14:51 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_48db0018-07ef-4b59-982a-8f3db09921c6","basePath":"/tmp/sparktestSxlOWN"}: {}
java.io.FileNotFoundException: /tmp/sparktestSxlOWN/job_48db0018-07ef-4b59-982a-8f3db09921c6/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139853024847616)>
# Thread: <Thread(Thread-118, started daemon 139853008062208)>
# Thread: <_MainThread(MainThread, started 139853804586752)>
==================== Timed out after 60 seconds. ====================

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575072874.84_228de8fb-7cdf-4d8d-a149-6e844e40a68d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 139852918023936)>
# Thread: <Thread(Thread-123, started daemon 139852926416640)>
# Thread: <Thread(Thread-118, started daemon 139853008062208)>
# Thread: <Thread(wait_until_finish_read, started daemon 139853024847616)>
# Thread: <_MainThread(MainThread, started 139853804586752)>

----------------------------------------------------------------------
Ran 38 tests in 346.700s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 3s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ljuieyrac5ssg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1655

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1655/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/29 18:14:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:38003
19/11/29 18:14:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 18:14:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 18:14:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 18:14:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575051259.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55047', 'job_port': u'0'}
19/11/29 18:14:22 INFO statecache.__init__: Creating state cache with size 0
19/11/29 18:14:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43597.
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 18:14:22 INFO sdk_worker.__init__: Control channel established.
19/11/29 18:14:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 18:14:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45301.
19/11/29 18:14:22 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 18:14:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:33895
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 18:14:22 INFO sdk_worker.run: No more requests from control plane
19/11/29 18:14:22 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 18:14:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:22 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 18:14:22 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 18:14:22 INFO sdk_worker.run: Done consuming work.
19/11/29 18:14:22 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 18:14:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 18:14:23 INFO sdk_worker_main.main: Logging handler created.
19/11/29 18:14:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:44011
19/11/29 18:14:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 18:14:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 18:14:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 18:14:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575051259.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55047', 'job_port': u'0'}
19/11/29 18:14:23 INFO statecache.__init__: Creating state cache with size 0
19/11/29 18:14:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45871.
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 18:14:23 INFO sdk_worker.__init__: Control channel established.
19/11/29 18:14:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 18:14:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37195.
19/11/29 18:14:23 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 18:14:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:41657
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 18:14:23 INFO sdk_worker.run: No more requests from control plane
19/11/29 18:14:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 18:14:23 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 18:14:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 18:14:23 INFO sdk_worker.run: Done consuming work.
19/11/29 18:14:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 18:14:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 18:14:24 INFO sdk_worker_main.main: Logging handler created.
19/11/29 18:14:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:38091
19/11/29 18:14:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 18:14:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 18:14:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 18:14:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575051259.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55047', 'job_port': u'0'}
19/11/29 18:14:24 INFO statecache.__init__: Creating state cache with size 0
19/11/29 18:14:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46499.
19/11/29 18:14:24 INFO sdk_worker.__init__: Control channel established.
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 18:14:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 18:14:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44909.
19/11/29 18:14:24 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 18:14:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:35809
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 18:14:24 INFO sdk_worker.run: No more requests from control plane
19/11/29 18:14:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 18:14:24 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 18:14:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 18:14:24 INFO sdk_worker.run: Done consuming work.
19/11/29 18:14:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 18:14:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 18:14:25 INFO sdk_worker_main.main: Logging handler created.
19/11/29 18:14:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:35301
19/11/29 18:14:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 18:14:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 18:14:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 18:14:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575051259.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:55047', 'job_port': u'0'}
19/11/29 18:14:25 INFO statecache.__init__: Creating state cache with size 0
19/11/29 18:14:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43685.
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 18:14:25 INFO sdk_worker.__init__: Control channel established.
19/11/29 18:14:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 18:14:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43791.
19/11/29 18:14:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 18:14:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:40761
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 18:14:25 INFO sdk_worker.run: No more requests from control plane
19/11/29 18:14:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 18:14:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 18:14:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 18:14:25 INFO sdk_worker.run: Done consuming work.
19/11/29 18:14:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 18:14:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 18:14:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 18:14:25 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575051259.83_a6b2b250-ccf9-42f3-b596-f47b9aaddda0 finished.
19/11/29 18:14:25 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 18:14:25 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_b1b2af48-4685-432e-8479-5d19945bde69","basePath":"/tmp/sparktest15mcnz"}: {}
java.io.FileNotFoundException: /tmp/sparktest15mcnz/job_b1b2af48-4685-432e-8479-5d19945bde69/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
==================== Timed out after 60 seconds. ====================
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140310801803008)>


# Thread: <Thread(Thread-117, started daemon 140310785017600)>

# Thread: <_MainThread(MainThread, started 140311654516480)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140310768232192)>

# Thread: <Thread(Thread-123, started daemon 140310776624896)>

# Thread: <Thread(Thread-117, started daemon 140310785017600)>

# Thread: <Thread(wait_until_finish_read, started daemon 140310801803008)>

# Thread: <_MainThread(MainThread, started 140311654516480)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575051249.94_4caff44e-cd3d-4568-991e-29abb3e23b24 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 310.058s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 22s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/dndm7ra5cuakq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1654

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1654/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/29 12:12:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:42615
19/11/29 12:12:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 12:12:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 12:12:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 12:12:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575029554.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48583', 'job_port': u'0'}
19/11/29 12:12:37 INFO statecache.__init__: Creating state cache with size 0
19/11/29 12:12:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45481.
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 12:12:37 INFO sdk_worker.__init__: Control channel established.
19/11/29 12:12:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 12:12:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33027.
19/11/29 12:12:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 12:12:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:32901
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 12:12:37 INFO sdk_worker.run: No more requests from control plane
19/11/29 12:12:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 12:12:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 12:12:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 12:12:37 INFO sdk_worker.run: Done consuming work.
19/11/29 12:12:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 12:12:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 12:12:38 INFO sdk_worker_main.main: Logging handler created.
19/11/29 12:12:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:37781
19/11/29 12:12:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 12:12:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 12:12:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 12:12:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575029554.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48583', 'job_port': u'0'}
19/11/29 12:12:38 INFO statecache.__init__: Creating state cache with size 0
19/11/29 12:12:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42357.
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 12:12:38 INFO sdk_worker.__init__: Control channel established.
19/11/29 12:12:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 12:12:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43987.
19/11/29 12:12:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 12:12:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:39295
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 12:12:38 INFO sdk_worker.run: No more requests from control plane
19/11/29 12:12:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 12:12:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 12:12:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 12:12:38 INFO sdk_worker.run: Done consuming work.
19/11/29 12:12:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 12:12:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 12:12:39 INFO sdk_worker_main.main: Logging handler created.
19/11/29 12:12:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:35499
19/11/29 12:12:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 12:12:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 12:12:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 12:12:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575029554.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48583', 'job_port': u'0'}
19/11/29 12:12:39 INFO statecache.__init__: Creating state cache with size 0
19/11/29 12:12:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36363.
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 12:12:39 INFO sdk_worker.__init__: Control channel established.
19/11/29 12:12:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 12:12:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33791.
19/11/29 12:12:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 12:12:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:43577
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 12:12:39 INFO sdk_worker.run: No more requests from control plane
19/11/29 12:12:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 12:12:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 12:12:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 12:12:39 INFO sdk_worker.run: Done consuming work.
19/11/29 12:12:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 12:12:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 12:12:40 INFO sdk_worker_main.main: Logging handler created.
19/11/29 12:12:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:43873
19/11/29 12:12:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 12:12:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 12:12:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 12:12:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575029554.43', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48583', 'job_port': u'0'}
19/11/29 12:12:40 INFO statecache.__init__: Creating state cache with size 0
19/11/29 12:12:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35203.
19/11/29 12:12:40 INFO sdk_worker.__init__: Control channel established.
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 12:12:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 12:12:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41033.
19/11/29 12:12:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 12:12:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:38461
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 12:12:40 INFO sdk_worker.run: No more requests from control plane
19/11/29 12:12:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 12:12:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 12:12:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 12:12:40 INFO sdk_worker.run: Done consuming work.
19/11/29 12:12:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 12:12:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 12:12:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 12:12:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575029554.43_b56895fb-7feb-4107-8f0b-1aa0b3b6c877 finished.
19/11/29 12:12:40 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 12:12:40 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d8cf24be-967d-4d0a-98a7-551048e51290","basePath":"/tmp/sparktesteS8EBL"}: {}
java.io.FileNotFoundException: /tmp/sparktesteS8EBL/job_d8cf24be-967d-4d0a-98a7-551048e51290/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139668226037504)>

# Thread: <Thread(Thread-117, started daemon 139668242822912)>

# Thread: <_MainThread(MainThread, started 139669361047296)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139668209252096)>

# Thread: <Thread(Thread-123, started daemon 139668217644800)>

# Thread: <_MainThread(MainThread, started 139669361047296)>

# Thread: <Thread(Thread-117, started daemon 139668242822912)>

# Thread: <Thread(wait_until_finish_read, started daemon 139668226037504)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575029543.72_f2ed65e8-5f92-4c62-b1f1-05bf65105e97 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 355.012s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 7s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/vrzmgbbhc4q5k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1653

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1653/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-4776] Add metrics support to Java PortableRunner


------------------------------------------
[...truncated 1.32 MB...]
19/11/29 10:50:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:42463
19/11/29 10:50:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 10:50:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 10:50:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 10:50:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575024605.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56299', 'job_port': u'0'}
19/11/29 10:50:08 INFO statecache.__init__: Creating state cache with size 0
19/11/29 10:50:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43531.
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 10:50:08 INFO sdk_worker.__init__: Control channel established.
19/11/29 10:50:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 10:50:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37503.
19/11/29 10:50:08 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 10:50:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:35437
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 10:50:08 INFO sdk_worker.run: No more requests from control plane
19/11/29 10:50:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 10:50:08 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 10:50:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 10:50:08 INFO sdk_worker.run: Done consuming work.
19/11/29 10:50:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 10:50:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 10:50:09 INFO sdk_worker_main.main: Logging handler created.
19/11/29 10:50:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:40289
19/11/29 10:50:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 10:50:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 10:50:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 10:50:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575024605.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56299', 'job_port': u'0'}
19/11/29 10:50:09 INFO statecache.__init__: Creating state cache with size 0
19/11/29 10:50:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36399.
19/11/29 10:50:09 INFO sdk_worker.__init__: Control channel established.
19/11/29 10:50:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 10:50:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36603.
19/11/29 10:50:09 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 10:50:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:35241
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 10:50:09 INFO sdk_worker.run: No more requests from control plane
19/11/29 10:50:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 10:50:09 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 10:50:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 10:50:09 INFO sdk_worker.run: Done consuming work.
19/11/29 10:50:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 10:50:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 10:50:10 INFO sdk_worker_main.main: Logging handler created.
19/11/29 10:50:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:34789
19/11/29 10:50:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 10:50:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 10:50:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 10:50:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575024605.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56299', 'job_port': u'0'}
19/11/29 10:50:10 INFO statecache.__init__: Creating state cache with size 0
19/11/29 10:50:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45001.
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 10:50:10 INFO sdk_worker.__init__: Control channel established.
19/11/29 10:50:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 10:50:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34823.
19/11/29 10:50:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 10:50:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:40903
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 10:50:10 INFO sdk_worker.run: No more requests from control plane
19/11/29 10:50:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 10:50:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 10:50:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 10:50:10 INFO sdk_worker.run: Done consuming work.
19/11/29 10:50:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 10:50:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 10:50:11 INFO sdk_worker_main.main: Logging handler created.
19/11/29 10:50:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:33847
19/11/29 10:50:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 10:50:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 10:50:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 10:50:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575024605.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56299', 'job_port': u'0'}
19/11/29 10:50:11 INFO statecache.__init__: Creating state cache with size 0
19/11/29 10:50:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44901.
19/11/29 10:50:11 INFO sdk_worker.__init__: Control channel established.
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 10:50:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 10:50:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36833.
19/11/29 10:50:11 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 10:50:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:42143
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 10:50:11 INFO sdk_worker.run: No more requests from control plane
19/11/29 10:50:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 10:50:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:11 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 10:50:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 10:50:11 INFO sdk_worker.run: Done consuming work.
19/11/29 10:50:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 10:50:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 10:50:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 10:50:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575024605.76_33164bde-7f4b-4610-ad8a-a3da2692fca6 finished.
19/11/29 10:50:11 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 10:50:11 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d6ae00cf-7dd2-4474-85b7-0d60cdcf768a","basePath":"/tmp/sparktestZcj4xW"}: {}
java.io.FileNotFoundException: /tmp/sparktestZcj4xW/job_d6ae00cf-7dd2-4474-85b7-0d60cdcf768a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140121965852416)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
# Thread: <Thread(Thread-116, started daemon 140122450085632)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
# Thread: <_MainThread(MainThread, started 140123236865792)>
==================== Timed out after 60 seconds. ====================

  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140121949067008)>

# Thread: <Thread(Thread-122, started daemon 140121957459712)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
# Thread: <Thread(Thread-116, started daemon 140122450085632)>
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575024595.75_fc4af90c-6c4b-432a-b0bc-121618f770fb failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 374.762s

FAILED (errors=3, skipped=9)
# Thread: <_MainThread(MainThread, started 140123236865792)>

# Thread: <Thread(wait_until_finish_read, started daemon 140121965852416)>

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 34s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/hthqaapepoflo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1652

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1652/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/29 06:11:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:42821
19/11/29 06:11:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 06:11:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 06:11:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 06:11:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575007882.93', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41345', 'job_port': u'0'}
19/11/29 06:11:25 INFO statecache.__init__: Creating state cache with size 0
19/11/29 06:11:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38501.
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 06:11:25 INFO sdk_worker.__init__: Control channel established.
19/11/29 06:11:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 06:11:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46805.
19/11/29 06:11:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 06:11:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:37533
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 06:11:25 INFO sdk_worker.run: No more requests from control plane
19/11/29 06:11:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 06:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 06:11:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 06:11:25 INFO sdk_worker.run: Done consuming work.
19/11/29 06:11:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 06:11:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 06:11:26 INFO sdk_worker_main.main: Logging handler created.
19/11/29 06:11:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:39925
19/11/29 06:11:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 06:11:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 06:11:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 06:11:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575007882.93', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41345', 'job_port': u'0'}
19/11/29 06:11:26 INFO statecache.__init__: Creating state cache with size 0
19/11/29 06:11:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37647.
19/11/29 06:11:26 INFO sdk_worker.__init__: Control channel established.
19/11/29 06:11:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 06:11:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34887.
19/11/29 06:11:26 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 06:11:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:38961
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/29 06:11:26 INFO sdk_worker.run: No more requests from control plane
19/11/29 06:11:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 06:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 06:11:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 06:11:26 INFO sdk_worker.run: Done consuming work.
19/11/29 06:11:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 06:11:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 06:11:26 INFO sdk_worker_main.main: Logging handler created.
19/11/29 06:11:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:44403
19/11/29 06:11:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 06:11:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 06:11:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 06:11:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575007882.93', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41345', 'job_port': u'0'}
19/11/29 06:11:26 INFO statecache.__init__: Creating state cache with size 0
19/11/29 06:11:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38545.
19/11/29 06:11:26 INFO sdk_worker.__init__: Control channel established.
19/11/29 06:11:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 06:11:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39013.
19/11/29 06:11:26 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 06:11:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:44051
19/11/29 06:11:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 06:11:27 INFO sdk_worker.run: No more requests from control plane
19/11/29 06:11:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 06:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 06:11:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 06:11:27 INFO sdk_worker.run: Done consuming work.
19/11/29 06:11:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 06:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 06:11:27 INFO sdk_worker_main.main: Logging handler created.
19/11/29 06:11:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:40567
19/11/29 06:11:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 06:11:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 06:11:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 06:11:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1575007882.93', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41345', 'job_port': u'0'}
19/11/29 06:11:27 INFO statecache.__init__: Creating state cache with size 0
19/11/29 06:11:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39163.
19/11/29 06:11:27 INFO sdk_worker.__init__: Control channel established.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 06:11:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 06:11:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44101.
19/11/29 06:11:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 06:11:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:44047
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 06:11:27 INFO sdk_worker.run: No more requests from control plane
19/11/29 06:11:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 06:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 06:11:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 06:11:27 INFO sdk_worker.run: Done consuming work.
19/11/29 06:11:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 06:11:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 06:11:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 06:11:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1575007882.93_4ede7c62-9a82-402b-8731-afff60c28df3 finished.
19/11/29 06:11:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 06:11:27 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_bb078cef-db22-483a-8ea3-85f0ce69cb25","basePath":"/tmp/sparktestAIrvRJ"}: {}
java.io.FileNotFoundException: /tmp/sparktestAIrvRJ/job_bb078cef-db22-483a-8ea3-85f0ce69cb25/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
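[Editor's note] The FileNotFoundException above is a cleanup race: by the time removeArtifacts runs for a job with no staged artifacts, the MANIFEST is already gone. A minimal Python sketch of a tolerant cleanup, under the assumption that a missing manifest simply means there is nothing to remove (the helper name and layout are illustrative, not Beam's actual code):

```python
import os
import shutil

def remove_staging_dir(base_path, session_id):
    # Hypothetical cleanup helper: treat a missing MANIFEST as
    # "already cleaned up / never staged" instead of surfacing an
    # error, then remove whatever remains of the staging directory.
    staging_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(staging_dir, 'MANIFEST')
    if not os.path.exists(manifest):
        return False  # nothing staged, nothing to remove
    shutil.rmtree(staging_dir, ignore_errors=True)
    return True
```

The point of the sketch is the existence check before open/remove; logging an ERROR for this benign case is what makes the log above noisy.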
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
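[Editor's note] The traceback above ends in a test-harness handler (portable_runner_test.py line 75) that raises while the runner is still blocked in a gRPC wait. A minimal sketch of that watchdog pattern, assuming a SIGALRM-based timeout; the function name and wiring here are illustrative, not Beam's actual test API:

```python
import signal
import time

def run_with_timeout(fn, seconds):
    # Sketch of the watchdog behind "Timed out after 60 seconds.":
    # a SIGALRM handler raises BaseException so even a blocked gRPC
    # wait unwinds with a traceback instead of hanging the test run.
    def handler(signum, frame):
        raise BaseException('Timed out after %d seconds.' % seconds)
    old_handler = signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)
    try:
        return fn()
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)
```

Raising BaseException rather than Exception is deliberate: it escapes broad `except Exception` handlers inside the runner, which is why it surfaces as an ERROR here rather than being swallowed.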

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140636883773184)>

# Thread: <Thread(Thread-118, started daemon 140637236946688)>

# Thread: <_MainThread(MainThread, started 140638016186112)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140636875380480)>

# Thread: <Thread(Thread-124, started daemon 140636866987776)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
# Thread: <Thread(Thread-118, started daemon 140637236946688)>

# Thread: <Thread(wait_until_finish_read, started daemon 140636883773184)>

# Thread: <_MainThread(MainThread, started 140638016186112)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1575007874.29_a26ed542-8f8e-4a1b-b94b-47869cdd7acd failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 292.068s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 56s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/ksnchvfk64xhw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1651

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1651/display/redirect?page=changes>

Changes:

[milantracy] [BEAM-8406] Add support for JSON format text tables


------------------------------------------
[...truncated 1.31 MB...]
19/11/29 01:37:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c on Spark master local
19/11/29 01:37:35 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/29 01:37:35 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c: Pipeline translated successfully. Computing outputs
19/11/29 01:37:35 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:36 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:36 INFO sdk_worker_main.start: Status HTTP server running at localhost:39697
19/11/29 01:37:36 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:36 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:36 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:36 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:36 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:36 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46491.
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/29 01:37:36 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:36 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:36 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41617.
19/11/29 01:37:36 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:36 INFO data_plane.create_data_channel: Creating client data channel for localhost:39375
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 01:37:36 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:36 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:36 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:36 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:36 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:36 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:36 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:37 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:37475
19/11/29 01:37:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:37 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39535.
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 01:37:37 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38385.
19/11/29 01:37:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:37481
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 01:37:37 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:37 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:39 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:33617
19/11/29 01:37:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:39 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38643.
19/11/29 01:37:39 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 01:37:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42013.
19/11/29 01:37:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:45069
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 01:37:39 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:39 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:40 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:41715
19/11/29 01:37:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:40 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36951.
19/11/29 01:37:40 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 01:37:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34227.
19/11/29 01:37:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:42807
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 01:37:40 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:40 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 01:37:41 INFO sdk_worker_main.main: Logging handler created.
19/11/29 01:37:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:36879
19/11/29 01:37:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 01:37:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 01:37:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 01:37:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574991454.52', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:34069', 'job_port': u'0'}
19/11/29 01:37:41 INFO statecache.__init__: Creating state cache with size 0
19/11/29 01:37:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37759.
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 01:37:41 INFO sdk_worker.__init__: Control channel established.
19/11/29 01:37:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 01:37:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33627.
19/11/29 01:37:41 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 01:37:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:33507
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 01:37:41 INFO sdk_worker.run: No more requests from control plane
19/11/29 01:37:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 01:37:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 01:37:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 01:37:41 INFO sdk_worker.run: Done consuming work.
19/11/29 01:37:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 01:37:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 01:37:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 01:37:41 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574991454.52_dc10f49f-1fb5-425d-b431-1b4587bd283c finished.
19/11/29 01:37:41 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 01:37:41 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_152e6e81-7877-4988-8d79-1fbaae46340b","basePath":"/tmp/sparktestZOrA2j"}: {}
java.io.FileNotFoundException: /tmp/sparktestZOrA2j/job_152e6e81-7877-4988-8d79-1fbaae46340b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139828471441152)>

# Thread: <Thread(Thread-119, started daemon 139828488226560)>

# Thread: <_MainThread(MainThread, started 139829338294016)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574991443.9_c80e0d24-c7cd-4947-9bdc-50653bf49598 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 286.874s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 21s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/pvpz7nvgxzics

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1650

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1650/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/11/29 00:15:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34953.
19/11/29 00:15:08 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:35509
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 00:15:08 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:08 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:08 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:09 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:42109
19/11/29 00:15:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 00:15:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 00:15:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:09 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41497.
19/11/29 00:15:09 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/29 00:15:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44863.
19/11/29 00:15:09 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:33501
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 00:15:09 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:09 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:09 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:09 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:43469
19/11/29 00:15:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 00:15:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 00:15:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:09 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46185.
19/11/29 00:15:09 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42357.
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:45441
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 00:15:10 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:10 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:10 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:41001
19/11/29 00:15:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 00:15:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 00:15:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:10 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36987.
19/11/29 00:15:10 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/29 00:15:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36841.
19/11/29 00:15:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:46021
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 00:15:10 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:10 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/29 00:15:11 INFO sdk_worker_main.main: Logging handler created.
19/11/29 00:15:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:45481
19/11/29 00:15:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/29 00:15:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/29 00:15:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/29 00:15:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574986506.55', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48649', 'job_port': u'0'}
19/11/29 00:15:11 INFO statecache.__init__: Creating state cache with size 0
19/11/29 00:15:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37199.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/29 00:15:11 INFO sdk_worker.__init__: Control channel established.
19/11/29 00:15:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/29 00:15:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35381.
19/11/29 00:15:11 INFO sdk_worker.create_state_handler: State channel established.
19/11/29 00:15:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:36523
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/29 00:15:11 INFO sdk_worker.run: No more requests from control plane
19/11/29 00:15:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/29 00:15:11 INFO data_plane.close: Closing all cached grpc data channels.
19/11/29 00:15:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/29 00:15:11 INFO sdk_worker.run: Done consuming work.
19/11/29 00:15:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/29 00:15:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/29 00:15:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/29 00:15:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574986506.55_ed89aacb-d6dd-465d-a448-52124f6bd9b4 finished.
19/11/29 00:15:11 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/29 00:15:11 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_4c87db9a-2362-43a4-8fc0-3bf93b5d00d7","basePath":"/tmp/sparktestfCu1th"}: {}
java.io.FileNotFoundException: /tmp/sparktestfCu1th/job_4c87db9a-2362-43a4-8fc0-3bf93b5d00d7/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140161978164992)>

# Thread: <Thread(Thread-120, started daemon 140161986557696)>

# Thread: <_MainThread(MainThread, started 140162776860416)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574986497.17_4451bb5c-06f7-4f54-9832-4c2cd5f4e1e5 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
----------------------------------------------------------------------
Ran 38 tests in 299.835s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 35s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 7392d0df-69d3-4bd2-9090-2d798b99440a
Response status code: 502
Response content type: text/html; charset=UTF-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1649

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1649/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/28 18:14:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:37201
19/11/28 18:14:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 18:14:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 18:14:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 18:14:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574964854.35', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40867', 'job_port': u'0'}
19/11/28 18:14:17 INFO statecache.__init__: Creating state cache with size 0
19/11/28 18:14:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46535.
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/28 18:14:17 INFO sdk_worker.__init__: Control channel established.
19/11/28 18:14:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 18:14:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36715.
19/11/28 18:14:17 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 18:14:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:45499
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 18:14:17 INFO sdk_worker.run: No more requests from control plane
19/11/28 18:14:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 18:14:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:17 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 18:14:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 18:14:17 INFO sdk_worker.run: Done consuming work.
19/11/28 18:14:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 18:14:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:17 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 18:14:18 INFO sdk_worker_main.main: Logging handler created.
19/11/28 18:14:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:34085
19/11/28 18:14:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 18:14:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 18:14:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 18:14:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574964854.35', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40867', 'job_port': u'0'}
19/11/28 18:14:18 INFO statecache.__init__: Creating state cache with size 0
19/11/28 18:14:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46433.
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 18:14:18 INFO sdk_worker.__init__: Control channel established.
19/11/28 18:14:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 18:14:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33413.
19/11/28 18:14:18 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 18:14:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:38639
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 18:14:18 INFO sdk_worker.run: No more requests from control plane
19/11/28 18:14:18 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 18:14:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:18 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 18:14:18 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 18:14:18 INFO sdk_worker.run: Done consuming work.
19/11/28 18:14:18 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 18:14:18 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 18:14:19 INFO sdk_worker_main.main: Logging handler created.
19/11/28 18:14:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:33269
19/11/28 18:14:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 18:14:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 18:14:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 18:14:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574964854.35', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40867', 'job_port': u'0'}
19/11/28 18:14:19 INFO statecache.__init__: Creating state cache with size 0
19/11/28 18:14:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46051.
19/11/28 18:14:19 INFO sdk_worker.__init__: Control channel established.
19/11/28 18:14:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 18:14:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38331.
19/11/28 18:14:19 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 18:14:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:38693
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 18:14:19 INFO sdk_worker.run: No more requests from control plane
19/11/28 18:14:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 18:14:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 18:14:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 18:14:19 INFO sdk_worker.run: Done consuming work.
19/11/28 18:14:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 18:14:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 18:14:20 INFO sdk_worker_main.main: Logging handler created.
19/11/28 18:14:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:39253
19/11/28 18:14:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 18:14:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 18:14:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 18:14:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574964854.35', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40867', 'job_port': u'0'}
19/11/28 18:14:20 INFO statecache.__init__: Creating state cache with size 0
19/11/28 18:14:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36793.
19/11/28 18:14:20 INFO sdk_worker.__init__: Control channel established.
19/11/28 18:14:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 18:14:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38779.
19/11/28 18:14:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 18:14:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:36697
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 18:14:20 INFO sdk_worker.run: No more requests from control plane
19/11/28 18:14:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 18:14:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 18:14:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 18:14:20 INFO sdk_worker.run: Done consuming work.
19/11/28 18:14:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 18:14:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 18:14:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 18:14:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574964854.35_3a9e48a6-8429-458a-9fde-a5e77bf61044 finished.
19/11/28 18:14:20 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 18:14:20 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_fa3ed5b1-7c2a-42a7-933d-a31963c1d0c5","basePath":"/tmp/sparktestBx1X1F"}: {}
java.io.FileNotFoundException: /tmp/sparktestBx1X1F/job_fa3ed5b1-7c2a-42a7-933d-a31963c1d0c5/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
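
The stack trace above shows the staging cleanup failing because the job's MANIFEST file was already gone when removeArtifacts tried to read it, so a benign race turns into an ERROR-level log. A minimal, hypothetical Python sketch of a cleanup that treats a missing manifest as "nothing to remove" (names like remove_staging_dir are illustrative only; the real service is the Java code above):

```python
import errno
import os
import shutil


def remove_staging_dir(base_path, session_id):
    """Best-effort cleanup of a job staging directory (illustrative sketch)."""
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, 'MANIFEST')
    try:
        with open(manifest) as f:
            f.read()  # a real service would parse artifact locations here
    except (IOError, OSError) as e:
        # A manifest that is already deleted means there is nothing left to
        # clean up; only propagate unexpected errors.
        if e.errno != errno.ENOENT:
            raise
    # Remove whatever remains of the staging directory, ignoring races.
    shutil.rmtree(job_dir, ignore_errors=True)
```

Under this sketch, a concurrently-deleted manifest results in a silent no-op rather than a FileNotFoundException surfacing in the job service log.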
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
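
The "# Thread: ..." lines scattered through these tracebacks come from the test suite's timeout watchdog (visible in the traceback as portable_runner_test.py, line 75, in handler), which dumps the live threads before aborting a hung test. A minimal, hypothetical sketch of such a watchdog, assuming a POSIX SIGALRM-based setup (names and details are illustrative, not Beam's actual implementation):

```python
import signal
import threading

TIMEOUT_SECS = 60


def handler(signum, frame):
    # Print the banner and every live thread, then raise to abort the test.
    # Because this fires asynchronously, the output can interleave with the
    # unittest traceback being printed at the same time.
    msg = 'Timed out after %d seconds.' % TIMEOUT_SECS
    print('==================== %s ====================' % msg)
    for t in threading.enumerate():
        print('# Thread: %s' % t)
    raise BaseException(msg)


def install_watchdog(timeout_secs=TIMEOUT_SECS):
    # Arm the alarm; the handler runs in the main thread after the deadline.
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(timeout_secs)
```

Raising BaseException (rather than Exception) ensures the timeout escapes ordinary `except Exception` handlers inside the test, which matches the BaseException lines seen in this log.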

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139901823129344)>
# Thread: <Thread(Thread-119, started daemon 139901814736640)>
# Thread: <_MainThread(MainThread, started 139902611261184)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139901797426944)>
# Thread: <Thread(Thread-124, started daemon 139901806081792)>
# Thread: <Thread(Thread-119, started daemon 139901814736640)>
# Thread: <_MainThread(MainThread, started 139902611261184)>
# Thread: <Thread(wait_until_finish_read, started daemon 139901823129344)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574964843.24_bbd1a1e1-5f35-40f2-b3e9-7c55bfefa023 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 353.300s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 46s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://scans.gradle.com/s/5ldhideppofye

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1648

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1648/display/redirect?page=changes>

Changes:

[mxm] [BEAM-8656] Update documentation for flink_master parameter


------------------------------------------
[...truncated 1.32 MB...]
19/11/28 14:35:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:40935
19/11/28 14:35:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 14:35:05 INFO sdk_worker.run: No more requests from control plane
19/11/28 14:35:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 14:35:05 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 14:35:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 14:35:05 INFO sdk_worker.run: Done consuming work.
19/11/28 14:35:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 14:35:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 14:35:05 INFO sdk_worker_main.main: Logging handler created.
19/11/28 14:35:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:32881
19/11/28 14:35:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 14:35:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 14:35:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574951702.43_4f27f123-ce51-4955-af39-7ffa2192a2a0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 14:35:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574951702.43', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50283', 'job_port': u'0'}
19/11/28 14:35:05 INFO statecache.__init__: Creating state cache with size 0
19/11/28 14:35:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33943.
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 14:35:05 INFO sdk_worker.__init__: Control channel established.
19/11/28 14:35:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 14:35:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34067.
19/11/28 14:35:05 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 14:35:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:40431
19/11/28 14:35:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 14:35:06 INFO sdk_worker.run: No more requests from control plane
19/11/28 14:35:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 14:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 14:35:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 14:35:06 INFO sdk_worker.run: Done consuming work.
19/11/28 14:35:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 14:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 14:35:06 INFO sdk_worker_main.main: Logging handler created.
19/11/28 14:35:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:40447
19/11/28 14:35:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 14:35:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 14:35:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574951702.43_4f27f123-ce51-4955-af39-7ffa2192a2a0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 14:35:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574951702.43', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50283', 'job_port': u'0'}
19/11/28 14:35:06 INFO statecache.__init__: Creating state cache with size 0
19/11/28 14:35:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34197.
19/11/28 14:35:06 INFO sdk_worker.__init__: Control channel established.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 14:35:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 14:35:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32805.
19/11/28 14:35:06 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 14:35:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:38987
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 14:35:06 INFO sdk_worker.run: No more requests from control plane
19/11/28 14:35:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 14:35:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 14:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 14:35:06 INFO sdk_worker.run: Done consuming work.
19/11/28 14:35:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 14:35:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 14:35:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 14:35:07 INFO sdk_worker_main.main: Logging handler created.
19/11/28 14:35:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:35563
19/11/28 14:35:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 14:35:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 14:35:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574951702.43_4f27f123-ce51-4955-af39-7ffa2192a2a0', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 14:35:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574951702.43', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:50283', 'job_port': u'0'}
19/11/28 14:35:07 INFO statecache.__init__: Creating state cache with size 0
19/11/28 14:35:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:32911.
19/11/28 14:35:07 INFO sdk_worker.__init__: Control channel established.
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 14:35:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 14:35:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45251.
19/11/28 14:35:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 14:35:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:38299
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 14:35:07 INFO sdk_worker.run: No more requests from control plane
19/11/28 14:35:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 14:35:07 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 14:35:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 14:35:07 INFO sdk_worker.run: Done consuming work.
19/11/28 14:35:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 14:35:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 14:35:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 14:35:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574951702.43_4f27f123-ce51-4955-af39-7ffa2192a2a0 finished.
19/11/28 14:35:07 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 14:35:07 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_0e5c2d4e-9b09-4a38-8456-5171e1826f76","basePath":"/tmp/sparktestsViQPm"}: {}
java.io.FileNotFoundException: /tmp/sparktestsViQPm/job_0e5c2d4e-9b09-4a38-8456-5171e1826f76/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139822749406976)>

# Thread: <Thread(Thread-119, started daemon 139822766192384)>
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <_MainThread(MainThread, started 139823545640704)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139822123448064)>

# Thread: <Thread(Thread-124, started daemon 139822740227840)>

# Thread: <_MainThread(MainThread, started 139823545640704)>
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574951693.11_b93923b1-4ff8-4d14-a784-f26d4d84b26b failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 298.706s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 28s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: ae6c3e66-1515-47d0-910c-e587ce3c8145
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1647

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1647/display/redirect?page=changes>

Changes:

[piotr.szczepanik] [BEAM-8819] Fix AvroCoder serialisation by introduction of

[piotr.szczepanik] Added missing license header for AvroGenericCoder

[piotr.szczepanik] Fixed code style violations

[piotr.szczepanik] Fixed missing AvroCoder -> AvroGenericCoder in python tests

[coheigea] A fix for some TLS issues in the MongoDB IO


------------------------------------------
[...truncated 1.42 MB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <Thread(wait_until_finish_read, started daemon 140401583310592)>

# Thread: <Thread(Thread-135, started daemon 140401591703296)>

# Thread: <Thread(wait_until_finish_read, started daemon 140401600096000)>

# Thread: <_MainThread(MainThread, started 140403418679040)>

# Thread: <Thread(Thread-125, started daemon 140402103396096)>

# Thread: <Thread(wait_until_finish_read, started daemon 140402095003392)>

# Thread: <Thread(Thread-130, started daemon 140401608488704)>
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_unfusable_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 244, in test_pardo_unfusable_side_inputs
    equal_to([('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_windowed_side_inputs (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 181, in test_pardo_windowed_side_inputs
    label='windowed')
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_read (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 578, in test_read
    equal_to(['a', 'b', 'c']))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_reshuffle (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 548, in test_reshuffle
    equal_to([1, 2, 3]))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_check_done_failed (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 470, in test_sdf_with_check_done_failed
    | beam.ParDo(ExpandingStringsDoFn()))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574949668.15_6f5b9749-c444-42b9-969c-1df39118c542 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 755.924s

FAILED (errors=8, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m 17s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 8530ed8e-ca5d-465b-8c69-b28eb443a8ac
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1646

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1646/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]

19/11/28 12:14:00 INFO sdk_worker.run: No more requests from control plane
19/11/28 12:14:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 12:14:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 12:14:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 12:14:00 INFO sdk_worker.run: Done consuming work.
19/11/28 12:14:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 12:14:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 12:14:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 12:14:01 INFO sdk_worker_main.main: Logging handler created.
19/11/28 12:14:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:38815
19/11/28 12:14:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 12:14:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 12:14:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574943238.09_4de6c956-10c8-4539-852f-15cf4f277972', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 12:14:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574943238.09', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38275', 'job_port': u'0'}
19/11/28 12:14:01 INFO statecache.__init__: Creating state cache with size 0
19/11/28 12:14:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45223.
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 12:14:01 INFO sdk_worker.__init__: Control channel established.
19/11/28 12:14:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 12:14:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44935.
19/11/28 12:14:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 12:14:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:37205
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 12:14:01 INFO sdk_worker.run: No more requests from control plane
19/11/28 12:14:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 12:14:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 12:14:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 12:14:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:01 INFO sdk_worker.run: Done consuming work.
19/11/28 12:14:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 12:14:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 12:14:02 INFO sdk_worker_main.main: Logging handler created.
19/11/28 12:14:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:41311
19/11/28 12:14:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 12:14:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 12:14:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574943238.09_4de6c956-10c8-4539-852f-15cf4f277972', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 12:14:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574943238.09', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38275', 'job_port': u'0'}
19/11/28 12:14:02 INFO statecache.__init__: Creating state cache with size 0
19/11/28 12:14:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39427.
19/11/28 12:14:02 INFO sdk_worker.__init__: Control channel established.
19/11/28 12:14:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 12:14:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43007.
19/11/28 12:14:02 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 12:14:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:36617
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 12:14:02 INFO sdk_worker.run: No more requests from control plane
19/11/28 12:14:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:02 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 12:14:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 12:14:02 INFO sdk_worker.run: Done consuming work.
19/11/28 12:14:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 12:14:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 12:14:03 INFO sdk_worker_main.main: Logging handler created.
19/11/28 12:14:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:42407
19/11/28 12:14:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 12:14:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 12:14:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574943238.09_4de6c956-10c8-4539-852f-15cf4f277972', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 12:14:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574943238.09', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:38275', 'job_port': u'0'}
19/11/28 12:14:03 INFO statecache.__init__: Creating state cache with size 0
19/11/28 12:14:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40333.
19/11/28 12:14:03 INFO sdk_worker.__init__: Control channel established.
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 12:14:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 12:14:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40295.
19/11/28 12:14:03 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 12:14:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:36579
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 12:14:03 INFO sdk_worker.run: No more requests from control plane
19/11/28 12:14:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 12:14:03 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 12:14:03 INFO sdk_worker.run: Done consuming work.
19/11/28 12:14:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 12:14:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 12:14:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 12:14:03 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574943238.09_4de6c956-10c8-4539-852f-15cf4f277972 finished.
19/11/28 12:14:03 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 12:14:03 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_597fd3fa-4b40-42fc-9364-a8d9e1cf63da","basePath":"/tmp/sparktestK4RKHK"}: {}
java.io.FileNotFoundException: /tmp/sparktestK4RKHK/job_597fd3fa-4b40-42fc-9364-a8d9e1cf63da/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139931569612544)>

# Thread: <Thread(Thread-120, started daemon 139931578005248)>

# Thread: <_MainThread(MainThread, started 139932706768640)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139931561219840)>

# Thread: <Thread(Thread-126, started daemon 139931552827136)>

# Thread: <Thread(Thread-120, started daemon 139931578005248)>

# Thread: <_MainThread(MainThread, started 139932706768640)>

# Thread: <Thread(wait_until_finish_read, started daemon 139931569612544)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574943229.32_d4095ec5-e217-4456-a9a8-56bd8ebd5264 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 310.523s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 162fa53d-c55c-4802-a633-0c993d02e40a
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1645

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1645/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]

19/11/28 06:14:54 INFO sdk_worker.run: No more requests from control plane
19/11/28 06:14:54 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 06:14:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:54 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 06:14:54 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 06:14:54 INFO sdk_worker.run: Done consuming work.
19/11/28 06:14:54 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 06:14:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 06:14:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 06:14:55 INFO sdk_worker_main.main: Logging handler created.
19/11/28 06:14:55 INFO sdk_worker_main.start: Status HTTP server running at localhost:44249
19/11/28 06:14:55 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 06:14:55 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 06:14:55 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574921691.59_f4a4491f-78ad-4d01-afee-d796008ad3fb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 06:14:55 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574921691.59', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48831', 'job_port': u'0'}
19/11/28 06:14:55 INFO statecache.__init__: Creating state cache with size 0
19/11/28 06:14:55 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44797.
19/11/28 06:14:55 INFO sdk_worker.__init__: Control channel established.
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 06:14:55 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 06:14:55 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35441.
19/11/28 06:14:55 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 06:14:55 INFO data_plane.create_data_channel: Creating client data channel for localhost:42235
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 06:14:55 INFO sdk_worker.run: No more requests from control plane
19/11/28 06:14:55 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 06:14:55 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 06:14:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:55 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 06:14:55 INFO sdk_worker.run: Done consuming work.
19/11/28 06:14:55 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 06:14:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 06:14:56 INFO sdk_worker_main.main: Logging handler created.
19/11/28 06:14:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:45305
19/11/28 06:14:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 06:14:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 06:14:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574921691.59_f4a4491f-78ad-4d01-afee-d796008ad3fb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 06:14:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574921691.59', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48831', 'job_port': u'0'}
19/11/28 06:14:56 INFO statecache.__init__: Creating state cache with size 0
19/11/28 06:14:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44129.
19/11/28 06:14:56 INFO sdk_worker.__init__: Control channel established.
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 06:14:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 06:14:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40233.
19/11/28 06:14:56 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 06:14:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:42161
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 06:14:56 INFO sdk_worker.run: No more requests from control plane
19/11/28 06:14:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 06:14:56 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 06:14:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 06:14:56 INFO sdk_worker.run: Done consuming work.
19/11/28 06:14:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 06:14:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 06:14:57 INFO sdk_worker_main.main: Logging handler created.
19/11/28 06:14:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:39735
19/11/28 06:14:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 06:14:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 06:14:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574921691.59_f4a4491f-78ad-4d01-afee-d796008ad3fb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 06:14:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574921691.59', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48831', 'job_port': u'0'}
19/11/28 06:14:57 INFO statecache.__init__: Creating state cache with size 0
19/11/28 06:14:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33991.
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 06:14:57 INFO sdk_worker.__init__: Control channel established.
19/11/28 06:14:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 06:14:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46677.
19/11/28 06:14:57 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 06:14:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:42811
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 06:14:57 INFO sdk_worker.run: No more requests from control plane
19/11/28 06:14:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 06:14:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:57 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 06:14:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 06:14:57 INFO sdk_worker.run: Done consuming work.
19/11/28 06:14:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 06:14:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 06:14:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 06:14:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574921691.59_f4a4491f-78ad-4d01-afee-d796008ad3fb finished.
19/11/28 06:14:57 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 06:14:57 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f3c16b7c-6678-4432-bb12-824a5508fcbc","basePath":"/tmp/sparktesttErq5x"}: {}
java.io.FileNotFoundException: /tmp/sparktesttErq5x/job_f3c16b7c-6678-4432-bb12-824a5508fcbc/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139773872174848)>
# Thread: <Thread(Thread-118, started daemon 139773863782144)>
# Thread: <_MainThread(MainThread, started 139774994171648)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139773838604032)>
# Thread: <Thread(Thread-124, started daemon 139773846996736)>
# Thread: <Thread(Thread-118, started daemon 139773863782144)>
# Thread: <_MainThread(MainThread, started 139774994171648)>
# Thread: <Thread(wait_until_finish_read, started daemon 139773872174848)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574921679.37_8c7502a3-82b7-41d1-9f9f-32555c68f686 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
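The UnsupportedOperationException above is the Spark portable runner rejecting an SDF checkpoint: the splittable DoFn under test defers a residual restriction, but the runner never registered a handler to receive it. An illustrative Python model of that guard (hypothetical class, not Beam's actual Java `ActiveBundle` API):

```python
class ActiveBundleSketch:
    """Illustrative model of the failure above: a bundle that receives
    an SDF checkpoint residual must have a registered handler for it,
    and a runner that never registered one has to reject the residual
    rather than silently drop it (which would lose data)."""

    def __init__(self, checkpoint_handler=None):
        # A callable invoked with each deferred residual, or None if
        # the runner does not support SDF checkpointing.
        self._checkpoint_handler = checkpoint_handler

    def on_checkpoint(self, residual):
        if self._checkpoint_handler is None:
            raise NotImplementedError(
                "The ActiveBundle does not have a registered bundle "
                "checkpoint handler.")
        self._checkpoint_handler(residual)
```

Under this model, raising is the correct behavior for an unsupported feature: tests exercising watermark tracking with checkpointing fail fast instead of producing incomplete output.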

----------------------------------------------------------------------
Ran 38 tests in 364.459s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 15s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: 41b4815c-91bd-4209-8c52-ac5b176e3907
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1644

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1644/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-8842] Temporarily disable test


------------------------------------------
[...truncated 1.31 MB...]
19/11/28 01:21:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43607.
19/11/28 01:21:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:45483
19/11/28 01:21:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:01 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:01 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 01:21:02 INFO sdk_worker_main.main: Logging handler created.
19/11/28 01:21:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:37387
19/11/28 01:21:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 01:21:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 01:21:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 01:21:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574904059.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56129', 'job_port': u'0'}
19/11/28 01:21:02 INFO statecache.__init__: Creating state cache with size 0
19/11/28 01:21:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38797.
19/11/28 01:21:02 INFO sdk_worker.__init__: Control channel established.
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/28 01:21:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 01:21:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42703.
19/11/28 01:21:02 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:38457
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:02 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:02 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:02 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:02 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 01:21:03 INFO sdk_worker_main.main: Logging handler created.
19/11/28 01:21:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:40983
19/11/28 01:21:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 01:21:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 01:21:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 01:21:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574904059.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56129', 'job_port': u'0'}
19/11/28 01:21:03 INFO statecache.__init__: Creating state cache with size 0
19/11/28 01:21:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43823.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 01:21:03 INFO sdk_worker.__init__: Control channel established.
19/11/28 01:21:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 01:21:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39975.
19/11/28 01:21:03 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:37249
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:03 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:03 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:03 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:03 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:03 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:03 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:03 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 01:21:03 INFO sdk_worker_main.main: Logging handler created.
19/11/28 01:21:03 INFO sdk_worker_main.start: Status HTTP server running at localhost:35237
19/11/28 01:21:03 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 01:21:03 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 01:21:03 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 01:21:03 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574904059.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56129', 'job_port': u'0'}
19/11/28 01:21:03 INFO statecache.__init__: Creating state cache with size 0
19/11/28 01:21:03 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33061.
19/11/28 01:21:03 INFO sdk_worker.__init__: Control channel established.
19/11/28 01:21:03 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 01:21:03 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33051.
19/11/28 01:21:03 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:03 INFO data_plane.create_data_channel: Creating client data channel for localhost:43049
19/11/28 01:21:03 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:04 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:04 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:04 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 01:21:04 INFO sdk_worker_main.main: Logging handler created.
19/11/28 01:21:04 INFO sdk_worker_main.start: Status HTTP server running at localhost:39347
19/11/28 01:21:04 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 01:21:04 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 01:21:04 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 01:21:04 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574904059.7', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56129', 'job_port': u'0'}
19/11/28 01:21:04 INFO statecache.__init__: Creating state cache with size 0
19/11/28 01:21:04 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33883.
19/11/28 01:21:04 INFO sdk_worker.__init__: Control channel established.
19/11/28 01:21:04 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 01:21:04 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39815.
19/11/28 01:21:04 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 01:21:04 INFO data_plane.create_data_channel: Creating client data channel for localhost:36559
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/28 01:21:04 INFO sdk_worker.run: No more requests from control plane
19/11/28 01:21:04 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 01:21:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:04 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 01:21:04 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 01:21:04 INFO sdk_worker.run: Done consuming work.
19/11/28 01:21:04 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 01:21:04 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 01:21:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 01:21:04 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574904059.7_5141349a-a45a-45e4-bdc3-292aa594092f finished.
19/11/28 01:21:04 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 01:21:04 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_557239a3-e385-4e7b-95bc-5d6512571339","basePath":"/tmp/sparktestHXa7q3"}: {}
java.io.FileNotFoundException: /tmp/sparktestHXa7q3/job_557239a3-e385-4e7b-95bc-5d6512571339/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139844466358016)>
# Thread: <Thread(Thread-118, started daemon 139844734617344)>
# Thread: <_MainThread(MainThread, started 139845253138176)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574904050.75_50427494-9ba0-4cd8-957b-e84174595caf failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 291.159s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
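For console output this size, it helps to reduce the log to just the outcome lines before reading further. A small filtering sketch (the embedded sample is an excerpt from this log; in practice you would read the saved console file):

```python
import re

# Excerpt of the console output above, embedded so the sketch is
# self-contained.
CONSOLE = """\
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
Ran 38 tests in 291.159s
FAILED (errors=2, skipped=9)
> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED
"""

# Keep only test outcomes and Gradle task failures.
SUMMARY = re.compile(r"^(ERROR:|FAILED|Ran \d+ tests|> Task .* FAILED$)")

def summarize(text):
    return [line for line in text.splitlines() if SUMMARY.match(line)]

for line in summarize(CONSOLE):
    print(line)
```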

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 38s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...

Publishing failed.

The response from https://scans-in.gradle.com/in/5.2.1/2.3 was not from the build scan server.
Your network environment may be interfering, or the service may be unavailable.

If you believe this to be in error, please report this problem via https://gradle.com/scans/help/plugin and include the following via copy/paste:

----------
Gradle version: 5.2.1
Plugin version: 2.3
Request URL: https://scans-in.gradle.com/in/5.2.1/2.3
Request ID: e9b32a28-5c48-4b49-80c8-21ba9b0c5eda
Response status code: 503
Response content type: text/html; charset=utf-8
Response server type: cloudflare
----------

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1643

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1643/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/28 00:24:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:37353
19/11/28 00:24:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 00:24:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 00:24:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 00:24:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574900679.75', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49575', 'job_port': u'0'}
19/11/28 00:24:42 INFO statecache.__init__: Creating state cache with size 0
19/11/28 00:24:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38163.
19/11/28 00:24:42 INFO sdk_worker.__init__: Control channel established.
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/28 00:24:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 00:24:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38419.
19/11/28 00:24:42 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 00:24:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:39403
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 00:24:42 INFO sdk_worker.run: No more requests from control plane
19/11/28 00:24:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 00:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 00:24:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 00:24:42 INFO sdk_worker.run: Done consuming work.
19/11/28 00:24:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 00:24:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 00:24:43 INFO sdk_worker_main.main: Logging handler created.
19/11/28 00:24:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:35081
19/11/28 00:24:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 00:24:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 00:24:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 00:24:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574900679.75', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49575', 'job_port': u'0'}
19/11/28 00:24:43 INFO statecache.__init__: Creating state cache with size 0
19/11/28 00:24:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35029.
19/11/28 00:24:43 INFO sdk_worker.__init__: Control channel established.
19/11/28 00:24:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/28 00:24:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41163.
19/11/28 00:24:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 00:24:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:36973
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 00:24:43 INFO sdk_worker.run: No more requests from control plane
19/11/28 00:24:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 00:24:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 00:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 00:24:43 INFO sdk_worker.run: Done consuming work.
19/11/28 00:24:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 00:24:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 00:24:44 INFO sdk_worker_main.main: Logging handler created.
19/11/28 00:24:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:38333
19/11/28 00:24:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 00:24:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 00:24:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 00:24:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574900679.75', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49575', 'job_port': u'0'}
19/11/28 00:24:44 INFO statecache.__init__: Creating state cache with size 0
19/11/28 00:24:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40641.
19/11/28 00:24:44 INFO sdk_worker.__init__: Control channel established.
19/11/28 00:24:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/28 00:24:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38087.
19/11/28 00:24:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 00:24:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:38195
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 00:24:44 INFO sdk_worker.run: No more requests from control plane
19/11/28 00:24:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 00:24:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 00:24:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 00:24:44 INFO sdk_worker.run: Done consuming work.
19/11/28 00:24:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 00:24:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/28 00:24:45 INFO sdk_worker_main.main: Logging handler created.
19/11/28 00:24:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:36119
19/11/28 00:24:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/28 00:24:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/28 00:24:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/28 00:24:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574900679.75', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49575', 'job_port': u'0'}
19/11/28 00:24:45 INFO statecache.__init__: Creating state cache with size 0
19/11/28 00:24:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44975.
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/28 00:24:45 INFO sdk_worker.__init__: Control channel established.
19/11/28 00:24:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/28 00:24:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35463.
19/11/28 00:24:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/28 00:24:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:42265
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/28 00:24:45 INFO sdk_worker.run: No more requests from control plane
19/11/28 00:24:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/28 00:24:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/28 00:24:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/28 00:24:45 INFO sdk_worker.run: Done consuming work.
19/11/28 00:24:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/28 00:24:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/28 00:24:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/28 00:24:45 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574900679.75_82e20890-2ede-486f-9881-8e2e1c1f158a finished.
19/11/28 00:24:45 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/28 00:24:45 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_ea82671f-5d7b-4086-a7e2-d8e91a1bbd60","basePath":"/tmp/sparktesthlH2f7"}: {}
java.io.FileNotFoundException: /tmp/sparktesthlH2f7/job_ea82671f-5d7b-4086-a7e2-d8e91a1bbd60/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
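The ERROR above is only a cleanup artifact: `removeArtifacts` tries to read a MANIFEST that was never written because no artifacts were staged (note the earlier `GetManifest for __no_artifacts_staged__` lines). A best-effort cleanup would treat the missing manifest as an already-clean state; a sketch in Python (hypothetical helper — Beam's actual service is the Java code in the stack trace):

```python
import os
import shutil

def remove_staging_dir(staging_dir):
    """Best-effort staging cleanup: a missing MANIFEST means nothing
    was staged, so report success instead of logging an ERROR."""
    manifest = os.path.join(staging_dir, "MANIFEST")
    if not os.path.exists(manifest):
        return "nothing-staged"  # already clean, no exception raised
    shutil.rmtree(staging_dir)
    return "removed"
```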
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
==================== Timed out after 60 seconds. ====================
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()

# Thread: <Thread(wait_until_finish_read, started daemon 139722815670016)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
# Thread: <Thread(Thread-117, started daemon 139722824062720)>

  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
# Thread: <_MainThread(MainThread, started 139723603801856)>
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139722315781888)>

    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
# Thread: <Thread(Thread-123, started daemon 139722324174592)>

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(Thread-117, started daemon 139722824062720)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <Thread(wait_until_finish_read, started daemon 139722815670016)>

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
# Thread: <_MainThread(MainThread, started 139723603801856)>
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574900670.01_e0d5e542-4f01-4a61-ad1f-9b3db37e1a04 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 309.085s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/lh4xlz3qqv6wi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1642

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1642/display/redirect?page=changes>

Changes:

[github] [BEAM-8840][BEAM-3713] Remove setup_requires, tests_require from


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 21:36:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa on Spark master local
19/11/27 21:36:36 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
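The coder warning above refers to the "consistent with equals" property: equal values must always encode to identical bytes, because a runner that groups by encoded key (as the Spark runner does in GroupNonMergingWindowsFunctions) compares the bytes rather than the deserialized values. A toy illustration of the property (hypothetical coder, not one of Beam's):

```python
class FixedWidthIntCoder:
    """Equal ints always encode to the same 8 bytes, so this coder is
    consistent with equals and safe to compare in encoded form."""
    def encode(self, value):
        return value.to_bytes(8, "big", signed=True)

coder = FixedWidthIntCoder()
a, b = 42, 42
# Grouping by encoded bytes is then equivalent to grouping by value.
assert (a == b) == (coder.encode(a) == coder.encode(b))
```

A coder without this property (e.g. one with multiple valid encodings of the same value) can scatter one logical key across several groups, which is the risk the warning flags.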
19/11/27 21:36:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa: Pipeline translated successfully. Computing outputs
19/11/27 21:36:36 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:37 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:43379
19/11/27 21:36:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:37 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35007.
19/11/27 21:36:37 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 21:36:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37341.
19/11/27 21:36:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:34115
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 21:36:37 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:37 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:37 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:37 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:37 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:37 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:37 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:38 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:34201
19/11/27 21:36:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:38 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40217.
19/11/27 21:36:38 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 21:36:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43483.
19/11/27 21:36:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:35257
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 21:36:38 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:38 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:39 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:38545
19/11/27 21:36:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:39 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35047.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 21:36:39 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44247.
19/11/27 21:36:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:46685
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 21:36:39 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:39 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:39 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:34973
19/11/27 21:36:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:39 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35087.
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 21:36:39 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41475.
19/11/27 21:36:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:45697
19/11/27 21:36:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 21:36:40 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:40 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:36:40 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:36:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:40441
19/11/27 21:36:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:36:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:36:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:36:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574890595.83', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:40005', 'job_port': u'0'}
19/11/27 21:36:40 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:36:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39257.
19/11/27 21:36:40 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 21:36:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:36:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38643.
19/11/27 21:36:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:36:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:37091
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 21:36:40 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:36:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:36:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:36:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:36:40 INFO sdk_worker.run: Done consuming work.
19/11/27 21:36:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:36:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:36:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:36:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574890595.83_2d1715c3-2926-46b5-99e6-6f5dd5359faa finished.
19/11/27 21:36:40 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 21:36:40 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f72d2e2f-43c8-446a-938e-8a7c709cbac4","basePath":"/tmp/sparktestHxgVKm"}: {}
java.io.FileNotFoundException: /tmp/sparktestHxgVKm/job_f72d2e2f-43c8-446a-938e-8a7c709cbac4/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
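The cleanup failure above is cosmetic: the job staged no artifacts (note the repeated "GetManifest for __no_artifacts_staged__" entries), so no MANIFEST file was ever written, and the removal step trips over its absence. A minimal Python sketch of a cleanup that treats a missing manifest as already-cleaned; the function name and layout here are illustrative, not Beam's actual BeamFileSystemArtifactStagingService:

```python
import os
import shutil


def remove_staging_dir(base_path, session_id):
    """Remove a job's staging directory, treating 'already gone' as success.

    If nothing was staged (as with __no_artifacts_staged__ jobs), the
    MANIFEST and possibly the whole directory never existed; that should
    not surface as an ERROR during teardown.
    """
    staging_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(staging_dir, 'MANIFEST')
    if not os.path.exists(manifest):
        # Nothing was staged: remove the directory if it exists and return
        # quietly instead of raising FileNotFoundError.
        shutil.rmtree(staging_dir, ignore_errors=True)
        return
    # Manifest present: artifacts were staged, remove the whole tree.
    shutil.rmtree(staging_dir)
```

With this shape, a job that staged nothing cleans up silently, while a job with real artifacts still gets a hard failure if its tree cannot be removed.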
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 139941151139584)>
# Thread: <Thread(Thread-117, started daemon 139941142746880)>
# Thread: <_MainThread(MainThread, started 139941930587904)>
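The BaseException in the traceback above comes from the test suite's own watchdog: portable_runner_test.py installs a SIGALRM handler that raises after 60 seconds, which is also what dumps the live-thread list. A minimal sketch of that watchdog pattern (the function names are illustrative, not Beam's actual helper):

```python
import signal


def install_watchdog(timeout_secs):
    """Arrange for BaseException to be raised after timeout_secs.

    BaseException (not Exception) is deliberate: a plain `except Exception`
    in the code under test cannot swallow the timeout.
    """
    def handler(signum, frame):
        raise BaseException('Timed out after %s seconds.' % timeout_secs)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(timeout_secs)


def cancel_watchdog():
    # Disarm the pending alarm once the test finishes in time.
    signal.alarm(0)
```

On Unix, the alarm interrupts even a blocking call such as the grpc stream iteration in the traceback, and the raised BaseException unwinds from whatever frame happened to be executing.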
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574890586.85_359eff10-6d2c-4b18-8ecf-9716841a3781 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 281.439s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 17s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/iflcvv5vfb2go

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1641

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1641/display/redirect?page=changes>

Changes:

[amyrvold] [BEAM-8832] Allow GCS staging upload chunk size to be increased >1M when


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 21:08:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c on Spark master local
19/11/27 21:08:19 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/27 21:08:19 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c: Pipeline translated successfully. Computing outputs
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:19 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:19 INFO sdk_worker_main.start: Status HTTP server running at localhost:39307
19/11/27 21:08:19 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:19 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:19 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:19 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:19 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:19 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39955.
19/11/27 21:08:19 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 21:08:19 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:19 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34849.
19/11/27 21:08:19 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:19 INFO data_plane.create_data_channel: Creating client data channel for localhost:35721
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 21:08:19 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:19 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:20 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:35961
19/11/27 21:08:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:20 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34315.
19/11/27 21:08:20 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 21:08:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35491.
19/11/27 21:08:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:43137
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 21:08:20 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:20 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:21 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:39041
19/11/27 21:08:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:21 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37555.
19/11/27 21:08:21 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 21:08:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35615.
19/11/27 21:08:21 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:33027
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 21:08:21 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:21 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:21 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:22 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:33153
19/11/27 21:08:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:22 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36967.
19/11/27 21:08:22 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 21:08:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37871.
19/11/27 21:08:22 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:35453
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 21:08:22 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:22 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:22 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:22 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:22 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:22 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 21:08:23 INFO sdk_worker_main.main: Logging handler created.
19/11/27 21:08:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:35099
19/11/27 21:08:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 21:08:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 21:08:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 21:08:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574888898.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:41381', 'job_port': u'0'}
19/11/27 21:08:23 INFO statecache.__init__: Creating state cache with size 0
19/11/27 21:08:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45727.
19/11/27 21:08:23 INFO sdk_worker.__init__: Control channel established.
19/11/27 21:08:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 21:08:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42735.
19/11/27 21:08:23 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 21:08:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:41493
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 21:08:23 INFO sdk_worker.run: No more requests from control plane
19/11/27 21:08:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 21:08:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:23 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 21:08:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 21:08:23 INFO sdk_worker.run: Done consuming work.
19/11/27 21:08:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 21:08:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 21:08:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 21:08:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574888898.16_2f6b5dad-d8be-49ac-be2d-e7cac187169c finished.
19/11/27 21:08:23 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 21:08:23 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f5268c2a-b8ed-4a33-baf4-dd82831f332c","basePath":"/tmp/sparktestgJa7j9"}: {}
java.io.FileNotFoundException: /tmp/sparktestgJa7j9/job_f5268c2a-b8ed-4a33-baf4-dd82831f332c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140558235772672)>

# Thread: <Thread(Thread-119, started daemon 140558218987264)>

# Thread: <_MainThread(MainThread, started 140559022552832)>
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574888889.07_ca793a48-37fa-49b6-8753-55eb43334f17 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 290.348s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 29s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/5puhpxlat2lty

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1640

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1640/display/redirect?page=changes>

Changes:

[github] Bump python precommit timeout to 3hrs


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 19:21:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3 on Spark master local
19/11/27 19:21:44 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/27 19:21:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3: Pipeline translated successfully. Computing outputs
19/11/27 19:21:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:45 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:33585
19/11/27 19:21:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:45 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42671.
19/11/27 19:21:45 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 19:21:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43543.
19/11/27 19:21:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:40521
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 19:21:45 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:45 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:45 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:43567
19/11/27 19:21:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:45 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46057.
19/11/27 19:21:45 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 19:21:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46867.
19/11/27 19:21:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:46627
19/11/27 19:21:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 19:21:46 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:46 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:46 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:41987
19/11/27 19:21:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:46 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44257.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 19:21:46 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38389.
19/11/27 19:21:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:43397
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 19:21:46 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:46 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:47 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:41757
19/11/27 19:21:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:47 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33851.
19/11/27 19:21:47 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 19:21:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33091.
19/11/27 19:21:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:42197
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 19:21:47 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:47 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 19:21:48 INFO sdk_worker_main.main: Logging handler created.
19/11/27 19:21:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:43493
19/11/27 19:21:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 19:21:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 19:21:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 19:21:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574882503.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43037', 'job_port': u'0'}
19/11/27 19:21:48 INFO statecache.__init__: Creating state cache with size 0
19/11/27 19:21:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38923.
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 19:21:48 INFO sdk_worker.__init__: Control channel established.
19/11/27 19:21:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 19:21:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46733.
19/11/27 19:21:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 19:21:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:39941
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 19:21:48 INFO sdk_worker.run: No more requests from control plane
19/11/27 19:21:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 19:21:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 19:21:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 19:21:48 INFO sdk_worker.run: Done consuming work.
19/11/27 19:21:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 19:21:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 19:21:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 19:21:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574882503.37_891a5744-359f-4084-a700-2f616dfcfcd3 finished.
19/11/27 19:21:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 19:21:48 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_751fd32a-68ba-4208-a568-0e1fd3946593","basePath":"/tmp/sparktestuhB3Us"}: {}
java.io.FileNotFoundException: /tmp/sparktestuhB3Us/job_751fd32a-68ba-4208-a568-0e1fd3946593/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
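Side note on the ERROR above: the InMemoryJobService cleanup failure is noise rather than a test failure — the job staged no artifacts, so the MANIFEST it tries to delete never existed. A minimal sketch of idempotent cleanup that would silence it (Python for brevity; the actual service is Java, and `remove_staging_dir` is a hypothetical helper, not a Beam API):

```python
import errno
import shutil

def remove_staging_dir(path):
    # Hypothetical helper: treat a missing staging directory / MANIFEST as
    # "already cleaned up" instead of surfacing a FileNotFoundException,
    # since jobs that stage no artifacts never create one.
    try:
        shutil.rmtree(path)
    except OSError as e:
        if e.errno != errno.ENOENT:  # re-raise anything but "not found"
            raise
```

Calling it on a directory that was never created, or calling it twice, is then a no-op.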
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
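The timeout and the `# Thread:` dumps seen in these traces come from a watchdog handler in portable_runner_test.py. A rough sketch of that pattern (an assumed reconstruction, not the exact Beam code):

```python
import signal
import threading

def install_timeout(seconds):
    """Install a SIGALRM watchdog that dumps live threads and aborts.

    Raising BaseException (not Exception) ensures broad `except Exception`
    blocks in the code under test cannot swallow the timeout.
    """
    def handler(signum, frame):
        for t in threading.enumerate():
            print('# Thread: %r' % t)  # mirrors the "# Thread:" lines above
        raise BaseException('Timed out after %d seconds.' % seconds)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)
```

Because the handler runs on the main thread, the resulting traceback shows wherever the main thread happened to be blocked (here, inside grpc's `_common.wait`).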

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140140168128256)>

# Thread: <Thread(Thread-117, started daemon 140139808417536)>

# Thread: <_MainThread(MainThread, started 140140950038272)>
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574882494.21_cc9e8cc7-0fb4-4f1e-a371-7620feae9e59 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 288.292s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 31s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/5ftv7xlogh7fe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1639

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1639/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/27 18:17:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:37169
19/11/27 18:17:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 18:17:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 18:17:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 18:17:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574878641.53', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47043', 'job_port': u'0'}
19/11/27 18:17:24 INFO statecache.__init__: Creating state cache with size 0
19/11/27 18:17:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45027.
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 18:17:24 INFO sdk_worker.__init__: Control channel established.
19/11/27 18:17:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 18:17:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36665.
19/11/27 18:17:24 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 18:17:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:35847
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 18:17:24 INFO sdk_worker.run: No more requests from control plane
19/11/27 18:17:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 18:17:24 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 18:17:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 18:17:24 INFO sdk_worker.run: Done consuming work.
19/11/27 18:17:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 18:17:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 18:17:25 INFO sdk_worker_main.main: Logging handler created.
19/11/27 18:17:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:44165
19/11/27 18:17:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 18:17:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 18:17:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 18:17:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574878641.53', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47043', 'job_port': u'0'}
19/11/27 18:17:25 INFO statecache.__init__: Creating state cache with size 0
19/11/27 18:17:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39813.
19/11/27 18:17:25 INFO sdk_worker.__init__: Control channel established.
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 18:17:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 18:17:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38279.
19/11/27 18:17:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 18:17:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:43777
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 18:17:25 INFO sdk_worker.run: No more requests from control plane
19/11/27 18:17:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 18:17:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 18:17:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 18:17:25 INFO sdk_worker.run: Done consuming work.
19/11/27 18:17:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 18:17:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 18:17:26 INFO sdk_worker_main.main: Logging handler created.
19/11/27 18:17:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:43089
19/11/27 18:17:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 18:17:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 18:17:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 18:17:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574878641.53', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47043', 'job_port': u'0'}
19/11/27 18:17:26 INFO statecache.__init__: Creating state cache with size 0
19/11/27 18:17:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39643.
19/11/27 18:17:26 INFO sdk_worker.__init__: Control channel established.
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 18:17:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 18:17:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39875.
19/11/27 18:17:26 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 18:17:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:38185
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 18:17:26 INFO sdk_worker.run: No more requests from control plane
19/11/27 18:17:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 18:17:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 18:17:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 18:17:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:26 INFO sdk_worker.run: Done consuming work.
19/11/27 18:17:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 18:17:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 18:17:27 INFO sdk_worker_main.main: Logging handler created.
19/11/27 18:17:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:46209
19/11/27 18:17:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 18:17:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 18:17:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 18:17:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574878641.53', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:47043', 'job_port': u'0'}
19/11/27 18:17:27 INFO statecache.__init__: Creating state cache with size 0
19/11/27 18:17:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37653.
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 18:17:27 INFO sdk_worker.__init__: Control channel established.
19/11/27 18:17:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 18:17:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38259.
19/11/27 18:17:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 18:17:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:45893
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/27 18:17:27 INFO sdk_worker.run: No more requests from control plane
19/11/27 18:17:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 18:17:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 18:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 18:17:27 INFO sdk_worker.run: Done consuming work.
19/11/27 18:17:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 18:17:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 18:17:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 18:17:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574878641.53_27107844-74cf-4a96-9fdf-65925e7f6ebb finished.
19/11/27 18:17:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 18:17:27 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_d23c780e-adca-4061-9daa-070339f340a6","basePath":"/tmp/sparktestsOchFV"}: {}
java.io.FileNotFoundException: /tmp/sparktestsOchFV/job_d23c780e-adca-4061-9daa-070339f340a6/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139722964215552)>

# Thread: <Thread(Thread-119, started daemon 139722955822848)>

# Thread: <_MainThread(MainThread, started 139723746125568)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139722861045504)>

# Thread: <Thread(Thread-123, started daemon 139722869438208)>

# Thread: <_MainThread(MainThread, started 139723746125568)>

# Thread: <Thread(Thread-119, started daemon 139722955822848)>

# Thread: <Thread(wait_until_finish_read, started daemon 139722964215552)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574878631.59_cb74ba48-70b2-4791-b26b-424c8f8b3a6d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 312.334s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 57s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/3slrxck6qxx26

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1638

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1638/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 12:12:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec on Spark master local
19/11/27 12:12:44 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/27 12:12:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec: Pipeline translated successfully. Computing outputs
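The WARN above about coders not being "consistent with equals" refers to the runner grouping elements by the key's encoded bytes, which is only safe when equal keys always encode to identical bytes. A toy illustration of the mechanism (names here are illustrative, not Beam APIs):

```python
def group_by_encoded_key(pairs, encode):
    # Grouping on serialized bytes, as a runner shuffling encoded keys does.
    # This is correct only if the coder is "consistent with equals":
    #   k1 == k2  must imply  encode(k1) == encode(k2)
    # Otherwise equal keys can be scattered across several groups.
    groups = {}
    for key, value in pairs:
        groups.setdefault(encode(key), []).append(value)
    return groups

utf8 = lambda k: k.encode('utf-8')  # a well-behaved encoding
group_by_encoded_key([('a', 1), ('a', 2), ('b', 3)], utf8)
# → {b'a': [1, 2], b'b': [3]}
```

With a coder that lacks this guarantee (the log names LengthPrefixCoder(ByteArrayCoder)), the runner can only warn that grouping results "might cause issues on some runners."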
19/11/27 12:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:45 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:34677
19/11/27 12:12:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:45 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:45 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
19/11/27 12:12:45 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:45 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34373.
19/11/27 12:12:45 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 12:12:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33767.
19/11/27 12:12:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:34599
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 12:12:45 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:45 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:46 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:38201
19/11/27 12:12:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
19/11/27 12:12:46 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41665.
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 12:12:46 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35261.
19/11/27 12:12:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:46265
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 12:12:46 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:46 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:47 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:37223
19/11/27 12:12:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
19/11/27 12:12:47 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44145.
19/11/27 12:12:47 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 12:12:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46137.
19/11/27 12:12:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:46131
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 12:12:47 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:47 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:48 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:39895
19/11/27 12:12:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
19/11/27 12:12:48 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36641.
19/11/27 12:12:48 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 12:12:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35045.
19/11/27 12:12:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:41351
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 12:12:48 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:48 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 12:12:49 INFO sdk_worker_main.main: Logging handler created.
19/11/27 12:12:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:32913
19/11/27 12:12:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 12:12:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 12:12:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 12:12:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574856763.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:60381', 'job_port': u'0'}
19/11/27 12:12:49 INFO statecache.__init__: Creating state cache with size 0
19/11/27 12:12:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45329.
19/11/27 12:12:49 INFO sdk_worker.__init__: Control channel established.
19/11/27 12:12:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 12:12:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44791.
19/11/27 12:12:49 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 12:12:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:36881
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 12:12:49 INFO sdk_worker.run: No more requests from control plane
19/11/27 12:12:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 12:12:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 12:12:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 12:12:49 INFO sdk_worker.run: Done consuming work.
19/11/27 12:12:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 12:12:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 12:12:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 12:12:49 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574856763.17_cd9ac24b-231f-4937-9f49-414ef30729ec finished.
19/11/27 12:12:49 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 12:12:49 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_67042f10-937e-47bc-a621-bf08e882644c","basePath":"/tmp/sparktesthUpP7J"}: {}
java.io.FileNotFoundException: /tmp/sparktesthUpP7J/job_67042f10-937e-47bc-a621-bf08e882644c/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140071634204416)>
# Thread: <Thread(Thread-119, started daemon 140071625811712)>
# Thread: <_MainThread(MainThread, started 140072495224576)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574856751.86_11a8bf6d-283d-4fb7-8081-dfa792e92529 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 307.614s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 48s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/yomtenbhvhx6g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1637

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1637/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-8470] Exclude failed ValidatesRunner tests


------------------------------------------
[...truncated 1.32 MB...]
19/11/27 09:11:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 09:11:42 INFO sdk_worker_main.main: Logging handler created.
19/11/27 09:11:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:34781
19/11/27 09:11:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 09:11:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 09:11:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 09:11:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574845899.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44175', 'job_port': u'0'}
19/11/27 09:11:42 INFO statecache.__init__: Creating state cache with size 0
19/11/27 09:11:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39415.
19/11/27 09:11:42 INFO sdk_worker.__init__: Control channel established.
19/11/27 09:11:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 09:11:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42703.
19/11/27 09:11:42 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 09:11:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:35661
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 09:11:42 INFO sdk_worker.run: No more requests from control plane
19/11/27 09:11:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 09:11:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 09:11:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 09:11:42 INFO sdk_worker.run: Done consuming work.
19/11/27 09:11:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 09:11:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 09:11:43 INFO sdk_worker_main.main: Logging handler created.
19/11/27 09:11:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:35321
19/11/27 09:11:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 09:11:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 09:11:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 09:11:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574845899.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44175', 'job_port': u'0'}
19/11/27 09:11:43 INFO statecache.__init__: Creating state cache with size 0
19/11/27 09:11:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44535.
19/11/27 09:11:43 INFO sdk_worker.__init__: Control channel established.
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 09:11:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 09:11:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41019.
19/11/27 09:11:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 09:11:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:45101
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 09:11:43 INFO sdk_worker.run: No more requests from control plane
19/11/27 09:11:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 09:11:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 09:11:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 09:11:43 INFO sdk_worker.run: Done consuming work.
19/11/27 09:11:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 09:11:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 09:11:44 INFO sdk_worker_main.main: Logging handler created.
19/11/27 09:11:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:35757
19/11/27 09:11:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 09:11:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 09:11:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 09:11:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574845899.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44175', 'job_port': u'0'}
19/11/27 09:11:44 INFO statecache.__init__: Creating state cache with size 0
19/11/27 09:11:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36847.
19/11/27 09:11:44 INFO sdk_worker.__init__: Control channel established.
19/11/27 09:11:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 09:11:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38115.
19/11/27 09:11:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 09:11:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:35125
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 09:11:44 INFO sdk_worker.run: No more requests from control plane
19/11/27 09:11:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 09:11:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 09:11:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 09:11:44 INFO sdk_worker.run: Done consuming work.
19/11/27 09:11:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 09:11:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 09:11:44 INFO sdk_worker_main.main: Logging handler created.
19/11/27 09:11:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:39573
19/11/27 09:11:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 09:11:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 09:11:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 09:11:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574845899.59', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44175', 'job_port': u'0'}
19/11/27 09:11:44 INFO statecache.__init__: Creating state cache with size 0
19/11/27 09:11:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33195.
19/11/27 09:11:44 INFO sdk_worker.__init__: Control channel established.
19/11/27 09:11:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 09:11:45 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 09:11:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34841.
19/11/27 09:11:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 09:11:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:34321
19/11/27 09:11:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 09:11:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 09:11:45 INFO sdk_worker.run: No more requests from control plane
19/11/27 09:11:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 09:11:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 09:11:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 09:11:45 INFO sdk_worker.run: Done consuming work.
19/11/27 09:11:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 09:11:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 09:11:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 09:11:45 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574845899.59_04729cc3-758d-4bd5-9679-7b2d6344d85c finished.
19/11/27 09:11:45 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 09:11:45 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_92bf4d4d-91f4-4edc-a642-a4f396b9a00a","basePath":"/tmp/sparktestAnP_bS"}: {}
java.io.FileNotFoundException: /tmp/sparktestAnP_bS/job_92bf4d4d-91f4-4edc-a642-a4f396b9a00a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
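The FileNotFoundException above comes from post-job cleanup trying to load a MANIFEST that was never written, because this run staged no artifacts (note the earlier "GetManifest for __no_artifacts_staged__" lines). A defensive cleanup would treat a missing manifest as "already clean" instead of raising. A minimal Python sketch of that idea (the real service is the Java BeamFileSystemArtifactStagingService; the function name and return values here are illustrative, not Beam's API):

```python
import os
import shutil


def remove_staging_dir(base_path, session_id):
    """Best-effort removal of a job's artifact staging directory.

    Returns True if the directory was removed, False if there was no
    MANIFEST to load (i.e. nothing was staged), mirroring the failure
    mode in the log above without raising.
    """
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, 'MANIFEST')
    if not os.path.isfile(manifest):
        # No manifest was ever staged: skip removal quietly instead of
        # surfacing a FileNotFoundException from the cleanup path.
        return False
    shutil.rmtree(job_dir)
    return True
```

With this shape, a job that staged no artifacts logs a no-op instead of an ERROR with a full stack trace.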
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
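The "Timed out after 60 seconds." failures above come from a test-side watchdog: a handler that raises BaseException out of a hung wait_until_finish() (the traceback ends in portable_runner_test.py's handler). A minimal sketch of that pattern, assuming a Unix SIGALRM-based alarm; the helper name and exact mechanics here are illustrative, not the actual Beam test code:

```python
import signal
import time


def run_with_timeout(fn, seconds=60):
    """Run fn(), raising if it takes longer than `seconds` (Unix, main thread only)."""
    def handler(signum, frame):
        # Raise BaseException (not Exception) so the timeout escapes
        # broad `except Exception` blocks, matching the log's
        # "raise BaseException(msg)".
        raise BaseException('Timed out after %d seconds.' % seconds)

    old_handler = signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)
    try:
        return fn()
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)
```

A fast call returns its result normally; a call that blocks past the deadline is interrupted wherever it happens to be waiting, which is why the tracebacks above bottom out inside grpc's condition-variable wait.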

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140051493152512)>

# Thread: <Thread(Thread-119, started daemon 140051577751296)>

# Thread: <_MainThread(MainThread, started 140052357199616)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140051475842816)>

# Thread: <Thread(Thread-125, started daemon 140051484497664)>

# Thread: <_MainThread(MainThread, started 140052357199616)>
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574845889.71_7a0355b2-6632-4d2d-a05b-82152cb681d2 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 310.837s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 4s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/fa44qpnkwi6pg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1636

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1636/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/27 06:19:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 06:19:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 06:19:48 INFO sdk_worker_main.main: Logging handler created.
19/11/27 06:19:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:39025
19/11/27 06:19:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 06:19:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 06:19:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 06:19:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574835586.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49593', 'job_port': u'0'}
19/11/27 06:19:48 INFO statecache.__init__: Creating state cache with size 0
19/11/27 06:19:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45673.
19/11/27 06:19:48 INFO sdk_worker.__init__: Control channel established.
19/11/27 06:19:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 06:19:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 06:19:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38925.
19/11/27 06:19:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 06:19:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:44269
19/11/27 06:19:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 06:19:49 INFO sdk_worker.run: No more requests from control plane
19/11/27 06:19:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 06:19:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 06:19:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 06:19:49 INFO sdk_worker.run: Done consuming work.
19/11/27 06:19:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 06:19:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 06:19:49 INFO sdk_worker_main.main: Logging handler created.
19/11/27 06:19:49 INFO sdk_worker_main.start: Status HTTP server running at localhost:37517
19/11/27 06:19:49 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 06:19:49 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 06:19:49 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 06:19:49 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574835586.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49593', 'job_port': u'0'}
19/11/27 06:19:49 INFO statecache.__init__: Creating state cache with size 0
19/11/27 06:19:49 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44249.
19/11/27 06:19:49 INFO sdk_worker.__init__: Control channel established.
19/11/27 06:19:49 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 06:19:49 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45855.
19/11/27 06:19:49 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 06:19:49 INFO data_plane.create_data_channel: Creating client data channel for localhost:36305
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 06:19:49 INFO sdk_worker.run: No more requests from control plane
19/11/27 06:19:49 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 06:19:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:49 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 06:19:49 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 06:19:49 INFO sdk_worker.run: Done consuming work.
19/11/27 06:19:49 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 06:19:49 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:49 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 06:19:50 INFO sdk_worker_main.main: Logging handler created.
19/11/27 06:19:50 INFO sdk_worker_main.start: Status HTTP server running at localhost:45343
19/11/27 06:19:50 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 06:19:50 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 06:19:50 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 06:19:50 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574835586.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49593', 'job_port': u'0'}
19/11/27 06:19:50 INFO statecache.__init__: Creating state cache with size 0
19/11/27 06:19:50 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46561.
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 06:19:50 INFO sdk_worker.__init__: Control channel established.
19/11/27 06:19:50 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 06:19:50 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36283.
19/11/27 06:19:50 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 06:19:50 INFO data_plane.create_data_channel: Creating client data channel for localhost:38449
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 06:19:50 INFO sdk_worker.run: No more requests from control plane
19/11/27 06:19:50 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 06:19:50 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 06:19:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:50 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 06:19:50 INFO sdk_worker.run: Done consuming work.
19/11/27 06:19:50 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 06:19:50 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:50 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 06:19:51 INFO sdk_worker_main.main: Logging handler created.
19/11/27 06:19:51 INFO sdk_worker_main.start: Status HTTP server running at localhost:43807
19/11/27 06:19:51 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 06:19:51 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 06:19:51 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 06:19:51 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574835586.17', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:49593', 'job_port': u'0'}
19/11/27 06:19:51 INFO statecache.__init__: Creating state cache with size 0
19/11/27 06:19:51 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39155.
19/11/27 06:19:51 INFO sdk_worker.__init__: Control channel established.
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 06:19:51 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 06:19:51 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46167.
19/11/27 06:19:51 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 06:19:51 INFO data_plane.create_data_channel: Creating client data channel for localhost:36131
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 06:19:51 INFO sdk_worker.run: No more requests from control plane
19/11/27 06:19:51 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 06:19:51 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 06:19:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:51 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 06:19:51 INFO sdk_worker.run: Done consuming work.
19/11/27 06:19:51 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 06:19:51 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 06:19:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 06:19:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574835586.17_32374364-5bbc-444f-a125-f699d3afe24f finished.
19/11/27 06:19:51 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 06:19:51 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_89b7e81c-acae-4f1d-b18b-5a4b38f11fbf","basePath":"/tmp/sparktestBElStw"}: {}
java.io.FileNotFoundException: /tmp/sparktestBElStw/job_89b7e81c-acae-4f1d-b18b-5a4b38f11fbf/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
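The FileNotFoundException above is a cleanup-path error rather than a test failure: nothing was staged for this job (other runs in this log report `GetManifest for __no_artifacts_staged__`), so the MANIFEST file the removal code tries to open never existed. A minimal Python sketch of the tolerant cleanup behaviour one would want here; `remove_staging_dir` is an illustrative name, not an actual Beam API:

```python
import os
import shutil


def remove_staging_dir(base_path, session_id):
    """Best-effort staging cleanup: a missing MANIFEST means nothing
    was staged for this job, so there is nothing to remove."""
    job_dir = os.path.join(base_path, session_id)
    manifest = os.path.join(job_dir, "MANIFEST")
    if not os.path.exists(manifest):
        # Mirrors the failure above: opening this path would raise
        # FileNotFoundError, so skip cleanup instead of erroring out.
        return False
    shutil.rmtree(job_dir, ignore_errors=True)
    return True
```

The key point is checking for (or catching) the missing manifest before treating its absence as an error, which is all the log-level fix would require.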
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139884032538368)>

# Thread: <Thread(Thread-119, started daemon 139884015752960)>

# Thread: <_MainThread(MainThread, started 139884814448384)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139883393840896)>

# Thread: <Thread(Thread-125, started daemon 139884007360256)>

# Thread: <_MainThread(MainThread, started 139884814448384)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
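The `BaseException: Timed out after 60 seconds.` entries above come from the test harness's signal handler (`portable_runner_test.py`, line 75 in the tracebacks), which escalates a hung gRPC wait into an exception that even broad `except Exception:` clauses inside the code under test cannot swallow. A hedged, self-contained sketch of that pattern; `run_with_timeout` is an illustrative name, not the harness's actual API, and SIGALRM makes this Unix-only:

```python
import signal


def run_with_timeout(fn, timeout_secs):
    """Run fn(); raise BaseException if it runs longer than timeout_secs.

    Raising BaseException (not Exception) ensures the timeout escapes
    `except Exception:` handlers in the code under test.
    """
    def handler(signum, frame):
        raise BaseException('Timed out after %d seconds.' % timeout_secs)

    old_handler = signal.signal(signal.SIGALRM, handler)
    signal.alarm(timeout_secs)  # deliver SIGALRM after timeout_secs
    try:
        return fn()
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)
```

The interleaved `# Thread: ...` lines in the log are the same handler dumping live threads, which is why they appear mid-traceback when the main thread and a reader thread write concurrently.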

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574835576.58_d351b1c6-ffa7-4ca6-8036-512795e2e32e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 298.384s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 33s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/u6krsw5yaw6oe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1635

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1635/display/redirect?page=changes>

Changes:

[aaltay] [BEAM-7390] Add code snippets for Count (#9923)

[aaltay] [BEAM-7390] Add code snippets for CombineGlobally (#9920)

[aaltay] [BEAM-7390] Add code snippets for CombineValues (#9922)

[aaltay] Fix sorting order bug. (#9883)


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 02:02:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790 on Spark master local
19/11/27 02:02:40 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/27 02:02:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790: Pipeline translated successfully. Computing outputs
19/11/27 02:02:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:40 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:35153
19/11/27 02:02:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:40 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37599.
19/11/27 02:02:40 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 02:02:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33409.
19/11/27 02:02:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:41287
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:41 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:41 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:41 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:43983
19/11/27 02:02:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:41 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45603.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 02:02:41 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38699.
19/11/27 02:02:41 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:37881
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:41 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:41 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:42 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:45947
19/11/27 02:02:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:42 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44405.
19/11/27 02:02:42 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 02:02:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40483.
19/11/27 02:02:42 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:43347
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:42 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:42 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:43 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:46271
19/11/27 02:02:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:43 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46519.
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 02:02:43 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34085.
19/11/27 02:02:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:40701
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:43 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:43 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:43 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 02:02:44 INFO sdk_worker_main.main: Logging handler created.
19/11/27 02:02:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:46439
19/11/27 02:02:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 02:02:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 02:02:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 02:02:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574820159.28', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45491', 'job_port': u'0'}
19/11/27 02:02:44 INFO statecache.__init__: Creating state cache with size 0
19/11/27 02:02:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34815.
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 02:02:44 INFO sdk_worker.__init__: Control channel established.
19/11/27 02:02:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 02:02:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42857.
19/11/27 02:02:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 02:02:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:40013
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/27 02:02:44 INFO sdk_worker.run: No more requests from control plane
19/11/27 02:02:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 02:02:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 02:02:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 02:02:44 INFO sdk_worker.run: Done consuming work.
19/11/27 02:02:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 02:02:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 02:02:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 02:02:44 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574820159.28_d050456c-0004-4029-90c8-ecc41d0e8790 finished.
19/11/27 02:02:44 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 02:02:44 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_84c6fcb7-ed28-47ca-88fe-907f2164c508","basePath":"/tmp/sparktestLvQ5zZ"}: {}
java.io.FileNotFoundException: /tmp/sparktestLvQ5zZ/job_84c6fcb7-ed28-47ca-88fe-907f2164c508/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574820150.13_9202a452-c727-4c63-93c5-3c34c4e4c94a failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140059464742656)>

# Thread: <Thread(Thread-120, started daemon 140059456349952)>

# Thread: <_MainThread(MainThread, started 140060259915520)>

----------------------------------------------------------------------
Ran 38 tests in 293.167s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 39s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/m6ityq6fe32dk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1634

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1634/display/redirect>

Changes:


------------------------------------------
[...truncated 1.31 MB...]
19/11/27 00:19:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949 on Spark master local
19/11/27 00:19:57 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/27 00:19:57 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949: Pipeline translated successfully. Computing outputs
19/11/27 00:19:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:19:58 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:19:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:42799
19/11/27 00:19:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:19:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:19:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:19:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:19:58 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:19:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39549.
19/11/27 00:19:58 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:19:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/27 00:19:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42099.
19/11/27 00:19:58 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:19:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:35621
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 00:19:58 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:19:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:19:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:19:58 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:19:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:19:58 INFO sdk_worker.run: Done consuming work.
19/11/27 00:19:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:19:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:19:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:19:59 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:19:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:41413
19/11/27 00:19:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:19:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:19:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:19:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:19:59 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:19:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45611.
19/11/27 00:19:59 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:19:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/27 00:19:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38737.
19/11/27 00:19:59 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:19:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:37199
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 00:19:59 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:19:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:19:59 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:19:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:19:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:19:59 INFO sdk_worker.run: Done consuming work.
19/11/27 00:19:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:19:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:19:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:20:00 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:20:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:41525
19/11/27 00:20:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:20:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:20:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:20:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:20:00 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:20:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43349.
19/11/27 00:20:00 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:20:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/27 00:20:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44705.
19/11/27 00:20:00 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:20:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:40319
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 00:20:00 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:20:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:20:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:20:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:20:00 INFO sdk_worker.run: Done consuming work.
19/11/27 00:20:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:20:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:20:01 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:20:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:38693
19/11/27 00:20:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:20:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:20:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:20:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:20:01 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:20:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40735.
19/11/27 00:20:01 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:20:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/27 00:20:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45307.
19/11/27 00:20:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:20:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:36653
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 00:20:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:01 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:20:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:20:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:20:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:20:01 INFO sdk_worker.run: Done consuming work.
19/11/27 00:20:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:20:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/27 00:20:02 INFO sdk_worker_main.main: Logging handler created.
19/11/27 00:20:02 INFO sdk_worker_main.start: Status HTTP server running at localhost:35515
19/11/27 00:20:02 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/27 00:20:02 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/27 00:20:02 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/27 00:20:02 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574813996.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:44171', 'job_port': u'0'}
19/11/27 00:20:02 INFO statecache.__init__: Creating state cache with size 0
19/11/27 00:20:02 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44231.
19/11/27 00:20:02 INFO sdk_worker.__init__: Control channel established.
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/27 00:20:02 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/27 00:20:02 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40867.
19/11/27 00:20:02 INFO sdk_worker.create_state_handler: State channel established.
19/11/27 00:20:02 INFO data_plane.create_data_channel: Creating client data channel for localhost:42137
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/27 00:20:02 INFO sdk_worker.run: No more requests from control plane
19/11/27 00:20:02 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/27 00:20:02 INFO data_plane.close: Closing all cached grpc data channels.
19/11/27 00:20:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:02 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/27 00:20:02 INFO sdk_worker.run: Done consuming work.
19/11/27 00:20:02 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/27 00:20:02 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/27 00:20:02 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/27 00:20:02 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574813996.88_3852a7f0-07fc-480e-a1bc-c581a09d9949 finished.
19/11/27 00:20:02 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/27 00:20:02 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_339cbc4b-1b82-49ed-9760-dc29ab2c1633","basePath":"/tmp/sparktestvgC75n"}: {}
java.io.FileNotFoundException: /tmp/sparktestvgC75n/job_339cbc4b-1b82-49ed-9760-dc29ab2c1633/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140047058089728)>

# Thread: <Thread(Thread-118, started daemon 140047049697024)>

# Thread: <_MainThread(MainThread, started 140047839999744)>

Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574813987.29_6bef242d-f018-4d69-8726-3a5fcb49baa6 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 306.483s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 49s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/zv5flclpq2636

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1633

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1633/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 18:24:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:32817
19/11/26 18:24:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 18:24:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 18:24:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 18:24:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574792675.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59823', 'job_port': u'0'}
19/11/26 18:24:38 INFO statecache.__init__: Creating state cache with size 0
19/11/26 18:24:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40621.
19/11/26 18:24:38 INFO sdk_worker.__init__: Control channel established.
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 18:24:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 18:24:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40649.
19/11/26 18:24:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 18:24:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:34243
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/26 18:24:38 INFO sdk_worker.run: No more requests from control plane
19/11/26 18:24:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 18:24:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 18:24:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 18:24:38 INFO sdk_worker.run: Done consuming work.
19/11/26 18:24:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 18:24:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 18:24:39 INFO sdk_worker_main.main: Logging handler created.
19/11/26 18:24:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:37383
19/11/26 18:24:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 18:24:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 18:24:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 18:24:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574792675.76', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59823', 'job_port': u'0'}
19/11/26 18:24:39 INFO statecache.__init__: Creating state cache with size 0
19/11/26 18:24:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35947.
19/11/26 18:24:39 INFO sdk_worker.__init__: Control channel established.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 18:24:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 18:24:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35705.
19/11/26 18:24:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 18:24:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:43591
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 18:24:39 INFO sdk_worker.run: No more requests from control plane
19/11/26 18:24:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 18:24:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 18:24:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 18:24:39 INFO sdk_worker.run: Done consuming work.
19/11/26 18:24:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 18:24:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 18:24:39 INFO sdk_worker_main.main: Logging handler created.
19/11/26 18:24:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:38005
19/11/26 18:24:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 18:24:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 18:24:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 18:24:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574792675.76', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59823', 'job_port': u'0'}
19/11/26 18:24:39 INFO statecache.__init__: Creating state cache with size 0
19/11/26 18:24:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45711.
19/11/26 18:24:39 INFO sdk_worker.__init__: Control channel established.
19/11/26 18:24:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 18:24:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33819.
19/11/26 18:24:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 18:24:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:38013
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 18:24:39 INFO sdk_worker.run: No more requests from control plane
19/11/26 18:24:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 18:24:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 18:24:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 18:24:39 INFO sdk_worker.run: Done consuming work.
19/11/26 18:24:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 18:24:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 18:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 18:24:40 INFO sdk_worker_main.main: Logging handler created.
19/11/26 18:24:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:34671
19/11/26 18:24:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 18:24:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 18:24:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 18:24:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574792675.76', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:59823', 'job_port': u'0'}
19/11/26 18:24:40 INFO statecache.__init__: Creating state cache with size 0
19/11/26 18:24:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35257.
19/11/26 18:24:40 INFO sdk_worker.__init__: Control channel established.
19/11/26 18:24:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 18:24:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37541.
19/11/26 18:24:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 18:24:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:34145
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 18:24:40 INFO sdk_worker.run: No more requests from control plane
19/11/26 18:24:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 18:24:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 18:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 18:24:40 INFO sdk_worker.run: Done consuming work.
19/11/26 18:24:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 18:24:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 18:24:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 18:24:40 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574792675.76_3de1076c-6b3d-4a7e-a3c1-d142805cb3b6 finished.
19/11/26 18:24:40 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 18:24:40 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_13dc8541-e8e9-4cf3-91a0-be671526cfc4","basePath":"/tmp/sparktestd_SpbU"}: {}
java.io.FileNotFoundException: /tmp/sparktestd_SpbU/job_13dc8541-e8e9-4cf3-91a0-be671526cfc4/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
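The FileNotFoundException above fires because the cleanup hook loads the staging MANIFEST unconditionally, even for jobs that staged no artifacts (note the earlier "GetManifest for __no_artifacts_staged__" lines). A minimal Python sketch of a more tolerant cleanup, with hypothetical names (this is not Beam's actual API, just the guard the stack trace suggests is missing):

```python
import os


def remove_artifacts_best_effort(staging_dir):
    """Best-effort staging cleanup: tolerate a MANIFEST that was never written.

    Hypothetical sketch: if no artifacts were staged, MANIFEST does not exist,
    so cleanup should be a no-op instead of raising FileNotFoundException.
    """
    manifest = os.path.join(staging_dir, "MANIFEST")
    if not os.path.exists(manifest):
        # Nothing was staged for this job token; leave the directory alone.
        return
    os.remove(manifest)
    # Remove any remaining staged files, then the directory itself.
    for name in os.listdir(staging_dir):
        os.remove(os.path.join(staging_dir, name))
    os.rmdir(staging_dir)
```

The error is cosmetic here (the job still reaches DONE), but the guard keeps the ERROR line out of the log.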
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
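The timeout above comes from the test watchdog in portable_runner_test.py (the `handler` frame at the bottom of the traceback). A hedged sketch of that pattern, with illustrative names: a SIGALRM handler that dumps live threads (the source of the "# Thread: ..." lines interleaved below) and raises BaseException so even broad `except Exception` blocks cannot swallow the timeout:

```python
import signal
import threading


def install_test_timeout(timeout=60):
    """Sketch of a per-test watchdog (illustrative, not Beam's exact code).

    On SIGALRM it prints every live thread for post-mortem debugging and
    raises BaseException, which ordinary `except Exception` cannot catch.
    """
    def handler(signum, frame):
        msg = 'Timed out after %s seconds.' % timeout
        print('==================== %s ====================' % msg)
        for t in threading.enumerate():
            print('# Thread: %s' % t)
        raise BaseException(msg)

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(timeout)  # deliver SIGALRM after `timeout` seconds
    return handler
```

Because the thread dump and the in-flight traceback print concurrently, their lines can interleave in the captured log, as seen in the next failure.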

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
==================== Timed out after 60 seconds. ====================
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

# Thread: <Thread(wait_until_finish_read, started daemon 140602456925952)>

# Thread: <Thread(Thread-117, started daemon 140602465318656)>

# Thread: <_MainThread(MainThread, started 140603596281600)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140602440140544)>

# Thread: <Thread(Thread-123, started daemon 140602448533248)>

# Thread: <Thread(Thread-117, started daemon 140602465318656)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574792666.71_6e024191-3c3b-4113-80c9-7707333d735d failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 140602456925952)>

# Thread: <_MainThread(MainThread, started 140603596281600)>
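The UnsupportedOperationException above indicates the Spark portable runner executed a splittable DoFn whose checkpoint (residual) callback was never wired up. A toy Python model of that registration pattern, with entirely hypothetical names (this is not Beam's Java API, just the shape of the failure):

```python
class ActiveBundle:
    """Toy model of a bundle that can receive SDF checkpoints.

    Hypothetical sketch: a bundle must have a checkpoint handler registered
    before execution; if a checkpoint arrives without one, raising (here
    NotImplementedError, the Python analogue of Java's
    UnsupportedOperationException) is the only safe behaviour.
    """

    def __init__(self):
        self._checkpoint_handler = None

    def register_checkpoint_handler(self, handler):
        self._checkpoint_handler = handler

    def on_checkpoint(self, residual):
        if self._checkpoint_handler is None:
            raise NotImplementedError(
                'The ActiveBundle does not have a registered bundle '
                'checkpoint handler.')
        return self._checkpoint_handler(residual)
```

Runners that do not yet support SDF checkpointing surface exactly this kind of error when a watermark-tracking SDF test forces a checkpoint.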

----------------------------------------------------------------------
Ran 38 tests in 315.409s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 50s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/r72nscqulew4a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1632

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1632/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-8470] move enableSparkMetricSinks option to common spark pipeline


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 13:12:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:37569
19/11/26 13:12:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 13:12:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 13:12:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 13:12:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574773961.06', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56319', 'job_port': u'0'}
19/11/26 13:12:43 INFO statecache.__init__: Creating state cache with size 0
19/11/26 13:12:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37633.
19/11/26 13:12:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 13:12:43 INFO sdk_worker.__init__: Control channel established.
19/11/26 13:12:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 13:12:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36129.
19/11/26 13:12:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 13:12:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:44235
19/11/26 13:12:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 13:12:44 INFO sdk_worker.run: No more requests from control plane
19/11/26 13:12:44 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 13:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 13:12:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 13:12:44 INFO sdk_worker.run: Done consuming work.
19/11/26 13:12:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 13:12:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 13:12:44 INFO sdk_worker_main.main: Logging handler created.
19/11/26 13:12:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:39027
19/11/26 13:12:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 13:12:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 13:12:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 13:12:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574773961.06', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56319', 'job_port': u'0'}
19/11/26 13:12:44 INFO statecache.__init__: Creating state cache with size 0
19/11/26 13:12:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43541.
19/11/26 13:12:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 13:12:44 INFO sdk_worker.__init__: Control channel established.
19/11/26 13:12:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 13:12:45 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33311.
19/11/26 13:12:45 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 13:12:45 INFO data_plane.create_data_channel: Creating client data channel for localhost:45359
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 13:12:45 INFO sdk_worker.run: No more requests from control plane
19/11/26 13:12:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 13:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 13:12:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 13:12:45 INFO sdk_worker.run: Done consuming work.
19/11/26 13:12:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 13:12:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 13:12:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 13:12:45 INFO sdk_worker_main.main: Logging handler created.
19/11/26 13:12:45 INFO sdk_worker_main.start: Status HTTP server running at localhost:45469
19/11/26 13:12:45 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 13:12:45 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 13:12:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 13:12:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574773961.06', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56319', 'job_port': u'0'}
19/11/26 13:12:46 INFO statecache.__init__: Creating state cache with size 0
19/11/26 13:12:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43479.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 13:12:46 INFO sdk_worker.__init__: Control channel established.
19/11/26 13:12:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 13:12:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44813.
19/11/26 13:12:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 13:12:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:40145
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 13:12:46 INFO sdk_worker.run: No more requests from control plane
19/11/26 13:12:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 13:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 13:12:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 13:12:46 INFO sdk_worker.run: Done consuming work.
19/11/26 13:12:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 13:12:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 13:12:46 INFO sdk_worker_main.main: Logging handler created.
19/11/26 13:12:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:39459
19/11/26 13:12:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 13:12:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 13:12:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 13:12:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574773961.06', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:56319', 'job_port': u'0'}
19/11/26 13:12:46 INFO statecache.__init__: Creating state cache with size 0
19/11/26 13:12:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40591.
19/11/26 13:12:46 INFO sdk_worker.__init__: Control channel established.
19/11/26 13:12:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 13:12:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 13:12:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44659.
19/11/26 13:12:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 13:12:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:44603
19/11/26 13:12:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 13:12:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 13:12:47 INFO sdk_worker.run: No more requests from control plane
19/11/26 13:12:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 13:12:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 13:12:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 13:12:47 INFO sdk_worker.run: Done consuming work.
19/11/26 13:12:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 13:12:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 13:12:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 13:12:47 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574773961.06_aa845bd5-0215-4311-b66a-d96d8cd0162b finished.
19/11/26 13:12:47 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 13:12:47 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_f229307b-9dd8-49ef-b472-8c3266a2adcb","basePath":"/tmp/sparktesthGpFaD"}: {}
java.io.FileNotFoundException: /tmp/sparktesthGpFaD/job_f229307b-9dd8-49ef-b472-8c3266a2adcb/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139790431278848)>

# Thread: <Thread(Thread-119, started daemon 139790439671552)>

# Thread: <_MainThread(MainThread, started 139791566620416)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139790414493440)>

# Thread: <Thread(Thread-125, started daemon 139790422886144)>

# Thread: <Thread(Thread-119, started daemon 139790439671552)>

# Thread: <_MainThread(MainThread, started 139791566620416)>

# Thread: <Thread(wait_until_finish_read, started daemon 139790431278848)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574773950.71_ec3309df-46eb-4cbc-b027-b2109e1d1f22 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 323.475s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 10s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/6p2phocuswu3y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1631

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1631/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 12:14:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 12:14:40 INFO sdk_worker_main.main: Logging handler created.
19/11/26 12:14:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:46119
19/11/26 12:14:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 12:14:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 12:14:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 12:14:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574770478.3', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32881', 'job_port': u'0'}
19/11/26 12:14:40 INFO statecache.__init__: Creating state cache with size 0
19/11/26 12:14:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33029.
19/11/26 12:14:40 INFO sdk_worker.__init__: Control channel established.
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 12:14:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 12:14:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36359.
19/11/26 12:14:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 12:14:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:44923
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 12:14:40 INFO sdk_worker.run: No more requests from control plane
19/11/26 12:14:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 12:14:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 12:14:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 12:14:40 INFO sdk_worker.run: Done consuming work.
19/11/26 12:14:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 12:14:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 12:14:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 12:14:41 INFO sdk_worker_main.main: Logging handler created.
19/11/26 12:14:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:36265
19/11/26 12:14:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 12:14:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 12:14:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 12:14:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574770478.3', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32881', 'job_port': u'0'}
19/11/26 12:14:41 INFO statecache.__init__: Creating state cache with size 0
19/11/26 12:14:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46849.
19/11/26 12:14:41 INFO sdk_worker.__init__: Control channel established.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 12:14:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 12:14:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44575.
19/11/26 12:14:41 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 12:14:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:43327
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 12:14:41 INFO sdk_worker.run: No more requests from control plane
19/11/26 12:14:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 12:14:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 12:14:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 12:14:41 INFO sdk_worker.run: Done consuming work.
19/11/26 12:14:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 12:14:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 12:14:42 INFO sdk_worker_main.main: Logging handler created.
19/11/26 12:14:42 INFO sdk_worker_main.start: Status HTTP server running at localhost:42449
19/11/26 12:14:42 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 12:14:42 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 12:14:42 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 12:14:42 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574770478.3', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32881', 'job_port': u'0'}
19/11/26 12:14:42 INFO statecache.__init__: Creating state cache with size 0
19/11/26 12:14:42 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41229.
19/11/26 12:14:42 INFO sdk_worker.__init__: Control channel established.
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 12:14:42 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 12:14:42 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42527.
19/11/26 12:14:42 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 12:14:42 INFO data_plane.create_data_channel: Creating client data channel for localhost:40719
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 12:14:42 INFO sdk_worker.run: No more requests from control plane
19/11/26 12:14:42 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 12:14:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:42 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 12:14:42 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 12:14:42 INFO sdk_worker.run: Done consuming work.
19/11/26 12:14:42 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 12:14:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:42 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 12:14:43 INFO sdk_worker_main.main: Logging handler created.
19/11/26 12:14:43 INFO sdk_worker_main.start: Status HTTP server running at localhost:35763
19/11/26 12:14:43 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 12:14:43 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 12:14:43 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 12:14:43 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574770478.3', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:32881', 'job_port': u'0'}
19/11/26 12:14:43 INFO statecache.__init__: Creating state cache with size 0
19/11/26 12:14:43 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45531.
19/11/26 12:14:43 INFO sdk_worker.__init__: Control channel established.
19/11/26 12:14:43 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 12:14:43 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37081.
19/11/26 12:14:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 12:14:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:37545
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 12:14:43 INFO sdk_worker.run: No more requests from control plane
19/11/26 12:14:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 12:14:43 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 12:14:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:43 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 12:14:43 INFO sdk_worker.run: Done consuming work.
19/11/26 12:14:43 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 12:14:43 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 12:14:43 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 12:14:43 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574770478.3_62e3d065-73c0-40ef-ad5b-7a7d8bd213d2 finished.
19/11/26 12:14:43 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 12:14:43 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_9cc2a8af-11cc-4b4a-97a4-84b56434bf50","basePath":"/tmp/sparktestsgI2fG"}: {}
java.io.FileNotFoundException: /tmp/sparktestsgI2fG/job_9cc2a8af-11cc-4b4a-97a4-84b56434bf50/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139653596305152)>

# Thread: <Thread(Thread-117, started daemon 139653604697856)>

# Thread: <_MainThread(MainThread, started 139654588397312)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139653571127040)>

# Thread: <Thread(Thread-122, started daemon 139653579519744)>

# Thread: <_MainThread(MainThread, started 139654588397312)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574770469.12_b6cbd08f-d55d-49da-a933-4e65dd2a684f failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 288.881s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 56s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/f2jkmhu2fy736

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1630

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1630/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 06:13:24 INFO sdk_worker_main.start: Status HTTP server running at localhost:42783
19/11/26 06:13:24 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 06:13:24 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 06:13:24 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 06:13:24 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574748801.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54415', 'job_port': u'0'}
19/11/26 06:13:24 INFO statecache.__init__: Creating state cache with size 0
19/11/26 06:13:24 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46007.
19/11/26 06:13:24 INFO sdk_worker.__init__: Control channel established.
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 06:13:24 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 06:13:24 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46323.
19/11/26 06:13:24 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 06:13:24 INFO data_plane.create_data_channel: Creating client data channel for localhost:44327
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 06:13:24 INFO sdk_worker.run: No more requests from control plane
19/11/26 06:13:24 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 06:13:24 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 06:13:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:24 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 06:13:24 INFO sdk_worker.run: Done consuming work.
19/11/26 06:13:24 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 06:13:24 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:24 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 06:13:25 INFO sdk_worker_main.main: Logging handler created.
19/11/26 06:13:25 INFO sdk_worker_main.start: Status HTTP server running at localhost:41749
19/11/26 06:13:25 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 06:13:25 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 06:13:25 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 06:13:25 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574748801.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54415', 'job_port': u'0'}
19/11/26 06:13:25 INFO statecache.__init__: Creating state cache with size 0
19/11/26 06:13:25 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37541.
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 06:13:25 INFO sdk_worker.__init__: Control channel established.
19/11/26 06:13:25 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 06:13:25 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37283.
19/11/26 06:13:25 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 06:13:25 INFO data_plane.create_data_channel: Creating client data channel for localhost:32819
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 06:13:25 INFO sdk_worker.run: No more requests from control plane
19/11/26 06:13:25 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 06:13:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:25 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 06:13:25 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 06:13:25 INFO sdk_worker.run: Done consuming work.
19/11/26 06:13:25 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 06:13:25 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:25 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 06:13:26 INFO sdk_worker_main.main: Logging handler created.
19/11/26 06:13:26 INFO sdk_worker_main.start: Status HTTP server running at localhost:42179
19/11/26 06:13:26 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 06:13:26 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 06:13:26 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 06:13:26 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574748801.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54415', 'job_port': u'0'}
19/11/26 06:13:26 INFO statecache.__init__: Creating state cache with size 0
19/11/26 06:13:26 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46041.
19/11/26 06:13:26 INFO sdk_worker.__init__: Control channel established.
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 06:13:26 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 06:13:26 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36743.
19/11/26 06:13:26 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 06:13:26 INFO data_plane.create_data_channel: Creating client data channel for localhost:41435
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 06:13:26 INFO sdk_worker.run: No more requests from control plane
19/11/26 06:13:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 06:13:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 06:13:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 06:13:26 INFO sdk_worker.run: Done consuming work.
19/11/26 06:13:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 06:13:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:26 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 06:13:27 INFO sdk_worker_main.main: Logging handler created.
19/11/26 06:13:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:34225
19/11/26 06:13:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 06:13:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 06:13:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 06:13:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574748801.88', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:54415', 'job_port': u'0'}
19/11/26 06:13:27 INFO statecache.__init__: Creating state cache with size 0
19/11/26 06:13:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39659.
19/11/26 06:13:27 INFO sdk_worker.__init__: Control channel established.
19/11/26 06:13:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 06:13:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:32791.
19/11/26 06:13:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 06:13:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:34521
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 06:13:27 INFO sdk_worker.run: No more requests from control plane
19/11/26 06:13:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 06:13:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 06:13:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 06:13:27 INFO sdk_worker.run: Done consuming work.
19/11/26 06:13:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 06:13:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 06:13:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 06:13:27 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574748801.88_a44ffbe1-f921-4bcf-a8c3-7eaddea9af78 finished.
19/11/26 06:13:27 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 06:13:27 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_724f73c0-f079-4e0d-8a7b-183f8de058ac","basePath":"/tmp/sparktestS7dEH3"}: {}
java.io.FileNotFoundException: /tmp/sparktestS7dEH3/job_724f73c0-f079-4e0d-8a7b-183f8de058ac/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
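These recurring 60-second failures come from the test suite's own watchdog: the innermost frame, `portable_runner_test.py` line 75 in `handler`, raises a BaseException from inside a blocking `threading` wait, which is characteristic of a SIGALRM-based timeout that also dumps the interleaved "# Thread: ..." lines seen in this log. A minimal sketch of that pattern, assuming a Unix signal alarm (the names `TestTimeout`, `install_watchdog`, and `cancel_watchdog` are illustrative, not Beam's actual API):

```python
import signal
import threading

class TestTimeout(BaseException):
    """Derives from BaseException (not Exception) so broad
    `except Exception:` blocks in the code under test cannot
    swallow the timeout."""

def install_watchdog(seconds):
    """Arm a SIGALRM that aborts the test if it runs past `seconds`."""
    def handler(signum, frame):
        # Dump live threads first, mirroring the "# Thread: ..."
        # lines interleaved in the log output above.
        for t in threading.enumerate():
            print('# Thread: %r' % t)
        raise TestTimeout('Timed out after %d seconds.' % seconds)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)

def cancel_watchdog():
    """Disarm the pending alarm once the test finishes in time."""
    signal.alarm(0)
```

Because the exception is raised asynchronously in whatever frame the main thread happens to be executing (here, grpc's `_common.wait` polling loop), the traceback ends in library code rather than in the test body.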

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140466384054016)>
# Thread: <Thread(Thread-119, started daemon 140466024605440)>
# Thread: <_MainThread(MainThread, started 140467163793152)>
==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140466016212736)>
# Thread: <Thread(Thread-125, started daemon 140466007820032)>
# Thread: <Thread(Thread-119, started daemon 140466024605440)>
# Thread: <_MainThread(MainThread, started 140467163793152)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574748791.61_c5a7d937-c1d9-4432-9922-3d6bc0b05cee failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
# Thread: <Thread(wait_until_finish_read, started daemon 140466384054016)>

----------------------------------------------------------------------
Ran 38 tests in 322.805s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 12s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/4azdl35p4qdpa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1629

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1629/display/redirect?page=changes>

Changes:

[suztomo] Dataflow Java worker to avoid undeclared Guava

[suztomo] Beam SQL JDBC driver not to declare unused Guava

[suztomo] KinesisIO to declare Guava dependency

[suztomo] ZetaSQL to declare Guava dependency

[suztomo] Removed unused dependency from elasticsearch-tests-2


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 05:19:15 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:15 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 05:19:16 INFO sdk_worker_main.main: Logging handler created.
19/11/26 05:19:16 INFO sdk_worker_main.start: Status HTTP server running at localhost:38553
19/11/26 05:19:16 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 05:19:16 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 05:19:16 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 05:19:16 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574745553.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48359', 'job_port': u'0'}
19/11/26 05:19:16 INFO statecache.__init__: Creating state cache with size 0
19/11/26 05:19:16 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42363.
19/11/26 05:19:16 INFO sdk_worker.__init__: Control channel established.
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 05:19:16 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 05:19:16 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35309.
19/11/26 05:19:16 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 05:19:16 INFO data_plane.create_data_channel: Creating client data channel for localhost:40083
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 05:19:16 INFO sdk_worker.run: No more requests from control plane
19/11/26 05:19:16 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 05:19:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:16 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 05:19:16 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 05:19:16 INFO sdk_worker.run: Done consuming work.
19/11/26 05:19:16 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 05:19:16 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:16 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 05:19:17 INFO sdk_worker_main.main: Logging handler created.
19/11/26 05:19:17 INFO sdk_worker_main.start: Status HTTP server running at localhost:41367
19/11/26 05:19:17 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 05:19:17 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 05:19:17 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 05:19:17 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574745553.61', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48359', 'job_port': u'0'}
19/11/26 05:19:17 INFO statecache.__init__: Creating state cache with size 0
19/11/26 05:19:17 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35339.
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 05:19:17 INFO sdk_worker.__init__: Control channel established.
19/11/26 05:19:17 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 05:19:17 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46309.
19/11/26 05:19:17 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 05:19:17 INFO data_plane.create_data_channel: Creating client data channel for localhost:34135
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/26 05:19:17 INFO sdk_worker.run: No more requests from control plane
19/11/26 05:19:17 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 05:19:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:17 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 05:19:17 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 05:19:17 INFO sdk_worker.run: Done consuming work.
19/11/26 05:19:17 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 05:19:17 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 05:19:17 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 05:19:18 INFO sdk_worker_main.main: Logging handler created.
19/11/26 05:19:18 INFO sdk_worker_main.start: Status HTTP server running at localhost:44701
19/11/26 05:19:18 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 05:19:18 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 05:19:18 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 05:19:18 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574745553.61', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48359', 'job_port': u'0'}
19/11/26 05:19:18 INFO statecache.__init__: Creating state cache with size 0
19/11/26 05:19:18 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33047.
19/11/26 05:19:18 INFO sdk_worker.__init__: Control channel established.
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 05:19:18 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 05:19:18 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:37171.
19/11/26 05:19:18 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 05:19:18 INFO data_plane.create_data_channel: Creating client data channel for localhost:38725
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 05:19:18 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 05:19:19 INFO sdk_worker.run: No more requests from control plane
19/11/26 05:19:19 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 05:19:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:19 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 05:19:19 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 05:19:19 INFO sdk_worker.run: Done consuming work.
19/11/26 05:19:19 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 05:19:19 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 05:19:19 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:19 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 05:19:20 INFO sdk_worker_main.main: Logging handler created.
19/11/26 05:19:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:40209
19/11/26 05:19:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 05:19:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 05:19:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 05:19:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574745553.61', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48359', 'job_port': u'0'}
19/11/26 05:19:20 INFO statecache.__init__: Creating state cache with size 0
19/11/26 05:19:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36163.
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 05:19:20 INFO sdk_worker.__init__: Control channel established.
19/11/26 05:19:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 05:19:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42695.
19/11/26 05:19:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 05:19:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:42323
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 05:19:20 INFO sdk_worker.run: No more requests from control plane
19/11/26 05:19:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 05:19:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 05:19:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 05:19:20 INFO sdk_worker.run: Done consuming work.
19/11/26 05:19:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 05:19:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 05:19:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 05:19:20 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574745553.61_de983ce8-fcfe-4364-a487-858d27dff9a3 finished.
19/11/26 05:19:20 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 05:19:20 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6ad509d7-1470-499e-8411-e64526fa6bc8","basePath":"/tmp/sparktestsRNiHC"}: {}
java.io.FileNotFoundException: /tmp/sparktestsRNiHC/job_6ad509d7-1470-499e-8411-e64526fa6bc8/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140059546216192)>
# Thread: <Thread(Thread-120, started daemon 140059537823488)>

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <_MainThread(MainThread, started 140060530018048)>
# Thread: <Thread(wait_until_finish_read, started daemon 140059512645376)>
# Thread: <Thread(Thread-126, started daemon 140059521038080)>
# Thread: <_MainThread(MainThread, started 140060530018048)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574745540.23_5b6d0961-fce7-4d65-b1cc-3c7d96385fd9 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 325.238s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 16s
60 actionable tasks: 48 executed, 12 from cache

Publishing build scan...
https://gradle.com/s/mafgopxei3aqu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1628

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1628/display/redirect>

Changes:


------------------------------------------
[...truncated 1.32 MB...]
19/11/26 00:26:04 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:04 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 00:26:05 INFO sdk_worker_main.main: Logging handler created.
19/11/26 00:26:05 INFO sdk_worker_main.start: Status HTTP server running at localhost:34421
19/11/26 00:26:05 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 00:26:05 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 00:26:05 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 00:26:05 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574727962.59', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45139', 'job_port': u'0'}
19/11/26 00:26:05 INFO statecache.__init__: Creating state cache with size 0
19/11/26 00:26:05 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42615.
19/11/26 00:26:05 INFO sdk_worker.__init__: Control channel established.
19/11/26 00:26:05 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/26 00:26:05 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:35983.
19/11/26 00:26:05 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 00:26:05 INFO data_plane.create_data_channel: Creating client data channel for localhost:44765
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 00:26:05 INFO sdk_worker.run: No more requests from control plane
19/11/26 00:26:05 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 00:26:05 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 00:26:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:05 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 00:26:05 INFO sdk_worker.run: Done consuming work.
19/11/26 00:26:05 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 00:26:05 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:05 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 00:26:06 INFO sdk_worker_main.main: Logging handler created.
19/11/26 00:26:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:37381
19/11/26 00:26:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 00:26:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 00:26:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 00:26:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574727962.59', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45139', 'job_port': u'0'}
19/11/26 00:26:06 INFO statecache.__init__: Creating state cache with size 0
19/11/26 00:26:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39623.
19/11/26 00:26:06 INFO sdk_worker.__init__: Control channel established.
19/11/26 00:26:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/26 00:26:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40549.
19/11/26 00:26:06 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 00:26:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:45079
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 00:26:06 INFO sdk_worker.run: No more requests from control plane
19/11/26 00:26:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 00:26:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 00:26:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 00:26:06 INFO sdk_worker.run: Done consuming work.
19/11/26 00:26:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 00:26:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 00:26:06 INFO sdk_worker_main.main: Logging handler created.
19/11/26 00:26:06 INFO sdk_worker_main.start: Status HTTP server running at localhost:32967
19/11/26 00:26:06 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 00:26:06 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 00:26:06 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 00:26:06 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574727962.59', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45139', 'job_port': u'0'}
19/11/26 00:26:06 INFO statecache.__init__: Creating state cache with size 0
19/11/26 00:26:06 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45457.
19/11/26 00:26:06 INFO sdk_worker.__init__: Control channel established.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/26 00:26:06 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 00:26:06 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40439.
19/11/26 00:26:06 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 00:26:06 INFO data_plane.create_data_channel: Creating client data channel for localhost:44967
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 00:26:06 INFO sdk_worker.run: No more requests from control plane
19/11/26 00:26:06 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 00:26:06 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:06 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 00:26:06 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 00:26:06 INFO sdk_worker.run: Done consuming work.
19/11/26 00:26:06 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 00:26:06 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 00:26:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/26 00:26:07 INFO sdk_worker_main.main: Logging handler created.
19/11/26 00:26:07 INFO sdk_worker_main.start: Status HTTP server running at localhost:40403
19/11/26 00:26:07 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/26 00:26:07 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/26 00:26:07 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/26 00:26:07 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574727962.59', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:45139', 'job_port': u'0'}
19/11/26 00:26:07 INFO statecache.__init__: Creating state cache with size 0
19/11/26 00:26:07 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39789.
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/26 00:26:07 INFO sdk_worker.__init__: Control channel established.
19/11/26 00:26:07 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/26 00:26:07 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38997.
19/11/26 00:26:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/26 00:26:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:46621
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/26 00:26:07 INFO sdk_worker.run: No more requests from control plane
19/11/26 00:26:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/26 00:26:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:07 INFO data_plane.close: Closing all cached grpc data channels.
19/11/26 00:26:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/26 00:26:07 INFO sdk_worker.run: Done consuming work.
19/11/26 00:26:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/26 00:26:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/26 00:26:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/26 00:26:07 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574727962.59_29075974-d5df-44bd-bb30-d361466f63bf finished.
19/11/26 00:26:07 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/26 00:26:07 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_6b7e2412-94d3-4d5d-9142-ac83919c029a","basePath":"/tmp/sparktest2t2UBP"}: {}
java.io.FileNotFoundException: /tmp/sparktest2t2UBP/job_6b7e2412-94d3-4d5d-9142-ac83919c029a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139639831103232)>

# Thread: <Thread(Thread-120, started daemon 139639822710528)>

# Thread: <_MainThread(MainThread, started 139640618936064)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139639805138688)>

# Thread: <Thread(Thread-126, started daemon 139639813793536)>

# Thread: <_MainThread(MainThread, started 139640618936064)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574727953.16_5a97825c-4808-4c27-a4e7-1dbc56a65d7e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 313.508s

FAILED (errors=3, skipped=9)
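The `skipped=9` in the summary comes from the runner-specific test subclass skipping features the Spark portable runner does not yet support; failures such as test_sdf_with_watermark_tracking above (no registered bundle checkpoint handler) are candidates for the same treatment until SDF checkpointing is wired up. The pattern, sketched (class name and skip reason are illustrative, not the actual Beam test code):

```python
import unittest


class SparkRunnerTestSketch(unittest.TestCase):
    # Hypothetical sketch of how runner test suites mark known gaps:
    # override the inherited test and skip it with a reason, instead
    # of letting it fail with UnsupportedOperationException.
    def test_sdf_with_watermark_tracking(self):
        raise unittest.SkipTest(
            'Spark portable runner: no bundle checkpoint handler, '
            'so SDF watermark tracking cannot run yet.')


if __name__ == '__main__':
    unittest.main()
```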

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle'> line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 42s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/kebpftbuvad3a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1627

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1627/display/redirect?page=changes>

Changes:

[thw] [BEAM-8815] Skip manifest when no artifacts are staged


------------------------------------------
[...truncated 1.31 MB...]
19/11/25 22:38:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66 on Spark master local
19/11/25 22:38:37 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
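The coder warning above matters because the Spark runner groups elements by the encoded key bytes: a coder is only safe for grouping when equal keys always produce identical bytes ("consistent with equals"). A toy Python illustration (not Beam code) of how an inconsistent encoding splits one logical key across groups:

```python
def group_by_encoded_key(pairs, encode):
    # Group values by the *bytes* of the encoded key, the way a
    # runner that shuffles serialized keys effectively does.
    groups = {}
    for key, value in pairs:
        groups.setdefault(encode(key), []).append(value)
    return groups


# 1.0 == 1 in Python, so these two records share one logical key...
pairs = [(1.0, 'a'), (1, 'b')]

# ...but an encoding that is not consistent with equals splits them,
inconsistent = group_by_encoded_key(pairs, lambda k: repr(k).encode())
# while a canonical encoding keeps them in a single group.
consistent = group_by_encoded_key(pairs, lambda k: str(float(k)).encode())
```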
19/11/25 22:38:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66: Pipeline translated successfully. Computing outputs
19/11/25 22:38:37 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:37 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:37 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:37 INFO sdk_worker_main.start: Status HTTP server running at localhost:39679
19/11/25 22:38:37 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:37 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:37 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:37 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:37 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:37 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38589.
19/11/25 22:38:37 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/25 22:38:37 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:37 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:37 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34345.
19/11/25 22:38:37 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:37 INFO data_plane.create_data_channel: Creating client data channel for localhost:43641
19/11/25 22:38:37 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 22:38:38 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:38 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:38 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:38 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:38 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:38 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:38 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:38 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:38 INFO sdk_worker_main.start: Status HTTP server running at localhost:36277
19/11/25 22:38:38 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:38 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:38 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:38 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:38 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:38 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40909.
19/11/25 22:38:38 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 22:38:38 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:38 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46531.
19/11/25 22:38:38 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:38 INFO data_plane.create_data_channel: Creating client data channel for localhost:35217
19/11/25 22:38:38 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 22:38:39 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:39 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:39 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:39 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:39 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:39 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:39 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:39 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:39 INFO sdk_worker_main.start: Status HTTP server running at localhost:44277
19/11/25 22:38:39 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:39 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:39 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:39 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:39 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:39 INFO sdk_worker.__init__: Creating insecure control channel for localhost:44761.
19/11/25 22:38:39 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 22:38:39 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:39 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41277.
19/11/25 22:38:39 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:39 INFO data_plane.create_data_channel: Creating client data channel for localhost:43829
19/11/25 22:38:39 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 22:38:40 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:40 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:40 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:40 INFO sdk_worker_main.start: Status HTTP server running at localhost:45581
19/11/25 22:38:40 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:40 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:40 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:40 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:40 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:40 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33463.
19/11/25 22:38:40 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 22:38:40 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:40 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:45105.
19/11/25 22:38:40 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:40 INFO data_plane.create_data_channel: Creating client data channel for localhost:43837
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 22:38:40 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:40 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:40 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:40 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:40 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:40 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:40 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:40 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for __no_artifacts_staged__
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 22:38:41 INFO sdk_worker_main.main: Logging handler created.
19/11/25 22:38:41 INFO sdk_worker_main.start: Status HTTP server running at localhost:40571
19/11/25 22:38:41 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 22:38:41 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 22:38:41 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 22:38:41 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574721516.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:58283', 'job_port': u'0'}
19/11/25 22:38:41 INFO statecache.__init__: Creating state cache with size 0
19/11/25 22:38:41 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42003.
19/11/25 22:38:41 INFO sdk_worker.__init__: Control channel established.
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 22:38:41 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 22:38:41 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38081.
19/11/25 22:38:41 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 22:38:41 INFO data_plane.create_data_channel: Creating client data channel for localhost:44451
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 22:38:41 INFO sdk_worker.run: No more requests from control plane
19/11/25 22:38:41 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 22:38:41 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:41 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 22:38:41 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 22:38:41 INFO sdk_worker.run: Done consuming work.
19/11/25 22:38:41 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 22:38:41 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 22:38:42 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 22:38:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574721516.16_3f5025b7-1561-4649-9eb5-14b47768dd66 finished.
19/11/25 22:38:42 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 22:38:42 ERROR org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Failed to remove job staging directory for token {"sessionId":"job_c31878fc-d29b-4a84-977f-9a018813be9b","basePath":"/tmp/sparktestw3oExX"}: {}
java.io.FileNotFoundException: /tmp/sparktestw3oExX/job_c31878fc-d29b-4a84-977f-9a018813be9b/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:226)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:46)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:107)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:93)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py",> line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139913911588608)>

# Thread: <Thread(Thread-120, started daemon 139913919981312)>

# Thread: <_MainThread(MainThread, started 139914903885568)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574721506.25_bd164bee-224b-4c11-9b5e-d4aa5e22bdbe failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 306.313s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 59s
60 actionable tasks: 51 executed, 9 from cache

Publishing build scan...
https://gradle.com/s/rru2uo5j44wro

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1626

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1626/display/redirect?page=changes>

Changes:

[worldkzd] fix typos

[github] Update class_test.go

[worldkzd] keep 'a https'


------------------------------------------
[...truncated 1.34 MB...]
19/11/25 20:53:43 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:43 INFO data_plane.create_data_channel: Creating client data channel for localhost:37353
19/11/25 20:53:43 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:43 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:43 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:43 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:44 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:44 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:44 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:44 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:44 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST -> 0 artifacts
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 20:53:44 INFO sdk_worker_main.main: Logging handler created.
19/11/25 20:53:44 INFO sdk_worker_main.start: Status HTTP server running at localhost:41129
19/11/25 20:53:44 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 20:53:44 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 20:53:44 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 20:53:44 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574715221.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48315', 'job_port': u'0'}
19/11/25 20:53:44 INFO statecache.__init__: Creating state cache with size 0
19/11/25 20:53:44 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46269.
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 20:53:44 INFO sdk_worker.__init__: Control channel established.
19/11/25 20:53:44 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 20:53:44 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46493.
19/11/25 20:53:44 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:44 INFO data_plane.create_data_channel: Creating client data channel for localhost:46561
19/11/25 20:53:44 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:45 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:45 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:45 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:45 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:45 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:45 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:45 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:45 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:45 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST
19/11/25 20:53:45 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST -> 0 artifacts
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 20:53:46 INFO sdk_worker_main.main: Logging handler created.
19/11/25 20:53:46 INFO sdk_worker_main.start: Status HTTP server running at localhost:45085
19/11/25 20:53:46 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 20:53:46 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 20:53:46 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 20:53:46 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574715221.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48315', 'job_port': u'0'}
19/11/25 20:53:46 INFO statecache.__init__: Creating state cache with size 0
19/11/25 20:53:46 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34715.
19/11/25 20:53:46 INFO sdk_worker.__init__: Control channel established.
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 20:53:46 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 20:53:46 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46687.
19/11/25 20:53:46 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:46 INFO data_plane.create_data_channel: Creating client data channel for localhost:40203
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:46 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:46 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:46 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:46 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:46 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:46 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:46 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST -> 0 artifacts
19/11/25 20:53:46 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 20:53:47 INFO sdk_worker_main.main: Logging handler created.
19/11/25 20:53:47 INFO sdk_worker_main.start: Status HTTP server running at localhost:38943
19/11/25 20:53:47 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 20:53:47 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 20:53:47 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 20:53:47 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574715221.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48315', 'job_port': u'0'}
19/11/25 20:53:47 INFO statecache.__init__: Creating state cache with size 0
19/11/25 20:53:47 INFO sdk_worker.__init__: Creating insecure control channel for localhost:36843.
19/11/25 20:53:47 INFO sdk_worker.__init__: Control channel established.
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 20:53:47 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 20:53:47 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44809.
19/11/25 20:53:47 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:47 INFO data_plane.create_data_channel: Creating client data channel for localhost:37811
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:47 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:47 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:47 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:47 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:47 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:47 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:47 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST
19/11/25 20:53:47 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST -> 0 artifacts
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 20:53:48 INFO sdk_worker_main.main: Logging handler created.
19/11/25 20:53:48 INFO sdk_worker_main.start: Status HTTP server running at localhost:36555
19/11/25 20:53:48 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 20:53:48 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 20:53:48 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 20:53:48 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574715221.49', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:48315', 'job_port': u'0'}
19/11/25 20:53:48 INFO statecache.__init__: Creating state cache with size 0
19/11/25 20:53:48 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41469.
19/11/25 20:53:48 INFO sdk_worker.__init__: Control channel established.
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 20:53:48 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 20:53:48 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36009.
19/11/25 20:53:48 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 20:53:48 INFO data_plane.create_data_channel: Creating client data channel for localhost:42787
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 20:53:48 INFO sdk_worker.run: No more requests from control plane
19/11/25 20:53:48 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 20:53:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:48 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 20:53:48 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 20:53:48 INFO sdk_worker.run: Done consuming work.
19/11/25 20:53:48 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 20:53:48 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 20:53:48 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574715221.49_a5254e23-9906-4e43-9525-49ef97636999 finished.
19/11/25 20:53:48 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/MANIFEST has 0 artifact locations
19/11/25 20:53:48 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest3ZGYBH/job_fb8f0e8b-0b65-43b2-a654-35f9f4899f7b/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140190636517120)>

# Thread: <Thread(Thread-120, started daemon 140190907688704)>

# Thread: <_MainThread(MainThread, started 140191423080192)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140190139016960)>

# Thread: <Thread(Thread-126, started daemon 140190147671808)>

# Thread: <_MainThread(MainThread, started 140191423080192)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574715211.25_5fe56ce4-06a8-4c15-af24-c39004005585 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 313.578s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 6s
60 actionable tasks: 49 executed, 11 from cache

Publishing build scan...
https://gradle.com/s/5x7h2f6dswfhs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1625

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1625/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8575] Added a unit test that Reshuffle preserves timestamps


------------------------------------------
[...truncated 1.34 MB...]
19/11/25 19:27:07 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:07 INFO data_plane.create_data_channel: Creating client data channel for localhost:43261
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 19:27:07 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:07 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:07 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:07 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:07 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:07 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:07 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST
19/11/25 19:27:07 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST -> 0 artifacts
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 19:27:08 INFO sdk_worker_main.main: Logging handler created.
19/11/25 19:27:08 INFO sdk_worker_main.start: Status HTTP server running at localhost:39387
19/11/25 19:27:08 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 19:27:08 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 19:27:08 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 19:27:08 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574710026.02', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43289', 'job_port': u'0'}
19/11/25 19:27:08 INFO statecache.__init__: Creating state cache with size 0
19/11/25 19:27:08 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45087.
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 19:27:08 INFO sdk_worker.__init__: Control channel established.
19/11/25 19:27:08 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 19:27:08 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41533.
19/11/25 19:27:08 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:08 INFO data_plane.create_data_channel: Creating client data channel for localhost:40457
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 19:27:08 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:08 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:08 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:08 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:08 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:08 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:08 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST
19/11/25 19:27:08 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST -> 0 artifacts
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 19:27:09 INFO sdk_worker_main.main: Logging handler created.
19/11/25 19:27:09 INFO sdk_worker_main.start: Status HTTP server running at localhost:45817
19/11/25 19:27:09 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 19:27:09 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 19:27:09 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 19:27:09 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574710026.02', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43289', 'job_port': u'0'}
19/11/25 19:27:09 INFO statecache.__init__: Creating state cache with size 0
19/11/25 19:27:09 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43743.
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 19:27:09 INFO sdk_worker.__init__: Control channel established.
19/11/25 19:27:09 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 19:27:09 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44775.
19/11/25 19:27:09 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:09 INFO data_plane.create_data_channel: Creating client data channel for localhost:40719
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 19:27:09 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:09 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:09 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:09 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:09 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:09 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:09 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST
19/11/25 19:27:09 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST -> 0 artifacts
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 19:27:10 INFO sdk_worker_main.main: Logging handler created.
19/11/25 19:27:10 INFO sdk_worker_main.start: Status HTTP server running at localhost:45001
19/11/25 19:27:10 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 19:27:10 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 19:27:10 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 19:27:10 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574710026.02', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43289', 'job_port': u'0'}
19/11/25 19:27:10 INFO statecache.__init__: Creating state cache with size 0
19/11/25 19:27:10 INFO sdk_worker.__init__: Creating insecure control channel for localhost:39953.
19/11/25 19:27:10 INFO sdk_worker.__init__: Control channel established.
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 19:27:10 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 19:27:10 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40245.
19/11/25 19:27:10 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:10 INFO data_plane.create_data_channel: Creating client data channel for localhost:46551
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 19:27:10 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:10 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:10 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:10 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:10 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:10 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:10 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST
19/11/25 19:27:10 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST -> 0 artifacts
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 19:27:11 INFO sdk_worker_main.main: Logging handler created.
19/11/25 19:27:11 INFO sdk_worker_main.start: Status HTTP server running at localhost:40923
19/11/25 19:27:11 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 19:27:11 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 19:27:11 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 19:27:11 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574710026.02', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:43289', 'job_port': u'0'}
19/11/25 19:27:11 INFO statecache.__init__: Creating state cache with size 0
19/11/25 19:27:11 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43769.
19/11/25 19:27:11 INFO sdk_worker.__init__: Control channel established.
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 19:27:11 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 19:27:11 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41647.
19/11/25 19:27:11 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 19:27:11 INFO data_plane.create_data_channel: Creating client data channel for localhost:40895
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 19:27:11 INFO sdk_worker.run: No more requests from control plane
19/11/25 19:27:11 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 19:27:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:11 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 19:27:11 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 19:27:11 INFO sdk_worker.run: Done consuming work.
19/11/25 19:27:11 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 19:27:11 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 19:27:11 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574710026.02_bb0c8209-df8e-4caa-a8f0-dbcd2163b6b1 finished.
19/11/25 19:27:11 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/MANIFEST has 0 artifact locations
19/11/25 19:27:11 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktest5DeGYq/job_06365812-f64b-4977-951a-7f95ddd36557/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
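The "Timed out after 60 seconds." errors above (and the "# Thread: ..." stack dumps further down) come from a watchdog in portable_runner_test.py that raises BaseException out of a signal handler so a hung wait_until_finish() cannot stall the suite. A minimal sketch of such a guard, assuming a Unix SIGALRM is available; the helper names install_timeout/cancel_timeout are hypothetical, not Beam's actual API:

```python
import signal
import sys
import threading
import traceback

def install_timeout(seconds=60):
    # On SIGALRM, dump every live thread's stack (this produces the
    # "# Thread: <...>" lines seen in the log) and then raise.
    def handler(signum, frame):
        msg = 'Timed out after %d seconds.' % seconds
        print('==================== %s ====================' % msg)
        frames = sys._current_frames()
        for t in threading.enumerate():
            print('\n# Thread: %s' % t)
            traceback.print_stack(frames.get(t.ident))
        # BaseException deliberately escapes broad `except Exception:` blocks.
        raise BaseException(msg)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)

def cancel_timeout():
    # Disarm the pending alarm once the test finishes in time.
    signal.alarm(0)
```

A test that finishes before the alarm fires calls cancel_timeout() and is unaffected; one that blocks (e.g. iterating a gRPC state stream forever) is interrupted mid-wait, which is why the tracebacks above end inside grpc/_common.py and threading.py.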

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139978944124672)>

# Thread: <Thread(Thread-118, started daemon 139978935731968)>

# Thread: <_MainThread(MainThread, started 139979723355904)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 139978909505280)>
# Thread: <Thread(Thread-124, started daemon 139978918160128)>

# Thread: <_MainThread(MainThread, started 139979723355904)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574710016.39_c5e4dd49-5c46-48de-a5f2-dba53c6ab232 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 301.054s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 19s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/ywtai6yipuhzo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1624

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1624/display/redirect?page=changes>

Changes:

[pabloem] [BEAM-876] Support schemaUpdateOption in BigQueryIO (#9524)


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:28:56 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:28:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:28:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:56 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:28:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:28:56 INFO sdk_worker.run: Done consuming work.
19/11/25 18:28:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:28:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:28:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST
19/11/25 18:28:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST -> 0 artifacts
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:28:57 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:28:57 INFO sdk_worker_main.start: Status HTTP server running at localhost:39573
19/11/25 18:28:57 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:28:57 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:28:57 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:28:57 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574706534.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36165', 'job_port': u'0'}
19/11/25 18:28:57 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:28:57 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42527.
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 18:28:57 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:28:57 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:28:57 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46191.
19/11/25 18:28:57 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:28:57 INFO data_plane.create_data_channel: Creating client data channel for localhost:44641
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:28:57 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:28:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:28:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:57 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:28:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:28:57 INFO sdk_worker.run: Done consuming work.
19/11/25 18:28:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:28:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST
19/11/25 18:28:57 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST -> 0 artifacts
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:28:58 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:28:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:41997
19/11/25 18:28:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:28:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:28:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:28:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574706534.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36165', 'job_port': u'0'}
19/11/25 18:28:58 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:28:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35043.
19/11/25 18:28:58 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 18:28:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:28:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38185.
19/11/25 18:28:58 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:28:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:45325
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:28:58 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:28:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:28:58 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:28:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:28:58 INFO sdk_worker.run: Done consuming work.
19/11/25 18:28:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:28:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST
19/11/25 18:28:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST -> 0 artifacts
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:28:59 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:28:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:46085
19/11/25 18:28:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:28:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:28:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:28:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574706534.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}',> 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36165', 'job_port': u'0'}
19/11/25 18:28:59 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:28:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:42507.
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 18:28:59 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:28:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:28:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39541.
19/11/25 18:28:59 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:28:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:45217
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh">

19/11/25 18:28:59 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:28:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:28:59 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:28:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:28:59 INFO sdk_worker.run: Done consuming work.
19/11/25 18:28:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:28:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST
19/11/25 18:28:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST -> 0 artifacts
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:29:00 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:29:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:41575
19/11/25 18:29:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:29:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:29:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:29:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574706534.31', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:36165', 'job_port': u'0'}
19/11/25 18:29:00 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:29:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40795.
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 18:29:00 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:29:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:29:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:39147.
19/11/25 18:29:00 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:29:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:36415
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:29:00 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:29:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:29:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:29:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:29:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:29:00 INFO sdk_worker.run: Done consuming work.
19/11/25 18:29:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:29:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:29:00 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574706534.31_06e05ea0-b99e-415a-9c52-4c48deec4c71 finished.
19/11/25 18:29:00 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/MANIFEST has 0 artifact locations
19/11/25 18:29:00 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestikWkB3/job_c47c43d1-9598-4280-860e-c65c5768047e/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.
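The `BaseException: Timed out after 60 seconds.` raised from `handler` in `portable_runner_test.py`, and the `# Thread: <...>` lines scattered through the tracebacks below, both come from a per-test timeout watchdog. A minimal sketch of that pattern, assuming a SIGALRM-based handler (the names `install_timeout` and the details here are illustrative, not the actual Beam test helper): on timeout it lists the live threads and raises `BaseException` rather than `Exception`, so a broad `except Exception` in the hung code cannot swallow the timeout.

```python
import signal
import threading
import time


def install_timeout(seconds, msg):
    """Hypothetical sketch of a test-timeout watchdog (Unix only)."""
    def handler(signum, frame):
        # Dump every live thread, like the "# Thread: <...>" lines in the log.
        for t in threading.enumerate():
            print('# Thread: %r' % t)
        # Raise BaseException, not Exception, so ordinary `except Exception`
        # clauses in the code under test cannot swallow the timeout.
        raise BaseException(msg)
    signal.signal(signal.SIGALRM, handler)
    signal.alarm(seconds)


if __name__ == '__main__':
    try:
        install_timeout(1, 'Timed out after 1 second.')
        time.sleep(5)  # stands in for a hung wait_until_finish()
    except BaseException as e:
        signal.alarm(0)  # cancel the pending alarm
        print('caught: %s' % e)
```

Because the handler runs in the main thread, the exception propagates out of whatever blocking call the test was stuck in (here `time.sleep`; in the log, the gRPC state-stream wait), which is why the traceback above bottoms out in `threading.py` and `grpc/_common.py`.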

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140261138609920)>

# Thread: <Thread(Thread-117, started daemon 140261130217216)>

# Thread: <_MainThread(MainThread, started 140261920012032)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140261105039104)>

# Thread: <Thread(Thread-123, started daemon 140261113431808)>

# Thread: <Thread(Thread-117, started daemon 140261130217216)>

# Thread: <_MainThread(MainThread, started 140261920012032)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574706523.59_34fd9717-f676-40a1-a3e0-77e094720053 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

# Thread: <Thread(wait_until_finish_read, started daemon 140261138609920)>

----------------------------------------------------------------------
Ran 38 tests in 338.116s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 24s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/akdxcgncuten2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1623

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1623/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:16:20 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:20 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST -> 0 artifacts
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:16:20 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:16:20 INFO sdk_worker_main.start: Status HTTP server running at localhost:45193
19/11/25 18:16:20 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:16:20 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:16:20 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:16:20 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574705778.15', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46059', 'job_port': u'0'}
19/11/25 18:16:20 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:16:20 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40919.
19/11/25 18:16:20 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 18:16:20 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:16:20 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38741.
19/11/25 18:16:20 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:16:20 INFO data_plane.create_data_channel: Creating client data channel for localhost:36275
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:16:20 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:20 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:20 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:20 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:20 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:20 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:20 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:20 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST -> 0 artifacts
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:16:21 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:16:21 INFO sdk_worker_main.start: Status HTTP server running at localhost:40223
19/11/25 18:16:21 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:16:21 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:16:21 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:16:21 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574705778.15', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46059', 'job_port': u'0'}
19/11/25 18:16:21 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:16:21 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34533.
19/11/25 18:16:21 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 18:16:21 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:16:21 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36123.
19/11/25 18:16:21 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:16:21 INFO data_plane.create_data_channel: Creating client data channel for localhost:32887
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:16:21 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:21 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:21 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:21 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:21 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:21 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:21 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:21 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST -> 0 artifacts
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:16:22 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:16:22 INFO sdk_worker_main.start: Status HTTP server running at localhost:40559
19/11/25 18:16:22 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:16:22 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:16:22 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:16:22 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574705778.15', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46059', 'job_port': u'0'}
19/11/25 18:16:22 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:16:22 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46197.
19/11/25 18:16:22 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:16:22 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 18:16:22 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:43217.
19/11/25 18:16:22 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:16:22 INFO data_plane.create_data_channel: Creating client data channel for localhost:40773
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:16:22 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:22 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:22 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:22 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:22 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:22 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:22 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST
19/11/25 18:16:22 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST -> 0 artifacts
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:16:23 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:16:23 INFO sdk_worker_main.start: Status HTTP server running at localhost:35041
19/11/25 18:16:23 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:16:23 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:16:23 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:16:23 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574705778.15', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:46059', 'job_port': u'0'}
19/11/25 18:16:23 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:16:23 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35799.
19/11/25 18:16:23 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 18:16:23 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:16:23 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44101.
19/11/25 18:16:23 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:16:23 INFO data_plane.create_data_channel: Creating client data channel for localhost:38201
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:16:23 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:16:23 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:16:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:23 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:16:23 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:16:23 INFO sdk_worker.run: Done consuming work.
19/11/25 18:16:23 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:16:23 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:16:23 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574705778.15_153f640c-4c42-4b26-84ac-6e62a59e5b50 finished.
19/11/25 18:16:23 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/MANIFEST has 0 artifact locations
19/11/25 18:16:23 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestBZ3zK_/job_694de2b4-d1be-423b-b717-7bd965e47230/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140318498010880)>

# Thread: <Thread(Thread-117, started daemon 140318489618176)>

# Thread: <_MainThread(MainThread, started 140319279621888)>

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140317856626432)>

# Thread: <Thread(Thread-123, started daemon 140318472832768)>

# Thread: <Thread(Thread-117, started daemon 140318489618176)>

# Thread: <_MainThread(MainThread, started 140319279621888)>

# Thread: <Thread(wait_until_finish_read, started daemon 140318498010880)>
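
The "# Thread:" dump lines interleaved through this traceback come from the suite's 60-second timeout handler (portable_runner_test.py, line 75 in the frames above), which prints every live thread before raising BaseException so the hang site is visible in the log. A minimal stand-alone sketch of that pattern, using only the standard library; run_with_timeout and its watchdog-thread approach are illustrative assumptions, not Beam's actual helper:

```python
import threading

def run_with_timeout(fn, timeout_s):
    """Run fn(); on timeout, dump live threads and raise, like the test handler."""
    result = {}

    def target():
        result['value'] = fn()

    worker = threading.Thread(target=target)
    worker.daemon = True  # don't keep the process alive if fn hangs
    worker.start()
    worker.join(timeout_s)
    if worker.is_alive():
        msg = 'Timed out after %s seconds.' % timeout_s
        print('==================== %s ====================' % msg)
        for thread in threading.enumerate():
            # Mirrors the "# Thread: <...>" lines seen in the log.
            print('# Thread: %r' % thread)
        raise BaseException(msg)
    return result['value']
```

With a fast callable the result is returned normally; with a hanging one the thread dump is printed and BaseException propagates, which matches the shape of the failures above.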

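The grpc frames (wait, then _wait_once, then threading.Condition.wait) show where the main thread is parked: grpc's _common.wait loops on a condition variable with a short per-iteration timeout so the interpreter stays responsive while waiting for a state response that never arrives. A stdlib-only sketch of that spin-wait shape; the names mirror the traceback for readability, but this is not grpc's actual source:

```python
import threading

MAXIMUM_WAIT_TIMEOUT = 0.1  # short wake-ups keep the waiting thread responsive

def wait(wait_fn, ready_fn):
    """Block until ready_fn() is true, waking every MAXIMUM_WAIT_TIMEOUT."""
    while not ready_fn():
        wait_fn(timeout=MAXIMUM_WAIT_TIMEOUT)

condition = threading.Condition()
state = {'response_ready': False}

def producer():
    # Simulates the job server finally delivering a state response.
    with condition:
        state['response_ready'] = True
        condition.notify_all()

worker = threading.Thread(target=producer)
with condition:
    worker.start()
    wait(condition.wait, lambda: state['response_ready'])
worker.join()
```

In the failing tests the producer side never delivers, so this loop spins until the suite's timeout handler fires.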
======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574705766.73_cce27912-03b8-4bf9-838f-6754d487ee92 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.
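
This failure is different from the timeouts: test_sdf_with_watermark_tracking exercises a splittable DoFn, and watermark tracking requires the runner to accept checkpoints, i.e. to split an in-progress restriction into a finished part and a residual to resume later. The portable Spark runner's ActiveBundle had no checkpoint handler registered, hence the UnsupportedOperationException. A toy illustration of the tracker side of that contract; the class and method names here are illustrative only, not Beam's actual API:

```python
class OffsetRangeTracker:
    """Toy restriction tracker showing try_claim / checkpoint semantics."""

    def __init__(self, start, stop):
        self.start = start
        self.stop = stop
        self.last_claimed = None

    def try_claim(self, position):
        # The DoFn may only process a position it has successfully claimed.
        if position >= self.stop:
            return False
        self.last_claimed = position
        return True

    def checkpoint(self):
        # Split off the unprocessed remainder so the runner can resume it
        # later; a runner without a checkpoint handler cannot accept this.
        split_at = self.stop if self.last_claimed is None else self.last_claimed + 1
        residual = (split_at, self.stop)
        self.stop = split_at
        return residual
```

After a checkpoint, the tracker's range shrinks to what was already claimed and the residual range is handed back to the runner for rescheduling.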

----------------------------------------------------------------------
Ran 38 tests in 311.314s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 1s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/4ux44a3gk6sy6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1622

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1622/display/redirect?page=changes>

Changes:

[kirillkozlov] Fix MongoDb SQL Integration Tests

[kirillkozlov] Add MongoDbIT back to build file

[kirillkozlov] Update JavaDoc comment and remove pipeline options


------------------------------------------
[...truncated 1.33 MB...]
19/11/25 18:02:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1574704965.96_86f8fc35-b902-475f-b232-756f5ca4d93f finished.
19/11/25 18:02:51 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 18:02:51 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestn5Xgq_/job_75e9ec83-f03c-4c4a-8c2c-30f486aafb98/MANIFEST has 0 artifact locations
19/11/25 18:02:51 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestn5Xgq_/job_75e9ec83-f03c-4c4a-8c2c-30f486aafb98/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f77bda41230> ====================
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8
19/11/25 18:02:52 INFO org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job invocation test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 1 files. (Enable logging at DEBUG level to see which files will be staged.)
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8 on Spark master local
19/11/25 18:02:52 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
19/11/25 18:02:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8: Pipeline translated successfully. Computing outputs
19/11/25 18:02:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST has 0 artifact locations
19/11/25 18:02:52 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:53 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:53 INFO sdk_worker_main.start: Status HTTP server running at localhost:40841
19/11/25 18:02:53 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:53 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:53 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:53 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:53 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:02:53 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40113.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 261-1
19/11/25 18:02:53 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:53 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:53 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:34777.
19/11/25 18:02:53 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:53 INFO data_plane.create_data_channel: Creating client data channel for localhost:36689
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:02:53 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:53 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:53 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:53 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:53 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:53 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:53 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:53 INFO sdk_worker_main.start: Status HTTP server running at localhost:41363
19/11/25 18:02:53 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:53 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:53 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:53 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:53 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:02:53 INFO sdk_worker.__init__: Creating insecure control channel for localhost:34201.
19/11/25 18:02:53 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:53 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 18:02:53 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44899.
19/11/25 18:02:53 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:53 INFO data_plane.create_data_channel: Creating client data channel for localhost:42715
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:02:53 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:53 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:53 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:53 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:53 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:53 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:53 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:53 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:54 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:54 INFO sdk_worker_main.start: Status HTTP server running at localhost:35859
19/11/25 18:02:54 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:54 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:54 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:54 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:54 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:02:54 INFO sdk_worker.__init__: Creating insecure control channel for localhost:40877.
19/11/25 18:02:54 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:54 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 18:02:54 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:41493.
19/11/25 18:02:54 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:54 INFO data_plane.create_data_channel: Creating client data channel for localhost:40425
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:02:54 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:54 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:54 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:54 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:54 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:54 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:54 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:54 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:55 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:55 INFO sdk_worker_main.start: Status HTTP server running at localhost:40325
19/11/25 18:02:55 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:55 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:55 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:55 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:55 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:02:55 INFO sdk_worker.__init__: Creating insecure control channel for localhost:43215.
19/11/25 18:02:55 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:55 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 18:02:55 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:36697.
19/11/25 18:02:55 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:55 INFO data_plane.create_data_channel: Creating client data channel for localhost:46427
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:02:55 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:55 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:55 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:55 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:55 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:55 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:55 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST
19/11/25 18:02:55 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST -> 0 artifacts
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 18:02:56 INFO sdk_worker_main.main: Logging handler created.
19/11/25 18:02:56 INFO sdk_worker_main.start: Status HTTP server running at localhost:46675
19/11/25 18:02:56 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 18:02:56 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 18:02:56 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 18:02:56 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574704971.37', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:33413', 'job_port': u'0'}
19/11/25 18:02:56 INFO statecache.__init__: Creating state cache with size 0
19/11/25 18:02:56 INFO sdk_worker.__init__: Creating insecure control channel for localhost:46777.
19/11/25 18:02:56 INFO sdk_worker.__init__: Control channel established.
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 18:02:56 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 18:02:56 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44623.
19/11/25 18:02:56 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 18:02:56 INFO data_plane.create_data_channel: Creating client data channel for localhost:41611
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 18:02:56 INFO sdk_worker.run: No more requests from control plane
19/11/25 18:02:56 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 18:02:56 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 18:02:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:56 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 18:02:56 INFO sdk_worker.run: Done consuming work.
19/11/25 18:02:56 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 18:02:56 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 18:02:56 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574704971.37_eb845bb5-00c1-4c5b-906b-006f0a3cc0e8 finished.
19/11/25 18:02:56 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/MANIFEST has 0 artifact locations
19/11/25 18:02:56 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestn5Xgq_/job_72db3842-f758-4c7a-a5e6-0639753c0db6/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================

# Thread: <Thread(wait_until_finish_read, started daemon 140151745402624)>

# Thread: <Thread(Thread-117, started daemon 140151737009920)>

# Thread: <_MainThread(MainThread, started 140152740226816)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574704962.48_5d4d7c3e-4fdb-41cb-96b3-148f2977583e failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 285.147s

FAILED (errors=2, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/vnndw2fs3wdhg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1621

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1621/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 12:10:57 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:10:57 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:10:57 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:57 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:10:57 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:10:57 INFO sdk_worker.run: Done consuming work.
19/11/25 12:10:57 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:10:57 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:10:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST -> 0 artifacts
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 12:10:58 INFO sdk_worker_main.main: Logging handler created.
19/11/25 12:10:58 INFO sdk_worker_main.start: Status HTTP server running at localhost:33809
19/11/25 12:10:58 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 12:10:58 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 12:10:58 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 12:10:58 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574683856.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35875', 'job_port': u'0'}
19/11/25 12:10:58 INFO statecache.__init__: Creating state cache with size 0
19/11/25 12:10:58 INFO sdk_worker.__init__: Creating insecure control channel for localhost:41607.
19/11/25 12:10:58 INFO sdk_worker.__init__: Control channel established.
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 12:10:58 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 12:10:58 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46043.
19/11/25 12:10:58 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 12:10:58 INFO data_plane.create_data_channel: Creating client data channel for localhost:44049
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 12:10:58 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:10:58 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:10:58 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:58 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:10:58 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:10:58 INFO sdk_worker.run: Done consuming work.
19/11/25 12:10:58 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:10:58 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:10:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST -> 0 artifacts
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 12:10:59 INFO sdk_worker_main.main: Logging handler created.
19/11/25 12:10:59 INFO sdk_worker_main.start: Status HTTP server running at localhost:39955
19/11/25 12:10:59 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 12:10:59 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 12:10:59 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 12:10:59 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574683856.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35875', 'job_port': u'0'}
19/11/25 12:10:59 INFO statecache.__init__: Creating state cache with size 0
19/11/25 12:10:59 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45485.
19/11/25 12:10:59 INFO sdk_worker.__init__: Control channel established.
19/11/25 12:10:59 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 12:10:59 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42405.
19/11/25 12:10:59 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 12:10:59 INFO data_plane.create_data_channel: Creating client data channel for localhost:42259
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 12:10:59 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:10:59 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:10:59 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:10:59 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:10:59 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:10:59 INFO sdk_worker.run: Done consuming work.
19/11/25 12:10:59 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:10:59 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:11:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST -> 0 artifacts
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 12:11:00 INFO sdk_worker_main.main: Logging handler created.
19/11/25 12:11:00 INFO sdk_worker_main.start: Status HTTP server running at localhost:36559
19/11/25 12:11:00 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 12:11:00 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 12:11:00 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 12:11:00 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574683856.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35875', 'job_port': u'0'}
19/11/25 12:11:00 INFO statecache.__init__: Creating state cache with size 0
19/11/25 12:11:00 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33507.
19/11/25 12:11:00 INFO sdk_worker.__init__: Control channel established.
19/11/25 12:11:00 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 12:11:00 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:42291.
19/11/25 12:11:00 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 12:11:00 INFO data_plane.create_data_channel: Creating client data channel for localhost:44873
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 12:11:00 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:11:00 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:11:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:00 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:11:00 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:11:00 INFO sdk_worker.run: Done consuming work.
19/11/25 12:11:00 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:11:00 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:11:00 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST -> 0 artifacts
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 12:11:01 INFO sdk_worker_main.main: Logging handler created.
19/11/25 12:11:01 INFO sdk_worker_main.start: Status HTTP server running at localhost:35669
19/11/25 12:11:01 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 12:11:01 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 12:11:01 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 12:11:01 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574683856.16', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:35875', 'job_port': u'0'}
19/11/25 12:11:01 INFO statecache.__init__: Creating state cache with size 0
19/11/25 12:11:01 INFO sdk_worker.__init__: Creating insecure control channel for localhost:37843.
19/11/25 12:11:01 INFO sdk_worker.__init__: Control channel established.
19/11/25 12:11:01 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 12:11:01 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:46211.
19/11/25 12:11:01 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 12:11:01 INFO data_plane.create_data_channel: Creating client data channel for localhost:38813
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 12:11:01 INFO sdk_worker.run: No more requests from control plane
19/11/25 12:11:01 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 12:11:01 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 12:11:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:01 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 12:11:01 INFO sdk_worker.run: Done consuming work.
19/11/25 12:11:01 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 12:11:01 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 12:11:01 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574683856.16_6a8c438d-2aa7-4dea-b5b6-5c3dc9251dcd finished.
19/11/25 12:11:01 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/MANIFEST has 0 artifact locations
19/11/25 12:11:01 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestzb50d6/job_ef50b933-2ad7-4339-91db-ef785b85da23/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
==================== Timed out after 60 seconds. ====================
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler

# Thread: <Thread(wait_until_finish_read, started daemon 139803754198784)>

    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
# Thread: <Thread(Thread-120, started daemon 139803745806080)>

ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
# Thread: <_MainThread(MainThread, started 139804549363456)>
==================== Timed out after 60 seconds. ====================

  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
# Thread: <Thread(wait_until_finish_read, started daemon 139803257464576)>

  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
# Thread: <Thread(Thread-124, started daemon 139803265857280)>

  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 140, in wait
# Thread: <_MainThread(MainThread, started 139804549363456)>

# Thread: <Thread(wait_until_finish_read, started daemon 139803754198784)>

    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py>", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
# Thread: <Thread(Thread-120, started daemon 139803745806080)>
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574683844.64_ab82ddb2-947f-4863-9a8b-3c188b83bed6 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 324.524s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 9s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/on2lff5fisjbs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #1620

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/1620/display/redirect>

Changes:


------------------------------------------
[...truncated 1.34 MB...]
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 06:16:26 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:26 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:26 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:26 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:26 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:26 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:26 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:26 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST -> 0 artifacts
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 06:16:27 INFO sdk_worker_main.main: Logging handler created.
19/11/25 06:16:27 INFO sdk_worker_main.start: Status HTTP server running at localhost:45403
19/11/25 06:16:27 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 06:16:27 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 06:16:27 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 06:16:27 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574662584.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52089', 'job_port': u'0'}
19/11/25 06:16:27 INFO statecache.__init__: Creating state cache with size 0
19/11/25 06:16:27 INFO sdk_worker.__init__: Creating insecure control channel for localhost:33469.
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 262-1
19/11/25 06:16:27 INFO sdk_worker.__init__: Control channel established.
19/11/25 06:16:27 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 06:16:27 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:33503.
19/11/25 06:16:27 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 06:16:27 INFO data_plane.create_data_channel: Creating client data channel for localhost:33411
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 06:16:27 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:27 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:27 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:27 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:27 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:27 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:27 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:27 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:28 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST -> 0 artifacts
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 06:16:28 INFO sdk_worker_main.main: Logging handler created.
19/11/25 06:16:28 INFO sdk_worker_main.start: Status HTTP server running at localhost:37509
19/11/25 06:16:28 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 06:16:28 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 06:16:28 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 06:16:28 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574662584.74', 'environment_config': u'{"command": "<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52089', 'job_port': u'0'}
19/11/25 06:16:28 INFO statecache.__init__: Creating state cache with size 0
19/11/25 06:16:28 INFO sdk_worker.__init__: Creating insecure control channel for localhost:45343.
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 263-1
19/11/25 06:16:28 INFO sdk_worker.__init__: Control channel established.
19/11/25 06:16:28 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 06:16:28 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:38331.
19/11/25 06:16:28 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 06:16:28 INFO data_plane.create_data_channel: Creating client data channel for localhost:43569
19/11/25 06:16:28 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001<https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh>"

19/11/25 06:16:29 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:29 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:29 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:29 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:29 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:29 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:29 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST -> 0 artifacts
19/11/25 06:16:29 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 06:16:30 INFO sdk_worker_main.main: Logging handler created.
19/11/25 06:16:30 INFO sdk_worker_main.start: Status HTTP server running at localhost:43721
19/11/25 06:16:30 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 06:16:30 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 06:16:30 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 06:16:30 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574662584.74', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52089', 'job_port': u'0'}
19/11/25 06:16:30 INFO statecache.__init__: Creating state cache with size 0
19/11/25 06:16:30 INFO sdk_worker.__init__: Creating insecure control channel for localhost:38623.
19/11/25 06:16:30 INFO sdk_worker.__init__: Control channel established.
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 264-1
19/11/25 06:16:30 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 06:16:30 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:40411.
19/11/25 06:16:30 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 06:16:30 INFO data_plane.create_data_channel: Creating client data channel for localhost:38903
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 06:16:30 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:30 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:30 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:30 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:30 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:30 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:30 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST
19/11/25 06:16:30 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: GetManifest for /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST -> 0 artifacts
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Beam Fn Logging client connected.
19/11/25 06:16:31 INFO sdk_worker_main.main: Logging handler created.
19/11/25 06:16:31 INFO sdk_worker_main.start: Status HTTP server running at localhost:43027
19/11/25 06:16:31 INFO sdk_worker_main.main: semi_persistent_directory: /tmp
19/11/25 06:16:31 WARN sdk_worker_main._load_main_session: No session file found: /tmp/staged/pickled_main_session. Functions defined in __main__ (interactive session) may fail. 
19/11/25 06:16:31 WARN pipeline_options.get_all_options: Discarding unparseable args: [u'--app_name=test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de', u'--job_server_timeout=60', u'--pipeline_type_check', u'--direct_runner_use_stacked_bundle', u'--spark_master=local', u'--options_id=30', u'--enable_spark_metric_sinks'] 
19/11/25 06:16:31 INFO sdk_worker_main.main: Python sdk harness started with pipeline_options: {'runner': u'None', 'experiments': [u'beam_fn_api'], 'environment_cache_millis': u'0', 'artifact_port': u'0', 'environment_type': u'PROCESS', 'sdk_location': u'container', 'job_name': u'test_windowing_1574662584.74', 'environment_config': u'{"command": "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"}', 'expansion_port': u'0', 'sdk_worker_parallelism': u'1', 'job_endpoint': u'localhost:52089', 'job_port': u'0'}
19/11/25 06:16:31 INFO statecache.__init__: Creating state cache with size 0
19/11/25 06:16:31 INFO sdk_worker.__init__: Creating insecure control channel for localhost:35533.
19/11/25 06:16:31 INFO sdk_worker.__init__: Control channel established.
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 265-1
19/11/25 06:16:31 INFO sdk_worker.__init__: Initializing SDKHarness with unbounded number of workers.
19/11/25 06:16:31 INFO sdk_worker.create_state_handler: Creating insecure state channel for localhost:44111.
19/11/25 06:16:31 INFO sdk_worker.create_state_handler: State channel established.
19/11/25 06:16:31 INFO data_plane.create_data_channel: Creating client data channel for localhost:42837
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:process:v1"
payload: "\032\202\001https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build/sdk_worker.sh"

19/11/25 06:16:31 INFO sdk_worker.run: No more requests from control plane
19/11/25 06:16:31 INFO sdk_worker.run: SDK Harness waiting for in-flight requests to complete
19/11/25 06:16:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:31 INFO data_plane.close: Closing all cached grpc data channels.
19/11/25 06:16:31 INFO sdk_worker.close: Closing all cached gRPC state handlers.
19/11/25 06:16:31 INFO sdk_worker.run: Done consuming work.
19/11/25 06:16:31 INFO sdk_worker_main.main: Python sdk harness exiting.
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService: Logging client hanged up.
19/11/25 06:16:31 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
19/11/25 06:16:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1574662584.74_29599733-7164-427a-85a1-6373e04db2de finished.
19/11/25 06:16:31 WARN org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: Manifest at /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/MANIFEST has 0 artifact locations
19/11/25 06:16:31 INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService: Removed dir /tmp/sparktestDjyB2X/job_cad19b92-71fc-48ce-8889-c139730105bf/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
======================================================================
ERROR: test_pardo_state_with_custom_key_coder (__main__.SparkRunnerTest)
Tests that state requests work correctly when the key coder is an
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/portable_runner_test.py", line 231, in test_pardo_state_with_custom_key_coder
    equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

======================================================================
ERROR: test_pardo_timers (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 328, in test_pardo_timers
    assert_that(actual, equal_to(expected))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 428, in wait_until_finish
    for state_response in self._state_stream:
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 552, in _next
    _common.wait(self._state.condition.wait, _response_ready)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 140, in wait
    _wait_once(wait_fn, MAXIMUM_WAIT_TIMEOUT, spin_cb)
  File "https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_common.py", line 105, in _wait_once
    wait_fn(timeout=timeout)
  File "/usr/lib/python2.7/threading.py", line 359, in wait
    _sleep(delay)
  File "apache_beam/runners/portability/portable_runner_test.py", line 75, in handler
    raise BaseException(msg)
BaseException: Timed out after 60 seconds.

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140247383926528)>
# Thread: <Thread(Thread-117, started daemon 140247375533824)>
# Thread: <_MainThread(MainThread, started 140248510146304)>

==================== Timed out after 60 seconds. ====================
# Thread: <Thread(wait_until_finish_read, started daemon 140247358748416)>
# Thread: <Thread(Thread-123, started daemon 140247367141120)>
# Thread: <_MainThread(MainThread, started 140248510146304)>
# Thread: <Thread(Thread-117, started daemon 140247375533824)>
# Thread: <Thread(wait_until_finish_read, started daemon 140247383926528)>

======================================================================
ERROR: test_sdf_with_watermark_tracking (__main__.SparkRunnerTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "apache_beam/runners/portability/fn_api_runner_test.py", line 499, in test_sdf_with_watermark_tracking
    assert_that(actual, equal_to(list(''.join(data))))
  File "apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "apache_beam/runners/portability/portable_runner.py", line 438, in wait_until_finish
    self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline test_sdf_with_watermark_tracking_1574662573.02_5011cda4-931d-40ef-a51b-56a93311a735 failed in state FAILED: java.lang.UnsupportedOperationException: The ActiveBundle does not have a registered bundle checkpoint handler.

----------------------------------------------------------------------
Ran 38 tests in 352.103s

FAILED (errors=3, skipped=9)

> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/py2/build.gradle' line: 196

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:sparkValidatesRunner'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 51s
60 actionable tasks: 47 executed, 13 from cache

Publishing build scan...
https://gradle.com/s/3mnmfgcocm5lk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org