Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/03/11 16:22:58 UTC
Build failed in Jenkins: beam_PostCommit_XVR_Flink #1991
See <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/1991/display/redirect?page=changes>
Changes:
[12602502+Ardagan] [BEAM-9431] Remove ReadFromPubSub/Read-out0-ElementCount from the
------------------------------------------
[...truncated 937.89 KB...]
--------------------- >> end captured logging << ---------------------
======================================================================
ERROR: test_partition (apache_beam.transforms.validate_runner_xlang_test.ValidateRunnerXlangTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/validate_runner_xlang_test.py",> line 185, in test_partition
assert_that(res['1'], equal_to([1, 3, 5]), label='check_odd')
File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/pipeline.py",> line 522, in __exit__
self.run().wait_until_finish()
File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
state = result.wait_until_finish()
File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 532, in wait_until_finish
(self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline BeamApp-jenkins-0311162253-f112b4b3_cc22dd42-fefa-47d5-82d8-71062850d047 failed in state FAILED: org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.21.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function lift_combiners at 0x7f574b199410> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 35 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2643>)_4\n Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_9_AppliedPTransform_ExternalTransform(beam:transforms:xlang:test:partition)/Partition(CallableWrapperPartitionFn)/ParDo(ApplyPartitionFnFn)/ParDo(ApplyPartitionFnFn)_5\n ExternalTransform(beam:transforms:xlang:test:partition)/Partition(CallableWrapperPartitionFn)/ParDo(ApplyPartitionFnFn)/ParDo(ApplyPartitionFnFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_9_AppliedPTransform_ExternalTransform(beam:transforms:xlang:test:partition)/Map(<lambda at expansion_service_test.py:216>)_6\n ExternalTransform(beam:transforms:xlang:test:partition)/Map(<lambda at expansion_service_test.py:216>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_9_AppliedPTransform_ExternalTransform(beam:transforms:xlang:test:partition)/Map(<lambda at expansion_service_test.py:217>)_7\n ExternalTransform(beam:transforms:xlang:test:partition)/Map(<lambda at expansion_service_test.py:217>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Create/Impulse_20\n check_even/Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Create/FlatMap(<lambda at core.py:2643>)_21\n check_even/Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Create/Map(decode)_23\n check_even/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/WindowInto(WindowIntoFn)_24\n check_even/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/ToVoidKey_25\n check_even/ToVoidKey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Group/pair_with_0_27\n check_even/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Group/pair_with_1_28\n check_even/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Group/Flatten_29\n check_even/Group/Flatten:beam:transform:flatten:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Group/GroupByKey_30\n check_even/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Group/Map(_merge_tagged_vals_under_key)_34\n check_even/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Unkey_35\n check_even/Unkey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_even/Match_36\n check_even/Match:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Create/Impulse_39\n check_odd/Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Create/FlatMap(<lambda at core.py:2643>)_40\n check_odd/Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Create/Map(decode)_42\n check_odd/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/WindowInto(WindowIntoFn)_43\n check_odd/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/ToVoidKey_44\n check_odd/ToVoidKey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Group/pair_with_0_46\n check_odd/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Group/pair_with_1_47\n check_odd/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Group/Flatten_48\n check_odd/Group/Flatten:beam:transform:flatten:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Group/GroupByKey_49\n check_odd/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Group/Map(_merge_tagged_vals_under_key)_53\n check_odd/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Unkey_54\n check_odd/Unkey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_check_odd/Match_55\n check_odd/Match:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'job_name' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'runner' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'temp_location' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'experiments' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'streaming' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'dataflow_kms_key' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'enable_streaming_engine' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'project' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'worker_region' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'worker_zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'environment_cache_millis' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'files_to_stage' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'job_endpoint' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'output_executable_path' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'sdk_worker_parallelism' was already added
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2483)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2479)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2479)
at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2568)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:85)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
at org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
root: ERROR: org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
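The SparkException above is Spark 2.x's single-context guard (SPARK-2243): a second SparkContext was constructed in a JVM that already had a live one. In user code the usual remedies are to reuse the running context or, on Spark 2.x only, to set the escape-hatch property the message names. A minimal pyspark sketch of both, purely illustrative here since the offending context is created inside Beam's SparkPipelineRunner rather than in the tests, and the master/appName values are placeholders:

    from pyspark import SparkConf, SparkContext

    # Hypothetical standalone app; "local[2]" and the app name are placeholders.
    conf = (SparkConf()
            .setMaster("local[2]")
            .setAppName("sparkcontext-sketch")
            # Escape hatch named in the exception message (Spark 2.x only;
            # the flag and the guard were removed in Spark 3.x).
            .set("spark.driver.allowMultipleContexts", "true"))

    # Preferred fix: reuse an already-running context instead of
    # constructing a second one, which is what trips SPARK-2243.
    sc = SparkContext.getOrCreate(conf)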
======================================================================
ERROR: test_prefix (apache_beam.transforms.validate_runner_xlang_test.ValidateRunnerXlangTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/transforms/validate_runner_xlang_test.py",> line 69, in test_prefix
assert_that(res, equal_to(['0a', '0b']))
File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/pipeline.py",> line 522, in __exit__
self.run().wait_until_finish()
File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 114, in run
state = result.wait_until_finish()
File "<https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 532, in wait_until_finish
(self._job_id, self._state, self._last_error_message()))
RuntimeError: Pipeline BeamApp-jenkins-0311162255-16428003_67fb3802-4850-435b-8d53-8468f8a0db0f failed in state FAILED: org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
-------------------- >> begin captured logging << --------------------
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.21.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function lift_combiners at 0x7f574b199410> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 21 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: ['ref_AppliedPTransform_Create/Impulse_3\n Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2643>)_4\n Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n Create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create/Map(decode)_16\n Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'external_10_AppliedPTransform_ExternalTransform(beam:transforms:xlang:test:prefix)/TestLabel_3\n ExternalTransform(beam:transforms:xlang:test:prefix)/TestLabel:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_20\n assert_that/Create/Impulse:beam:transform:impulse:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2643>)_21\n assert_that/Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_23\n assert_that/Create/Map(decode):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_24\n assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_25\n assert_that/ToVoidKey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_27\n assert_that/Group/pair_with_0:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_28\n assert_that/Group/pair_with_1:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_29\n assert_that/Group/Flatten:beam:transform:flatten:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_30\n assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_34\n assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_35\n assert_that/Unkey:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_36\n assert_that/Match:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: <unknown>']
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'job_name' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'runner' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'temp_location' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'experiments' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'streaming' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'dataflow_kms_key' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'enable_streaming_engine' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'project' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'worker_region' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'worker_zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'zone' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'environment_cache_millis' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'files_to_stage' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'job_endpoint' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'output_executable_path' was already added
apache_beam.runners.portability.portable_runner: DEBUG: Runner option 'sdk_worker_parallelism' was already added
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2483)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2479)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2479)
at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2568)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:85)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
at org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
at org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
root: ERROR: org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:98)
org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:64)
org.apache.beam.runners.spark.SparkPipelineRunner.run(SparkPipelineRunner.java:108)
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:83)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
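Both test_partition and test_prefix fail at the same point: the terminal assert_that check forces pipeline.run().wait_until_finish(), which raises once the job reaches FAILED, so the assertions above never actually compared elements. For reference, a minimal self-contained sketch of the assert_that/equal_to pattern these tests use; it runs on the default direct runner, and a plain Python Map stands in for the cross-language prefix transform:

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    # assert_that expands into the assert_that/Group/Match stages visible
    # in the captured logging above; a mismatch (or a job that ends in
    # FAILED) raises when the pipeline context exits.
    with TestPipeline() as p:
        res = p | beam.Create(['a', 'b']) | beam.Map(lambda x: '0' + x)
        assert_that(res, equal_to(['0a', '0b']))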
----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 10 tests in 53.268s
FAILED (errors=4)
> Task :runners:flink:1.9:job-server:validatesCrossLanguageRunnerPythonUsingPython FAILED
> Task :runners:flink:1.9:job-server:validatesCrossLanguageRunnerCleanup
> Task :runners:flink:1.9:job-server:flinkJobServerCleanup
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:1.9:job-server:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/ws/src/runners/flink/1.9/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:1.9:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:1.9:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 21m 6s
115 actionable tasks: 87 executed, 26 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/patlh2u4vdz7m
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_XVR_Flink #1992
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Flink/1992/display/redirect?page=changes>