Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/06/06 00:34:53 UTC

Build failed in Jenkins: beam_PostCommit_XVR_Spark #928

See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/928/display/redirect?page=changes>

Changes:

[heejong] [BEAM-9869] adding self-contained Kafka service jar for testing

[kamil.wasilewski] [BEAM-10145] Delete persistent disks after every KafkaIO performance

[github] [BEAM-9615] Add string coder utility functions. (#11925)

[github] [BEAM-9615] finish standardizing proto import name (#11927)

[github] remove LP indicator (#11937)

[github] [BEAM-2939] Fix FnApiDoFnRunner to ensure that we output within the

[github] [BEAM-10204] @Ignore: re-enable LIKE operator related unit tests.


------------------------------------------
[...truncated 1.04 MB...]
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:933)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:931)
	at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
	at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:931)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:2130)
	at org.apache.spark.util.EventLoop.stop(EventLoop.scala:84)
	at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2043)
	at org.apache.spark.SparkContext$$anonfun$stop$6.apply$mcV$sp(SparkContext.scala:1949)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1340)
	at org.apache.spark.SparkContext.stop(SparkContext.scala:1948)
	at org.apache.spark.SparkContext$$anonfun$2.apply$mcV$sp(SparkContext.scala:575)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1945)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
	at scala.util.Try$.apply(Try.scala:192)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:738)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:972)
	at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:970)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
	at org.apache.spark.rdd.RDD.foreach(RDD.scala:970)
	at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:351)
	at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:45)
	at org.apache.beam.runners.spark.translation.BoundedDataset.action(BoundedDataset.java:124)
	at org.apache.beam.runners.spark.translation.SparkTranslationContext.computeOutputs(SparkTranslationContext.java:82)
	at org.apache.beam.runners.spark.SparkPipelineRunner.lambda$run$1(SparkPipelineRunner.java:127)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	... 3 more

root: ERROR: org.apache.spark.SparkException: Job 0 cancelled because SparkContext was shut down
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
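
For context: the error above comes from a cross-language test pipeline submitted to a local Spark job server through Beam's portable runner, and the job died when the SparkContext was shut down underneath it. A minimal sketch of that submission path, assuming an illustrative local endpoint rather than whatever the Jenkins job actually configured (a job server must already be running for this to do anything):

    # Sketch only: how a Python pipeline reaches a Spark job server via the
    # portable runner. The endpoint and environment below are illustrative
    # assumptions, not the values this Jenkins job used.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',  # assumed local Spark job server
        '--environment_type=LOOPBACK',    # run the SDK worker in-process
    ])
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * x)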

======================================================================
ERROR: test_tagged_join (apache_beam.transforms.sql_test.SqlTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/transforms/sql_test.py",> line 139, in test_tagged_join
    assert_that(out, equal_to([(1, "a"), (26, "z"), (1, "a")]))
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/pipeline.py",> line 547, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/pipeline.py",> line 526, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 424, in run_pipeline
    job_service_handle = self.create_job_service(options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 331, in create_job_service
    return self.create_job_service_handle(server.start(), options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 56, in start
    grpc.channel_ready_future(channel).result(timeout=self._timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/grpc/_utilities.py",> line 140, in result
    self._block(timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/grpc/_utilities.py",> line 86, in _block
    raise grpc.FutureTimeoutError()
FutureTimeoutError: 
-------------------- >> begin captured logging << --------------------
apache_beam.utils.subprocess_server: INFO: Using pre-built snapshot at <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.23.0-SNAPSHOT.jar>
apache_beam.utils.subprocess_server: INFO: Starting service with ['java' '-jar' 'https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.23.0-SNAPSHOT.jar' '55537']
root: DEBUG: Waiting for grpc channel to be ready at localhost:55537.
apache_beam.utils.subprocess_server: INFO: Starting expansion service at localhost:55537
root: DEBUG: Waiting for grpc channel to be ready at localhost:55537.
apache_beam.utils.subprocess_server: INFO: 	beam:external:java:sql:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/1327763628@593634ad
apache_beam.utils.subprocess_server: INFO: 	beam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/1327763628@20fa23c1
root: DEBUG: Waiting for grpc channel to be ready at localhost:55537.
root: DEBUG: Waiting for grpc channel to be ready at localhost:55537.
root: DEBUG: Waiting for grpc channel to be ready at localhost:55537.
root: DEBUG: Waiting for grpc channel to be ready at localhost:55537.
apache_beam.utils.subprocess_server: INFO: Jun 06, 2020 12:32:29 AM org.apache.beam.sdk.expansion.service.ExpansionService expand
apache_beam.utils.subprocess_server: INFO: INFO: Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'
apache_beam.utils.subprocess_server: INFO: Jun 06, 2020 12:32:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
apache_beam.utils.subprocess_server: INFO: INFO: SQL:
apache_beam.utils.subprocess_server: INFO: SELECT `simple`.`id` AS `id`, `enrich`.`metadata` AS `metadata`
apache_beam.utils.subprocess_server: INFO: FROM `beam`.`simple` AS `simple`
apache_beam.utils.subprocess_server: INFO: INNER JOIN `beam`.`enrich` AS `enrich` ON `simple`.`id` = `enrich`.`id`
apache_beam.utils.subprocess_server: INFO: Jun 06, 2020 12:32:31 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
apache_beam.utils.subprocess_server: INFO: INFO: SQLPlan>
apache_beam.utils.subprocess_server: INFO: LogicalProject(id=[$0], metadata=[$4])
apache_beam.utils.subprocess_server: INFO:   LogicalJoin(condition=[=($0, $3)], joinType=[inner])
apache_beam.utils.subprocess_server: INFO:     BeamIOSourceRel(table=[[beam, simple]])
apache_beam.utils.subprocess_server: INFO:     BeamIOSourceRel(table=[[beam, enrich]])
apache_beam.utils.subprocess_server: INFO: 
apache_beam.utils.subprocess_server: INFO: Jun 06, 2020 12:32:32 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
apache_beam.utils.subprocess_server: INFO: INFO: BEAMPlan>
apache_beam.utils.subprocess_server: INFO: BeamCalcRel(expr#0..4=[{inputs}], id=[$t2], metadata=[$t1])
apache_beam.utils.subprocess_server: INFO:   BeamCoGBKJoinRel(condition=[=($2, $0)], joinType=[inner])
apache_beam.utils.subprocess_server: INFO:     BeamIOSourceRel(table=[[beam, enrich]])
apache_beam.utils.subprocess_server: INFO:     BeamIOSourceRel(table=[[beam, simple]])
apache_beam.utils.subprocess_server: INFO: 
root: DEBUG: Sending SIGINT to job_server
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.23.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.23.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function lift_combiners at 0x7fc70c0d7ed8> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 42 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['ref_AppliedPTransform_Create enrich/Impulse_3\n  Create enrich/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/FlatMap(<lambda at core.py:2623>)_4\n  Create enrich/FlatMap(<lambda at core.py:2623>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n  Create enrich/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n  Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n  Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_11\n  Create enrich/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys_12\n  Create enrich/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create enrich/Map(decode)_13\n  Create enrich/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/Impulse_15\n  Create simple/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/FlatMap(<lambda at core.py:2623>)_16\n  Create simple/FlatMap(<lambda at core.py:2623>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/AddRandomKeys_19\n  Create simple/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_21\n  Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_22\n  Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_23\n  Create simple/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys_24\n  Create simple/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_Create simple/Map(decode)_25\n  Create simple/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: 
<unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_6/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamIOSourceRel_5/Convert.ConvertTransform/ParDo(Anonymous)/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/left_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/right_TimestampCombiner/Flatten.PCollections:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeylhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/extractKeyrhs/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/GBK:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 
'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Join.Impl/CoGroup.ExpandCrossProduct/ParDo(ConvertCoGbkResult)/ParMultiDo(ConvertCoGbkResult):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select)\n  SqlTransform(beam:external:java:sql:v1)/BeamCoGBKJoinRel_95/Select.Fields/ParDo(Select)/ParMultiDo(Select):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_5SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc)\n  SqlTransform(beam:external:java:sql:v1)/BeamCalcRel_96/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_29\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2623>)_30\n  assert_that/Create/FlatMap(<lambda at core.py:2623>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_32\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_33\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_34\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_36\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_37\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_38\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_39\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_41\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_42\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
--------------------- >> end captured logging << ---------------------
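
The FutureTimeoutError in the traceback above is the readiness check in job_server.py giving up: grpc.channel_ready_future returns a future that completes only once the channel is connected, and result(timeout=...) raises when the server never comes up. A minimal standalone sketch of that check (the port is taken from the log above; the 60-second timeout is an assumption for illustration):

    # Sketch of the readiness check that raised FutureTimeoutError above.
    # grpc.channel_ready_future waits for the channel to connect;
    # result(timeout=...) raises grpc.FutureTimeoutError if it never does.
    import grpc

    channel = grpc.insecure_channel('localhost:55537')
    try:
        grpc.channel_ready_future(channel).result(timeout=60)
        print('channel ready')
    except grpc.FutureTimeoutError:
        print('job/expansion service never became reachable')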

======================================================================
ERROR: test_zetasql_generate_data (apache_beam.transforms.sql_test.SqlTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/transforms/sql_test.py",> line 149, in test_zetasql_generate_data
    assert_that(out, equal_to([(1, "foo", 3.14)]))
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/pipeline.py",> line 547, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/pipeline.py",> line 526, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 424, in run_pipeline
    job_service_handle = self.create_job_service(options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py",> line 331, in create_job_service
    return self.create_job_service_handle(server.start(), options)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/apache_beam/runners/portability/job_server.py",> line 56, in start
    grpc.channel_ready_future(channel).result(timeout=self._timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/grpc/_utilities.py",> line 140, in result
    self._block(timeout)
  File "<https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/grpc/_utilities.py",> line 86, in _block
    raise grpc.FutureTimeoutError()
FutureTimeoutError: 
-------------------- >> begin captured logging << --------------------
apache_beam.utils.subprocess_server: INFO: Using pre-built snapshot at <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.23.0-SNAPSHOT.jar>
apache_beam.utils.subprocess_server: INFO: Starting service with ['java' '-jar' 'https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/java/extensions/sql/expansion-service/build/libs/beam-sdks-java-extensions-sql-expansion-service-2.23.0-SNAPSHOT.jar' '52427']
root: DEBUG: Waiting for grpc channel to be ready at localhost:52427.
apache_beam.utils.subprocess_server: INFO: Starting expansion service at localhost:52427
root: DEBUG: Waiting for grpc channel to be ready at localhost:52427.
apache_beam.utils.subprocess_server: INFO: 	beam:external:java:sql:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/1327763628@593634ad
apache_beam.utils.subprocess_server: INFO: 	beam:external:java:generate_sequence:v1: org.apache.beam.sdk.expansion.service.ExpansionService$ExternalTransformRegistrarLoader$$Lambda$1/1327763628@20fa23c1
root: DEBUG: Waiting for grpc channel to be ready at localhost:52427.
root: DEBUG: Waiting for grpc channel to be ready at localhost:52427.
root: DEBUG: Waiting for grpc channel to be ready at localhost:52427.
root: DEBUG: Waiting for grpc channel to be ready at localhost:52427.
apache_beam.utils.subprocess_server: INFO: Jun 06, 2020 12:33:38 AM org.apache.beam.sdk.expansion.service.ExpansionService expand
apache_beam.utils.subprocess_server: INFO: INFO: Expanding 'SqlTransform(beam:external:java:sql:v1)' with URN 'beam:external:java:sql:v1'
root: DEBUG: Sending SIGINT to job_server
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.23.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.23.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
apache_beam.runners.portability.fn_api_runner.translations: INFO: ==================== <function lift_combiners at 0x7fc70c0d7ed8> ====================
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: 16 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner.translations: DEBUG: Stages: ['external_6SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_6SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/MapElements/Map/ParMultiDo(Anonymous):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_6SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)\n  SqlTransform(beam:external:java:sql:v1)/BeamValuesRel_13/Create.Values/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'external_6SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc)\n  SqlTransform(beam:external:java:sql:v1)/BeamZetaSqlCalcRel_17/ParDo(Calc)/ParMultiDo(Calc):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Impulse_5\n  assert_that/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2623>)_6\n  assert_that/Create/FlatMap(<lambda at core.py:2623>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Create/Map(decode)_8\n  assert_that/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_9\n  assert_that/WindowInto(WindowIntoFn):beam:transform:window_into:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/ToVoidKey_10\n  assert_that/ToVoidKey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_0_12\n  assert_that/Group/pair_with_0:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/pair_with_1_13\n  assert_that/Group/pair_with_1:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Flatten_14\n  assert_that/Group/Flatten:beam:transform:flatten:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/GroupByKey_15\n  assert_that/Group/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_16\n  assert_that/Group/Map(_merge_tagged_vals_under_key):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Unkey_17\n  assert_that/Unkey:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>', 'ref_AppliedPTransform_assert_that/Match_18\n  assert_that/Match:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: <unknown>']
--------------------- >> end captured logging << ---------------------
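
For reference, these tests drive Beam's cross-language SqlTransform from the Python SDK, which is why the logs show a Java expansion service being started first. A rough sketch of the shape of such a test, assuming a reachable expansion service and an illustrative query rather than the exact one in sql_test.py:

    # Sketch of the kind of cross-language SQL test that failed above.
    # Running it requires the Java SQL expansion service (the Jenkins run
    # starts one, as the captured logs show); the query is illustrative.
    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to
    from apache_beam.transforms.sql import SqlTransform

    with beam.Pipeline() as p:
        out = p | SqlTransform(
            """SELECT CAST(1 AS INT64) AS `int`,
                      CAST('foo' AS STRING) AS `str`,
                      CAST(3.14 AS FLOAT64) AS `flt`""",
            dialect='zetasql')
        assert_that(out, equal_to([(1, 'foo', 3.14)]))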

----------------------------------------------------------------------
XML: nosetests-xlangSqlValidateRunner.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 6 tests in 208.567s

FAILED (errors=5)

> Task :runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingSql FAILED
> Task :runners:spark:job-server:sparkJobServerCleanup

FAILURE: Build completed with 5 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerJavaUsingJava'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingJava/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerJavaUsingPython'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/ws/src/runners/spark/job-server/build/reports/tests/validatesCrossLanguageRunnerJavaUsingPython/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingPython'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

5: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:job-server:validatesCrossLanguageRunnerPythonUsingSql'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 25m 17s
134 actionable tasks: 100 executed, 32 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/7zdgjfpxajx6g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_XVR_Spark #929

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_XVR_Spark/929/display/redirect>

