Posted to dev@uniffle.apache.org by GitBox <gi...@apache.org> on 2022/11/25 01:16:21 UTC

[GitHub] [incubator-uniffle] melodyyangaws commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

melodyyangaws commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1326930624

   I am still hitting this error while running a benchmark application with the Spark on k8s Operator. The Uniffle source code is the latest as of Nov 2022. Please help!
   
   Versions:
   RSS version: 0.7.0-snapshot
   Spark: 3.2.0
   Hadoop: 3.2.1
   
   Code snippet from my Dockerfile:
   ```
   # compile RSS client
   RUN git clone https://github.com/apache/incubator-uniffle.git /tmp/uniffle
   WORKDIR /tmp/uniffle
   RUN mvn clean package -Pspark3.2.0 -DskipTests -Dmaven.javadoc.skip=true 
   COPY client-spark/spark3/target/shaded/rss-client-spark3-*-shaded.jar  ${SPARK_HOME}/jars/uniffle-rss-client.jar
   ```
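   
   (Note: the final `COPY` in this snippet copies the jar from the Docker build context, not from `/tmp/uniffle` inside the image, so unless the context already holds a freshly built jar, a `RUN cp client-spark/spark3/target/shaded/rss-client-spark3-*-shaded.jar ${SPARK_HOME}/jars/uniffle-rss-client.jar` would be needed to pick up the artifact produced by the `RUN mvn` step; otherwise a stale client jar may end up in `${SPARK_HOME}/jars`.)
   
   A quick way to check whether the shaded client jar on the classpath actually bundles a protobuf runtime that contains the missing method is shown below; this is a minimal sketch, assuming the jar path used in the Dockerfile above:
   ```
   # Print the declared methods of the relocated GeneratedMessageV3 class inside the
   # shaded client jar and look for isStringEmpty(Object). No output means the bundled
   # protobuf runtime is older than the one the generated RssProtos code was compiled against.
   javap -p -cp "${SPARK_HOME}/jars/uniffle-rss-client.jar" \
     org.apache.uniffle.com.google.protobuf.GeneratedMessageV3 | grep isStringEmpty
   ```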
   
   The error message from the Spark application logs:
   ```
   22/11/24 13:29:05 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND HEADERS: streamId=89 headers=GrpcHttp2OutboundHeaders[:authority: rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local:19997, :path: /rss.common.CoordinatorServer/getShuffleAssignments, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.47.0, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
       22/11/24 13:29:05 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND RST_STREAM: streamId=89 errorCode=8
       22/11/24 13:30:04 DEBUG PoolingHttpClientConnectionManager: Closing connections idle longer than 60000 MILLISECONDS
       22/11/24 13:30:04 DEBUG PoolingHttpClientConnectionManager: Closing connections idle longer than 60000 MILLISECONDS
       22/11/24 13:30:10 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND HEADERS: streamId=91 headers=GrpcHttp2OutboundHeaders[:authority: rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local:19997, :path: /rss.common.CoordinatorServer/getShuffleAssignments, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.47.0, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
       22/11/24 13:30:10 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND RST_STREAM: streamId=91 errorCode=8
       22/11/24 13:31:04 DEBUG PoolingHttpClientConnectionManager: Closing connections idle longer than 60000 MILLISECONDS
       22/11/24 13:31:04 DEBUG PoolingHttpClientConnectionManager: Closing connections idle longer than 60000 MILLISECONDS
       22/11/24 13:31:15 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND HEADERS: streamId=93 headers=GrpcHttp2OutboundHeaders[:authority: rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local:19997, :path: /rss.common.CoordinatorServer/getShuffleAssignments, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.47.0, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
       22/11/24 13:31:15 ERROR AdaptiveSparkPlanExec: Exception in cancelling query stage: ShuffleQueryStage 23
   +- ReusedExchange [c_customer_sk#14226, c_customer_id#14227, c_first_name#14234, c_last_name#14235, c_preferred_cust_flag#14236, c_birth_country#14240, c_login#14241, c_email_address#14242], Exchange hashpartitioning(c_customer_sk#412, 200), ENSURE_REQUIREMENTS, [id=#3167]
   
       org.apache.uniffle.common.exception.RssException: registerShuffle failed!
   	at org.apache.spark.shuffle.RssShuffleManager.registerShuffle(RssShuffleManager.java:328)
   	at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:99)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$.prepareShuffleDependency(ShuffleExchangeExec.scala:395)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.shuffleDependency$lzycompute(ShuffleExchangeExec.scala:173)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.shuffleDependency(ShuffleExchangeExec.scala:167)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.mapOutputStatisticsFuture$lzycompute(ShuffleExchangeExec.scala:143)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.mapOutputStatisticsFuture(ShuffleExchangeExec.scala:139)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.$anonfun$submitShuffleJob$1(ShuffleExchangeExec.scala:68)
   	at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:222)
   	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
   	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:219)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.submitShuffleJob(ShuffleExchangeExec.scala:68)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.submitShuffleJob$(ShuffleExchangeExec.scala:67)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.submitShuffleJob(ShuffleExchangeExec.scala:115)
   	at org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec.shuffleFuture$lzycompute(QueryStageExec.scala:170)
   	at org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec.shuffleFuture(QueryStageExec.scala:170)
   	at org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec.cancel(QueryStageExec.scala:183)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$cleanUpAndThrowException$1(AdaptiveSparkPlanExec.scala:723)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$cleanUpAndThrowException$1$adapted(AdaptiveSparkPlanExec.scala:718)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:253)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.cleanUpAndThrowException(AdaptiveSparkPlanExec.scala:718)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$5(AdaptiveSparkPlanExec.scala:265)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$5$adapted(AdaptiveSparkPlanExec.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:254)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:226)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:365)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.executeCollect(AdaptiveSparkPlanExec.scala:338)
   	at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3715)
   	at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2971)
   	at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3706)
   	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
   	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
   	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
   	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
   	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3704)
   	at org.apache.spark.sql.Dataset.collect(Dataset.scala:2971)
   	at com.databricks.spark.sql.perf.Query.$anonfun$doBenchmark$10(Query.scala:122)
   	at com.databricks.spark.sql.perf.Benchmarkable.measureTimeMs(Benchmarkable.scala:114)
   	at com.databricks.spark.sql.perf.Benchmarkable.measureTimeMs$(Benchmarkable.scala:112)
   	at com.databricks.spark.sql.perf.Query.measureTimeMs(Query.scala:29)
   	at com.databricks.spark.sql.perf.Query.doBenchmark(Query.scala:121)
   	at com.databricks.spark.sql.perf.Benchmarkable$$anon$1.run(Benchmarkable.scala:78)
   Caused by: java.lang.NoSuchMethodError: org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z
   	at org.apache.uniffle.proto.RssProtos$GetShuffleServerRequest.getSerializedSize(RssProtos.java:31666)
   	at io.grpc.protobuf.lite.ProtoInputStream.available(ProtoInputStream.java:108)
   	at io.grpc.internal.MessageFramer.getKnownLength(MessageFramer.java:205)
   	at io.grpc.internal.MessageFramer.writePayload(MessageFramer.java:137)
   	at io.grpc.internal.AbstractStream.writeMessage(AbstractStream.java:65)
   	at io.grpc.internal.ForwardingClientStream.writeMessage(ForwardingClientStream.java:37)
   	at io.grpc.internal.RetriableStream$1SendMessageEntry.runWith(RetriableStream.java:552)
   	at io.grpc.internal.RetriableStream.delayOrExecute(RetriableStream.java:529)
   	at io.grpc.internal.RetriableStream.sendMessage(RetriableStream.java:556)
   	at io.grpc.internal.ClientCallImpl.sendMessageInternal(ClientCallImpl.java:520)
   	at io.grpc.internal.ClientCallImpl.sendMessage(ClientCallImpl.java:506)
   	at io.grpc.stub.ClientCalls.asyncUnaryRequestCall(ClientCalls.java:317)
   	at io.grpc.stub.ClientCalls.futureUnaryCall(ClientCalls.java:227)
   	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:154)
   	at org.apache.uniffle.proto.CoordinatorServerGrpc$CoordinatorServerBlockingStub.getShuffleAssignments(CoordinatorServerGrpc.java:829)
   	at org.apache.uniffle.client.impl.grpc.CoordinatorGrpcClient.doGetShuffleAssignments(CoordinatorGrpcClient.java:178)
   	at org.apache.uniffle.client.impl.grpc.CoordinatorGrpcClient.getShuffleAssignments(CoordinatorGrpcClient.java:248)
   	at org.apache.uniffle.client.impl.ShuffleWriteClientImpl.getShuffleAssignments(ShuffleWriteClientImpl.java:410)
   	at org.apache.spark.shuffle.RssShuffleManager.lambda$registerShuffle$0(RssShuffleManager.java:316)
   	at org.apache.uniffle.common.util.RetryUtils.retry(RetryUtils.java:54)
   	at org.apache.uniffle.common.util.RetryUtils.retry(RetryUtils.java:31)
   	at org.apache.spark.shuffle.RssShuffleManager.registerShuffle(RssShuffleManager.java:315)
   ```
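   
   If that method is present in the shaded jar, the NoSuchMethodError most likely means an older copy of the relocated protobuf classes is being loaded from somewhere else on the driver or executor classpath (for example a previously baked client jar left in the image). A minimal sketch, assuming the image layout from the Dockerfile above, to scan the Spark jars directory for every jar that bundles the relocated class:
   ```
   # List every jar under $SPARK_HOME/jars that contains the relocated protobuf class,
   # to spot duplicate or stale copies of the shaded Uniffle client.
   for j in "${SPARK_HOME}"/jars/*.jar; do
     if unzip -l "$j" 2>/dev/null | grep -q 'org/apache/uniffle/com/google/protobuf/GeneratedMessageV3.class'; then
       echo "$j"
     fi
   done
   ```
   Only the freshly built rss-client-spark3 shaded jar should show up; any additional hit is a candidate source of the version mismatch behind the NoSuchMethodError.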


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: dev-unsubscribe@uniffle.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org