Posted to dev@uniffle.apache.org by GitBox <gi...@apache.org> on 2022/09/19 01:23:54 UTC

[GitHub] [incubator-uniffle] wfxxh opened a new issue, #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

wfxxh opened a new issue, #228:
URL: https://github.com/apache/incubator-uniffle/issues/228

   I built Uniffle after reading the README and started the coordinator and shuffle server successfully, but when I use Uniffle from Spark 3.2.1 it does not work; the error is: org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z.
   
   When I use rss-client-spark2-0.6.0-snapshot-shaded.jar and set the Spark version to 2.x, it works.
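
   A `NoSuchMethodError` on a relocated class like this usually indicates a protobuf-java version mismatch inside the shaded client jar: the generated `RssProtos` classes call the `GeneratedMessageV3.isStringEmpty` helper, which exists only in newer protobuf-java releases, so the jar has most likely bundled (and relocated) an older protobuf-java than the one the classes were generated against.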




[GitHub] [incubator-uniffle] tsface commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

tsface commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1306607565

   > > I had the same problem.
   > > 
   > > * I compiled following the community documentation.
   > > * I decompiled the rss-client-spark3-0.6.0-shaded.jar package and found no `org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z` method.
   > 
   > We already shade this jar
   > 
   > https://github.com/apache/incubator-uniffle/blob/cf63eae4539264f694760fdf48d882d931ac8e20/client-spark/spark3/pom.xml#L95
   
   @jerqi thx!
   
   In version 0.6.0, I specified the protobuf-java version as 3.19.2.
   The RSS service is now working.
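
   Pinning the version that way might look like the following in the root pom (a sketch of the workaround described above, not the exact change; the placement in 0.6.0 may differ):
   
   ```xml
   <dependencyManagement>
       <dependencies>
           <!-- Force a single protobuf-java version so the shaded
                GeneratedMessageV3 matches the generated RssProtos code;
                3.19.2 is the version reported to work in this thread. -->
           <dependency>
               <groupId>com.google.protobuf</groupId>
               <artifactId>protobuf-java</artifactId>
               <version>3.19.2</version>
           </dependency>
       </dependencies>
   </dependencyManagement>
   ```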




[GitHub] [incubator-uniffle] tsface commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

tsface commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1306558630

   I had the same problem.
   - I compiled following the community documentation.
   - I decompiled the rss-client-spark3-0.6.0-shaded.jar package and found no `org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z` method.




[GitHub] [incubator-uniffle] melodyyangaws commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

melodyyangaws commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1326930624

   I am still having the error while running a benchmark application with the Spark on K8s Operator. The Uniffle source code is the latest as of Nov 2022. Please help!
   
   Versions:
   RSS version: 0.7.0-snapshot
   Spark: 3.2.0
   Hadoop: 3.2.1
   
   Code snippet from my Dockerfile:
   ```
   # compile RSS client
   RUN git clone https://github.com/apache/incubator-uniffle.git /tmp/uniffle
   WORKDIR /tmp/uniffle
   RUN mvn clean package -Pspark3.2.0 -DskipTests -Dmaven.javadoc.skip=true 
   COPY client-spark/spark3/target/shaded/rss-client-spark3-*-shaded.jar  ${SPARK_HOME}/jars/uniffle-rss-client.jar
   ```
   
   The error message from the Spark application logs:
   ```
   22/11/24 13:29:05 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND HEADERS: streamId=89 headers=GrpcHttp2OutboundHeaders[:authority: rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local:19997, :path: /rss.common.CoordinatorServer/getShuffleAssignments, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.47.0, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
       22/11/24 13:29:05 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND RST_STREAM: streamId=89 errorCode=8
       22/11/24 13:30:04 DEBUG PoolingHttpClientConnectionManager: Closing connections idle longer than 60000 MILLISECONDS
       22/11/24 13:30:04 DEBUG PoolingHttpClientConnectionManager: Closing connections idle longer than 60000 MILLISECONDS
       22/11/24 13:30:10 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND HEADERS: streamId=91 headers=GrpcHttp2OutboundHeaders[:authority: rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local:19997, :path: /rss.common.CoordinatorServer/getShuffleAssignments, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.47.0, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
       22/11/24 13:30:10 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND RST_STREAM: streamId=91 errorCode=8
       22/11/24 13:31:04 DEBUG PoolingHttpClientConnectionManager: Closing connections idle longer than 60000 MILLISECONDS
       22/11/24 13:31:04 DEBUG PoolingHttpClientConnectionManager: Closing connections idle longer than 60000 MILLISECONDS
       22/11/24 13:31:15 DEBUG NettyClientHandler: [id: 0x0364344e, L:/192.168.3.45:57834 - R:rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local/10.100.241.145:19997] OUTBOUND HEADERS: streamId=93 headers=GrpcHttp2OutboundHeaders[:authority: rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local:19997, :path: /rss.common.CoordinatorServer/getShuffleAssignments, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.47.0, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
       22/11/24 13:31:15 ERROR AdaptiveSparkPlanExec: Exception in cancelling query stage: ShuffleQueryStage 23
   +- ReusedExchange [c_customer_sk#14226, c_customer_id#14227, c_first_name#14234, c_last_name#14235, c_preferred_cust_flag#14236, c_birth_country#14240, c_login#14241, c_email_address#14242], Exchange hashpartitioning(c_customer_sk#412, 200), ENSURE_REQUIREMENTS, [id=#3167]
   
       org.apache.uniffle.common.exception.RssException: registerShuffle failed!
   	at org.apache.spark.shuffle.RssShuffleManager.registerShuffle(RssShuffleManager.java:328)
   	at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:99)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$.prepareShuffleDependency(ShuffleExchangeExec.scala:395)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.shuffleDependency$lzycompute(ShuffleExchangeExec.scala:173)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.shuffleDependency(ShuffleExchangeExec.scala:167)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.mapOutputStatisticsFuture$lzycompute(ShuffleExchangeExec.scala:143)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.mapOutputStatisticsFuture(ShuffleExchangeExec.scala:139)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.$anonfun$submitShuffleJob$1(ShuffleExchangeExec.scala:68)
   	at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:222)
   	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
   	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:219)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.submitShuffleJob(ShuffleExchangeExec.scala:68)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeLike.submitShuffleJob$(ShuffleExchangeExec.scala:67)
   	at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.submitShuffleJob(ShuffleExchangeExec.scala:115)
   	at org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec.shuffleFuture$lzycompute(QueryStageExec.scala:170)
   	at org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec.shuffleFuture(QueryStageExec.scala:170)
   	at org.apache.spark.sql.execution.adaptive.ShuffleQueryStageExec.cancel(QueryStageExec.scala:183)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$cleanUpAndThrowException$1(AdaptiveSparkPlanExec.scala:723)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$cleanUpAndThrowException$1$adapted(AdaptiveSparkPlanExec.scala:718)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:253)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1(TreeNode.scala:254)
   	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreach$1$adapted(TreeNode.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
	... (12 identical groups of recursive TreeNode.foreach / Iterator.foreach / IterableLike.foreach frames elided) ...
   	at org.apache.spark.sql.catalyst.trees.TreeNode.foreach(TreeNode.scala:254)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.cleanUpAndThrowException(AdaptiveSparkPlanExec.scala:718)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$5(AdaptiveSparkPlanExec.scala:265)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$5$adapted(AdaptiveSparkPlanExec.scala:254)
   	at scala.collection.Iterator.foreach(Iterator.scala:943)
   	at scala.collection.Iterator.foreach$(Iterator.scala:943)
   	at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
   	at scala.collection.IterableLike.foreach(IterableLike.scala:74)
   	at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
   	at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.$anonfun$getFinalPhysicalPlan$1(AdaptiveSparkPlanExec.scala:254)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.getFinalPhysicalPlan(AdaptiveSparkPlanExec.scala:226)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.withFinalPlanUpdate(AdaptiveSparkPlanExec.scala:365)
   	at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.executeCollect(AdaptiveSparkPlanExec.scala:338)
   	at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3715)
   	at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2971)
   	at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3706)
   	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
   	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
   	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
   	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
   	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
   	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3704)
   	at org.apache.spark.sql.Dataset.collect(Dataset.scala:2971)
   	at com.databricks.spark.sql.perf.Query.$anonfun$doBenchmark$10(Query.scala:122)
   	at com.databricks.spark.sql.perf.Benchmarkable.measureTimeMs(Benchmarkable.scala:114)
   	at com.databricks.spark.sql.perf.Benchmarkable.measureTimeMs$(Benchmarkable.scala:112)
   	at com.databricks.spark.sql.perf.Query.measureTimeMs(Query.scala:29)
   	at com.databricks.spark.sql.perf.Query.doBenchmark(Query.scala:121)
   	at com.databricks.spark.sql.perf.Benchmarkable$$anon$1.run(Benchmarkable.scala:78)
   Caused by: java.lang.NoSuchMethodError: org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z
   	at org.apache.uniffle.proto.RssProtos$GetShuffleServerRequest.getSerializedSize(RssProtos.java:31666)
   	at io.grpc.protobuf.lite.ProtoInputStream.available(ProtoInputStream.java:108)
   	at io.grpc.internal.MessageFramer.getKnownLength(MessageFramer.java:205)
   	at io.grpc.internal.MessageFramer.writePayload(MessageFramer.java:137)
   	at io.grpc.internal.AbstractStream.writeMessage(AbstractStream.java:65)
   	at io.grpc.internal.ForwardingClientStream.writeMessage(ForwardingClientStream.java:37)
   	at io.grpc.internal.RetriableStream$1SendMessageEntry.runWith(RetriableStream.java:552)
   	at io.grpc.internal.RetriableStream.delayOrExecute(RetriableStream.java:529)
   	at io.grpc.internal.RetriableStream.sendMessage(RetriableStream.java:556)
   	at io.grpc.internal.ClientCallImpl.sendMessageInternal(ClientCallImpl.java:520)
   	at io.grpc.internal.ClientCallImpl.sendMessage(ClientCallImpl.java:506)
   	at io.grpc.stub.ClientCalls.asyncUnaryRequestCall(ClientCalls.java:317)
   	at io.grpc.stub.ClientCalls.futureUnaryCall(ClientCalls.java:227)
   	at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:154)
   	at org.apache.uniffle.proto.CoordinatorServerGrpc$CoordinatorServerBlockingStub.getShuffleAssignments(CoordinatorServerGrpc.java:829)
   	at org.apache.uniffle.client.impl.grpc.CoordinatorGrpcClient.doGetShuffleAssignments(CoordinatorGrpcClient.java:178)
   	at org.apache.uniffle.client.impl.grpc.CoordinatorGrpcClient.getShuffleAssignments(CoordinatorGrpcClient.java:248)
   	at org.apache.uniffle.client.impl.ShuffleWriteClientImpl.getShuffleAssignments(ShuffleWriteClientImpl.java:410)
   	at org.apache.spark.shuffle.RssShuffleManager.lambda$registerShuffle$0(RssShuffleManager.java:316)
   	at org.apache.uniffle.common.util.RetryUtils.retry(RetryUtils.java:54)
   	at org.apache.uniffle.common.util.RetryUtils.retry(RetryUtils.java:31)
   	at org.apache.spark.shuffle.RssShuffleManager.registerShuffle(RssShuffleManager.java:315)
   ```




[GitHub] [incubator-uniffle] leixm commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

leixm commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1328154421

   > > @leixm @smallzhongfeng @tsface Do you also have this problem? Do we need a PR to fix this issue?
   > 
   > @jerqi Currently, I have only tested version 0.6.0. Like I mentioned above, I just specified the `protobuf-java` version as `3.19.2` in the pom file.
   > 
   > I checked the compile-time log, and I guess it was caused by the `protobuf-java` version being left unspecified when building the shaded jar of the `rss-client-spark3` module.
   > 
   > I can upgrade to the latest version; if the problem still exists, I will try to submit a PR.
   
   You're right, it's a problem. We avoid it in our internal version by excluding protobuf-java, like this in `client-spark/spark3/pom.xml`:
   ```xml
           <dependency>
               <groupId>org.apache.spark</groupId>
               <artifactId>spark-core_${scala.binary.version}</artifactId>
               <version>${spark.version}</version>
               <scope>provided</scope>
               <exclusions>
                   <exclusion>
                       <groupId>com.google.protobuf</groupId>
                       <artifactId>protobuf-java</artifactId>
                   </exclusion>
               </exclusions>
           </dependency>
   ```
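
   The exclusion works because spark-core (even at `provided` scope) otherwise puts its own protobuf-java on the build classpath; with the version left unspecified, the shade build can end up relocating a GeneratedMessageV3 that lacks `isStringEmpty`, while the generated RssProtos classes expect it. Excluding it, or pinning `protobuf-java` explicitly as described above, keeps the shaded jar on one consistent protobuf version.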




[GitHub] [incubator-uniffle] kaijchen commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

kaijchen commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1250782873

   What's the build command you used for building Uniffle against Spark 3.2.1?




[GitHub] [incubator-uniffle] a140262 commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

a140262 commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1326992687

   The server & coordinator configurations use the GitHub [example](https://github.com/apache/incubator-uniffle/blob/master/deploy/kubernetes/operator/examples/configuration.yaml), and the Spark configuration also follows Uniffle's example:
   ```
   apiVersion: "sparkoperator.k8s.io/v1beta2"
   kind: SparkApplication
   metadata:
     name: spark-benchmark
   spec:
     type: Scala
     mode: cluster
     image: xxxx/uniffle-spark-benchmark:3.2.0
     sparkVersion: 3.2.0
     mainClass: .....
     mainApplicationFile: .....
     arguments:  .....
     sparkConf:
       "spark.network.timeout": "2000s"
       "spark.executor.heartbeatInterval": "300s" 
       "spark.serializer": "org.apache.spark.serializer.KryoSerializer"
       # Enable RSS
       "spark.shuffle.manager": "org.apache.spark.shuffle.RssShuffleManager"
       "spark.rss.coordinator.quorum": "rss-coordinator-uniffle-rss-0.uniffle.svc.cluster.local:19997,rss-coordinator-uniffle-rss-1.uniffle.svc.cluster.local:19997"
       "spark.rss.storage.type": "MEMORY_LOCALFILE"
       "spark.rss.remote.storage.path": "/rss1/rssdata,/rss2/rssdata"
     driver:
     .....
     executor:
     ...
   ```




[GitHub] [incubator-uniffle] tsface commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

tsface commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1328132237

   > @leixm @smallzhongfeng @tsface Do you also have this problem? Do we need a PR to fix this issue?
   
   @jerqi 
   Currently, I have only tested version 0.6.0. 
   Like I mentioned above, I just specified the `protobuf-java` version as `3.19.2` in the pom file.
   
   I checked the compile-time log, and I guess it was caused by the `protobuf-java` version being left unspecified when building the shaded jar of the `rss-client-spark3` module.
   
   I can upgrade to the latest version; if the problem still exists, I will try to submit a PR.




[GitHub] [incubator-uniffle] jerqi commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

jerqi commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1255747099

   I have never run into this problem.




[GitHub] [incubator-uniffle] jerqi commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

jerqi commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1306580731

   > I had the same problem.
   > 
   > * I compiled following the community documentation.
   > * I decompiled the rss-client-spark3-0.6.0-shaded.jar package and found no `org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z` method.
   
   We already shade this jar: https://github.com/apache/incubator-uniffle/blob/cf63eae4539264f694760fdf48d882d931ac8e20/client-spark/spark3/pom.xml#L95
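
   The relocation referenced there is what produces the `org.apache.uniffle.com.google.protobuf` prefix seen in the error. Schematically, the maven-shade-plugin rule looks like the following sketch (the actual configuration lives at the linked pom line):
   
   ```xml
   <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-shade-plugin</artifactId>
       <configuration>
           <relocations>
               <!-- Rewrites com.google.protobuf.* into the shaded
                    package that appears in the stack trace. -->
               <relocation>
                   <pattern>com.google.protobuf</pattern>
                   <shadedPattern>org.apache.uniffle.com.google.protobuf</shadedPattern>
               </relocation>
           </relocations>
       </configuration>
   </plugin>
   ```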




[GitHub] [incubator-uniffle] leixm commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

leixm commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1328155394

   You can submit a new PR to fix this issue. @tsface 




[GitHub] [incubator-uniffle] jerqi commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

jerqi commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1326950775

   @leixm @smallzhongfeng @tsface Do you also have this problem? Do we need a PR to fix this issue?




[GitHub] [incubator-uniffle] jerqi closed issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

jerqi closed issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"
URL: https://github.com/apache/incubator-uniffle/issues/228




[GitHub] [incubator-uniffle] jerqi commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

jerqi commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1328169015

   Thanks @tsface @leixm 




[GitHub] [incubator-uniffle] wfxxh commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

wfxxh commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1251690789

   ![image](https://user-images.githubusercontent.com/22764286/191139485-e8e4f892-caee-4890-b1d2-4bd55f05685b.png)
   




[GitHub] [incubator-uniffle] leixm commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

leixm commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1328155300

   You can submit a new PR to fix this issue.




[GitHub] [incubator-uniffle] jerqi commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

jerqi commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1326947762

   Could you give your configurations (including Spark, shuffle server, and coordinator) and logs (including driver, executor, and coordinator)?




[GitHub] [incubator-uniffle] jerqi commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

jerqi commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1328479298

   @a140262 The fix PR has been merged into the master branch.




[GitHub] [incubator-uniffle] a140262 commented on issue #228: [Problem] Build against Spark 3.2.x, but when I request Uniffle it does not work; the error is "org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z"

a140262 commented on issue #228:
URL: https://github.com/apache/incubator-uniffle/issues/228#issuecomment-1326934150

   I am still having the error while running a benchmark application with the Spark on K8s Operator. The Uniffle source code is the latest as of Nov 2022. Please help!
   
   Versions:
   RSS version: 0.7.0-snapshot
   Spark: 3.2.0
   Hadoop: 3.2.1
   
   Code snippet from my client Dockerfile:
   ```
   # compile RSS client
   RUN git clone https://github.com/apache/incubator-uniffle.git /tmp/uniffle
   WORKDIR /tmp/uniffle
   RUN mvn clean package -Pspark3.2.0 -DskipTests -Dmaven.javadoc.skip=true 
   COPY client-spark/spark3/target/shaded/rss-client-spark3-*-shaded.jar  ${SPARK_HOME}/jars/uniffle-rss-client.jar
   ```
   
   Server-side Dockerfile:
   ```
   # compile RSS server
   RUN git clone https://github.com/apache/incubator-uniffle.git /tmp/uniffle
   WORKDIR /tmp/uniffle
   COPY build_distribution.sh .
   RUN ./build_distribution.sh --spark3-profile 'spark3.2.0'
   ```
   
   The error message from the Spark application logs is identical to the log in the earlier comment above: the same gRPC DEBUG output, `org.apache.uniffle.common.exception.RssException: registerShuffle failed!`, and the same `Caused by: java.lang.NoSuchMethodError: org.apache.uniffle.com.google.protobuf.GeneratedMessageV3.isStringEmpty(Ljava/lang/Object;)Z` stack trace.

