Posted to commits@spark.apache.org by gu...@apache.org on 2023/02/10 02:58:32 UTC

[spark] branch master updated: [SPARK-42276][BUILD][CONNECT] Add `ServicesResourceTransformer` rule to connect server module shade configuration

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new af50b47e120 [SPARK-42276][BUILD][CONNECT] Add `ServicesResourceTransformer` rule to connect server module shade configuration
af50b47e120 is described below

commit af50b47e12040f86c4f81ff84407ad820cb252c1
Author: yangjie01 <ya...@baidu.com>
AuthorDate: Fri Feb 10 11:58:19 2023 +0900

    [SPARK-42276][BUILD][CONNECT] Add `ServicesResourceTransformer` rule to connect server module shade configuration
    
    ### What changes were proposed in this pull request?
    This PR adds the `ServicesResourceTransformer` rule to the Connect server module's shade configuration to make sure `grpc.ManagedChannelProvider` and `grpc.ServerProvider` can be used on the server side.
    
    ### Why are the changes needed?
    To keep `grpc.ManagedChannelProvider`, `grpc.ServerProvider`, and the other SPI entries usable after grpc is shaded. The sbt build does not need a fix because `sbt-assembly` merges service files by default.
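
    `ServicesResourceTransformer` merges the `META-INF/services` resource files from the shaded jars and rewrites the provider class names listed inside them according to the configured relocations; without that rewrite, `ServiceLoader` looks up the relocated interface name but the service file still names the original classes, so no provider is found. A minimal sketch of that rewrite (hypothetical helper, not the Maven plugin's actual code):

    ```java
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    // Sketch: apply shade relocations to the lines of a META-INF/services
    // file, the way ServicesResourceTransformer does, so ServiceLoader can
    // still resolve providers after the classes have been renamed.
    public class ServicesRelocationSketch {
        // relocations: original package prefix -> shaded package prefix
        public static List<String> relocate(List<String> serviceLines,
                                            Map<String, String> relocations) {
            return serviceLines.stream().map(line -> {
                for (Map.Entry<String, String> r : relocations.entrySet()) {
                    if (line.startsWith(r.getKey())) {
                        return r.getValue() + line.substring(r.getKey().length());
                    }
                }
                return line;
            }).collect(Collectors.toList());
        }

        public static void main(String[] args) {
            // Contents of META-INF/services/io.grpc.ServerProvider before shading
            List<String> before = List.of("io.grpc.netty.NettyServerProvider");
            // Spark's shade config relocates io.grpc -> org.sparkproject.connect.grpc
            Map<String, String> relocations =
                Map.of("io.grpc", "org.sparkproject.connect.grpc");
            // prints org.sparkproject.connect.grpc.netty.NettyServerProvider
            System.out.println(relocate(before, relocations).get(0));
        }
    }
    ```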
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    - Pass GitHub Actions
    - Manual test: build a Spark Connect jar and run as follows:
    
    ```
    bin/spark-shell --jars spark-connect_2.12-3.5.0-SNAPSHOT.jar --driver-class-path spark-connect_2.12-3.5.0-SNAPSHOT.jar --conf spark.plugins=org.apache.spark.sql.connect.SparkConnectPlugin --conf spark.connect.grpc.binding.port=15102
    ```
    
    then run some code in spark-shell
    
    Before
    
    ```scala
    23/02/01 20:44:58 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    Spark context Web UI available at http://localhost:4040
    Spark context available as 'sc' (master = local[*], app id = local-1675255501816).
    Spark session available as 'spark'.
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 3.5.0-SNAPSHOT
          /_/
    
    Using Scala version 2.12.17 (OpenJDK 64-Bit Server VM, Java 11.0.17)
    Type in expressions to have them evaluated.
    Type :help for more information.
    
    scala> org.sparkproject.connect.grpc.ServerProvider.provider
    org.sparkproject.connect.grpc.ManagedChannelProvider$ProviderNotFoundException: No functional server found. Try adding a dependency on the grpc-netty or grpc-netty-shaded artifact
      at org.sparkproject.connect.grpc.ServerProvider.provider(ServerProvider.java:44)
      ... 47 elided
    
    scala> org.sparkproject.connect.grpc.ManagedChannelProvider.provider
    org.sparkproject.connect.grpc.ManagedChannelProvider$ProviderNotFoundException: No functional channel service provider found. Try adding a dependency on the grpc-okhttp, grpc-netty, or grpc-netty-shaded artifact
      at org.sparkproject.connect.grpc.ManagedChannelProvider.provider(ManagedChannelProvider.java:45)
      ... 47 elided
    
    ```
    
    After
    
    ```scala
    23/02/01 21:00:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    Spark context Web UI available at http://localhost:4040
    Spark context available as 'sc' (master = local[*], app id = local-1675256417224).
    Spark session available as 'spark'.
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 3.5.0-SNAPSHOT
          /_/
    
    Using Scala version 2.12.17 (OpenJDK 64-Bit Server VM, Java 11.0.17)
    Type in expressions to have them evaluated.
    Type :help for more information.
    
    scala> org.sparkproject.connect.grpc.ManagedChannelProvider.provider
    res0: org.sparkproject.connect.grpc.ManagedChannelProvider = org.sparkproject.connect.grpc.netty.NettyChannelProvider@68aa505b
    
    scala> org.sparkproject.connect.grpc.ServerProvider.provider
    res2: org.sparkproject.connect.grpc.ServerProvider = org.sparkproject.connect.grpc.netty.NettyServerProvider@4a5d8ae4
    
    ```
    
    Closes #39848 from LuciferYang/SPARK-42276.
    
    Lead-authored-by: yangjie01 <ya...@baidu.com>
    Co-authored-by: YangJie <ya...@baidu.com>
    Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
 connector/connect/server/pom.xml | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/connector/connect/server/pom.xml b/connector/connect/server/pom.xml
index 00d5fb4d323..405f1e2bb47 100644
--- a/connector/connect/server/pom.xml
+++ b/connector/connect/server/pom.xml
@@ -349,6 +349,9 @@
               <shadedPattern>${spark.shade.packageName}.connect.google_protos.type</shadedPattern>
             </relocation>
           </relocations>
+          <transformers>
+            <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
+          </transformers>
         </configuration>
       </plugin>
     </plugins>


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org