Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/12/02 15:21:36 UTC

[GitHub] [spark] vicennial opened a new pull request, #38889: [WIP][SPARK-41369] Refactor connect directory structure

vicennial opened a new pull request, #38889:
URL: https://github.com/apache/spark/pull/38889

   
   ### What changes were proposed in this pull request?
   Refactor the `connector/connect` directory into `connector/connect/server` and `connector/connect/common`.
   
   
   ### Why are the changes needed?
   Currently, `spark/connector/connect/` is a single module that contains both the "server" service and the protobuf definitions.
   
   Splitting it into two modules, "server" and "common", separates the protobuf generation from the core "server" module so that the generated protos can be reused efficiently.
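   For illustration, the resulting layout could look roughly like the sketch below (the comments and exact sub-paths are assumptions for illustration, not part of this PR description):
   
   ```
   connector/connect/
   ├── common/   # protobuf definitions and generated classes, reusable by clients
   │   └── src/main/protobuf/spark/connect/
   └── server/   # the Spark Connect service implementation
       └── src/main/scala/org/apache/spark/sql/connect/
   ```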
   
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   Existing tests
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] AmplabJenkins commented on pull request #38889: [SPARK-41369] Refactor connect directory structure

Posted by GitBox <gi...@apache.org>.
AmplabJenkins commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1335572079

   Can one of the admins verify this patch?




[GitHub] [spark] grundprinzip commented on pull request #38889: [SPARK-41369] Refactor connect directory structure

Posted by GitBox <gi...@apache.org>.
grundprinzip commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1336529182

   I think it would be great to add some more color to the PR description saying that you're doing this in preparation for the Scala client.
   




[GitHub] [spark] LuciferYang commented on pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects

Posted by GitBox <gi...@apache.org>.
LuciferYang commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1338674594

   The Maven build with `-Puser-defined-protoc` passed, but this SBT build:
   
   ```
   export CONNECT_PROTOC_EXEC_PATH="/${path}/protoc-3.21.9-osx-aarch_64.exe"    
   export CONNECT_PLUGIN_EXEC_PATH="/${path}/protoc-gen-grpc-java-1.47.0-osx-aarch_64.exe"
   build/sbt clean "connect-common/compile" -Puser-defined-protoc
   ```
   failed with:
   
   ```
   [error] /spark-source/connector/connect/common/target/scala-2.12/src_managed/main/org/apache/spark/connect/proto/Aggregate.java:626:1:  error: cannot find symbol
   [error]               if (!super.parseUnknownField(input, extensionRegistry, tag)) {
   [error]                         ^  symbol: method parseUnknownField(CodedInputStream,ExtensionRegistryLite,int)
   [error] ... the same "cannot find symbol" error for parseUnknownField(CodedInputStream,ExtensionRegistryLite,int) is reported for the remaining generated sources under src_managed/main/org/apache/spark/connect/proto/: Relation.java, Expression.java, RelationCommon.java, Read.java, Project.java, Filter.java, Join.java, SetOperation.java, Sort.java, Limit.java, SQL.java, LocalRelation.java, Sample.java, Offset.java, Deduplicate.java, Range.java, SubqueryAlias.java, Repartition.java, RenameColumnsBySameLengthNames.java, RenameColumnsByNameToNameMap.java, ShowString.java, Drop.java, Tail.java, WithColumns.java, Hint.java, NAFill.java, NADrop.java, NAReplace.java, StatSummary.java, StatCrosstab.java, Unknown.java, DataType.java, AnalyzePlanRequest.java, UserContext.java, Plan.java, Explain.java, Command.java, CreateScalarFunction.java, WriteOperation.java, CreateDataFrameViewCommand.java, AnalyzePlanResponse.java, ExecutePlanRequest.java, ExecutePlanResponse.java
   
   ```
   @HyukjinKwon @hvanhovell @grundprinzip 
   




[GitHub] [spark] grundprinzip commented on pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects

Posted by GitBox <gi...@apache.org>.
grundprinzip commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1338770120

   > https://github.com/vicennial/spark/actions/runs/3620662718/jobs/6103638688
   > 
   > ```
   > Finished test(python3.9): pyspark.sql.tests.connect.test_connect_function (1s) ... 5 tests were skipped
   > Starting test(python3.9): pyspark.sql.tests.connect.test_connect_plan_only (temp output: /__w/spark/spark/python/target/0ffb8674-bbe9-44dc-86ab-f68ce4eda3f9/python3.9__pyspark.sql.tests.connect.test_connect_plan_only__fnnvnv4i.log)
   > Finished test(python3.9): pyspark.sql.tests.connect.test_connect_plan_only (0s) ... 28 tests were skipped
   > Starting test(python3.9): pyspark.sql.tests.connect.test_connect_select_ops (temp output: /__w/spark/spark/python/target/e1d26883-9178-4dba-a700-7d620f9dd8e5/python3.9__pyspark.sql.tests.connect.test_connect_select_ops__afogxb26.log)
   > Finished test(python3.9): pyspark.sql.tests.connect.test_connect_select_ops (0s) ... 2 tests were skipped
   > Tests passed in 12 seconds
   > 
   > Skipped tests in pyspark.sql.tests.connect.test_connect_basic with python3.9:
   >       test_agg_with_avg (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.002s)
   >       test_agg_with_two_agg_exprs (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.001s)
   >       test_collect (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.001s)
   >       test_columns (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.002s)
   >       test_count (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.000s)
   >       test_create_global_temp_view (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.001s)
   > ```
   > 
   > this PR disabled all the tests in PySpark-Connect
   > 
   > @HyukjinKwon @grundprinzip @hvanhovell @vicennial
   
   How did this happen :(




[GitHub] [spark] LuciferYang commented on pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects

Posted by GitBox <gi...@apache.org>.
LuciferYang commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1338691198

   Found the problem; it is not related to this PR. I will open a PR to fix it.
   
   
   




[GitHub] [spark] amaliujia commented on pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects

Posted by GitBox <gi...@apache.org>.
amaliujia commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1337766852

   +1 to this refactoring. With the protos split out, clients that need the protos can now depend on them alone, without pulling in the server (which was not possible before this PR).
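   As a sketch of what this enables (the artifact ID and version below are illustrative assumptions, not the actual published coordinates), a client build could then declare a dependency on the common/proto module alone:
   
   ```xml
   <!-- Hypothetical client pom.xml fragment: pulls in only the proto/common
        module, not the Spark Connect server. -->
   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-connect-common_2.12</artifactId>
     <version>3.4.0-SNAPSHOT</version>
   </dependency>
   ```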




[GitHub] [spark] hvanhovell closed pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects

Posted by GitBox <gi...@apache.org>.
hvanhovell closed pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects
URL: https://github.com/apache/spark/pull/38889




[GitHub] [spark] zhengruifeng commented on pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects

Posted by GitBox <gi...@apache.org>.
zhengruifeng commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1338762242

   https://github.com/vicennial/spark/actions/runs/3620662718/jobs/6103638688
   
   ```
   Finished test(python3.9): pyspark.sql.tests.connect.test_connect_function (1s) ... 5 tests were skipped
   Starting test(python3.9): pyspark.sql.tests.connect.test_connect_plan_only (temp output: /__w/spark/spark/python/target/0ffb8674-bbe9-44dc-86ab-f68ce4eda3f9/python3.9__pyspark.sql.tests.connect.test_connect_plan_only__fnnvnv4i.log)
   Finished test(python3.9): pyspark.sql.tests.connect.test_connect_plan_only (0s) ... 28 tests were skipped
   Starting test(python3.9): pyspark.sql.tests.connect.test_connect_select_ops (temp output: /__w/spark/spark/python/target/e1d26883-9178-4dba-a700-7d620f9dd8e5/python3.9__pyspark.sql.tests.connect.test_connect_select_ops__afogxb26.log)
   Finished test(python3.9): pyspark.sql.tests.connect.test_connect_select_ops (0s) ... 2 tests were skipped
   Tests passed in 12 seconds
   
   Skipped tests in pyspark.sql.tests.connect.test_connect_basic with python3.9:
         test_agg_with_avg (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.002s)
         test_agg_with_two_agg_exprs (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.001s)
         test_collect (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.001s)
         test_columns (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.002s)
         test_count (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.000s)
         test_create_global_temp_view (pyspark.sql.tests.connect.test_connect_basic.SparkConnectTests) ... skip (0.001s)
   ```
   
   This PR disabled all the tests in PySpark Connect.
   
   @HyukjinKwon @grundprinzip @hvanhovell @vicennial 




[GitHub] [spark] HyukjinKwon commented on pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects

Posted by GitBox <gi...@apache.org>.
HyukjinKwon commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1338812964

   Okay, let's revert it for now. Reverted.




[GitHub] [spark] grundprinzip commented on a diff in pull request #38889: [SPARK-41369] Refactor connect directory structure

Posted by GitBox <gi...@apache.org>.
grundprinzip commented on code in PR #38889:
URL: https://github.com/apache/spark/pull/38889#discussion_r1039037147


##########
connector/connect/common/pom.xml:
##########
@@ -0,0 +1,298 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one or more
+  ~ contributor license agreements.  See the NOTICE file distributed with
+  ~ this work for additional information regarding copyright ownership.
+  ~ The ASF licenses this file to You under the Apache License, Version 2.0
+  ~ (the "License"); you may not use this file except in compliance with
+  ~ the License.  You may obtain a copy of the License at
+  ~
+  ~    http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <parent>
+        <groupId>org.apache.spark</groupId>
+        <artifactId>spark-parent_2.12</artifactId>
+        <version>3.4.0-SNAPSHOT</version>
+        <relativePath>../../../pom.xml</relativePath>
+    </parent>
+
+    <artifactId>spark-connect-common_2.12</artifactId>
+    <packaging>jar</packaging>
+    <name>Spark Project Connect Common</name>
+    <url>https://spark.apache.org/</url>
+    <properties>
+        <sbt.project.name>connect-common</sbt.project.name>
+        <guava.version>31.0.1-jre</guava.version>
+        <guava.failureaccess.version>1.0.1</guava.failureaccess.version>
+        <io.grpc.version>1.47.0</io.grpc.version>
+        <tomcat.annotations.api.version>6.0.53</tomcat.annotations.api.version>
+    </properties>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-core_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <scope>provided</scope>
+            <exclusions>
+                <exclusion>
+                    <groupId>com.google.guava</groupId>
+                    <artifactId>guava</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-core_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <type>test-jar</type>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-catalyst_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <scope>provided</scope>
+            <exclusions>
+                <exclusion>
+                    <groupId>com.google.guava</groupId>
+                    <artifactId>guava</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-sql_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <scope>provided</scope>
+            <exclusions>
+                <exclusion>
+                    <groupId>com.google.guava</groupId>
+                    <artifactId>guava</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-catalyst_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <type>test-jar</type>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-sql_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <type>test-jar</type>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-tags_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <scope>provided</scope>
+            <exclusions>
+                <exclusion>
+                    <groupId>com.google.guava</groupId>
+                    <artifactId>guava</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <!-- #if scala-2.13 --><!--
+    <dependency>
+      <groupId>org.scala-lang.modules</groupId>
+      <artifactId>scala-parallel-collections_${scala.binary.version}</artifactId>
+    </dependency>
+    --><!-- #endif scala-2.13 -->
+        <dependency>
+            <groupId>com.google.guava</groupId>

Review Comment:
   Same for the dependencies below: it would be good to figure out which ones are actually needed to produce the jar and which are not.



##########
connector/connect/common/pom.xml:
##########
@@ -0,0 +1,298 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+  ~ Licensed to the Apache Software Foundation (ASF) under one or more
+  ~ contributor license agreements.  See the NOTICE file distributed with
+  ~ this work for additional information regarding copyright ownership.
+  ~ The ASF licenses this file to You under the Apache License, Version 2.0
+  ~ (the "License"); you may not use this file except in compliance with
+  ~ the License.  You may obtain a copy of the License at
+  ~
+  ~    http://www.apache.org/licenses/LICENSE-2.0
+  ~
+  ~ Unless required by applicable law or agreed to in writing, software
+  ~ distributed under the License is distributed on an "AS IS" BASIS,
+  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+  ~ See the License for the specific language governing permissions and
+  ~ limitations under the License.
+  -->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <parent>
+        <groupId>org.apache.spark</groupId>
+        <artifactId>spark-parent_2.12</artifactId>
+        <version>3.4.0-SNAPSHOT</version>
+        <relativePath>../../../pom.xml</relativePath>
+    </parent>
+
+    <artifactId>spark-connect-common_2.12</artifactId>
+    <packaging>jar</packaging>
+    <name>Spark Project Connect Common</name>
+    <url>https://spark.apache.org/</url>
+    <properties>
+        <sbt.project.name>connect-common</sbt.project.name>
+        <guava.version>31.0.1-jre</guava.version>
+        <guava.failureaccess.version>1.0.1</guava.failureaccess.version>
+        <io.grpc.version>1.47.0</io.grpc.version>
+        <tomcat.annotations.api.version>6.0.53</tomcat.annotations.api.version>
+    </properties>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-core_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <scope>provided</scope>
+            <exclusions>
+                <exclusion>
+                    <groupId>com.google.guava</groupId>
+                    <artifactId>guava</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-core_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <type>test-jar</type>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-catalyst_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <scope>provided</scope>
+            <exclusions>
+                <exclusion>
+                    <groupId>com.google.guava</groupId>
+                    <artifactId>guava</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-sql_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <scope>provided</scope>
+            <exclusions>
+                <exclusion>
+                    <groupId>com.google.guava</groupId>
+                    <artifactId>guava</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-catalyst_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <type>test-jar</type>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-sql_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <type>test-jar</type>
+            <scope>test</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.apache.spark</groupId>
+            <artifactId>spark-tags_${scala.binary.version}</artifactId>
+            <version>${project.version}</version>
+            <scope>provided</scope>
+            <exclusions>
+                <exclusion>
+                    <groupId>com.google.guava</groupId>
+                    <artifactId>guava</artifactId>
+                </exclusion>
+            </exclusions>
+        </dependency>

Review Comment:
   I don't think any of the Spark dependencies are needed here because, if I understand this correctly, this module will only generate the code and compile a jar from it.
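   If that is right, a trimmed dependency block could be as small as the following sketch. (Illustrative only: the gRPC and tomcat version properties come from the diff above, but `${protobuf.version}` and the exact artifact set are assumptions, not part of this PR.)
   
   ```xml
   <!-- Hypothetical minimal dependencies for a module that only
        compiles generated protobuf/gRPC sources into a jar. -->
   <dependencies>
       <dependency>
           <groupId>com.google.protobuf</groupId>
           <artifactId>protobuf-java</artifactId>
           <version>${protobuf.version}</version>
       </dependency>
       <dependency>
           <groupId>io.grpc</groupId>
           <artifactId>grpc-protobuf</artifactId>
           <version>${io.grpc.version}</version>
       </dependency>
       <dependency>
           <groupId>io.grpc</groupId>
           <artifactId>grpc-stub</artifactId>
           <version>${io.grpc.version}</version>
       </dependency>
       <!-- Provides javax.annotation.Generated, referenced by
            grpc-java generated stubs at compile time. -->
       <dependency>
           <groupId>org.apache.tomcat</groupId>
           <artifactId>annotations-api</artifactId>
           <version>${tomcat.annotations.api.version}</version>
           <scope>provided</scope>
       </dependency>
   </dependencies>
   ```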





[GitHub] [spark] hvanhovell commented on pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects

Posted by GitBox <gi...@apache.org>.
hvanhovell commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1338498902

   OK, I am merging this one. The failing tests are unrelated.




[GitHub] [spark] LuciferYang commented on pull request #38889: [SPARK-41369][CONNECT][BUILD] Split connect project into common and server projects

Posted by GitBox <gi...@apache.org>.
LuciferYang commented on PR #38889:
URL: https://github.com/apache/spark/pull/38889#issuecomment-1338680892

   I'll investigate when I have time.
   
   

