Posted to reviews@spark.apache.org by "beliefer (via GitHub)" <gi...@apache.org> on 2023/07/11 05:24:58 UTC

[GitHub] [spark] beliefer opened a new pull request, #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

beliefer opened a new pull request, #41932:
URL: https://github.com/apache/spark/pull/41932

   ### What changes were proposed in this pull request?
   https://github.com/apache/spark/pull/41687 added `call_function` and deprecated `call_udf` in the Scala API.
   
   Sometimes the function name is qualified; we should let users pass a qualified name to `call_function` so they can invoke persistent functions as well.
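   
   For illustration only (not part of the patch): `df` below is any DataFrame, and `spark_catalog.default.custom_func` is a placeholder for a persistent function created with `CREATE FUNCTION`.
   ```scala
   import org.apache.spark.sql.functions.{call_function, col}
   
   // Unqualified name: resolves builtin or temporary functions, as before.
   df.select(call_function("avg", col("a")))
   
   // Qualified name: now also resolves persistent functions.
   df.select(call_function("spark_catalog.default.custom_func", col("id")))
   ```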
   
   
   ### Why are the changes needed?
   Support qualified function name for `call_function`.
   
   
   ### Does this PR introduce _any_ user-facing change?
   'No'.
   New feature.
   
   
   ### How was this patch tested?
   New test cases.




[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267506278


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -1161,6 +1161,27 @@ class ClientE2ETestSuite extends RemoteSparkSession with SQLHelper with PrivateM
     val joined = ds1.joinWith(ds2, $"a.value._1" === $"b.value._2", "inner")
     checkSameResult(Seq((Some((2, 3)), Some((1, 2)))), joined)
   }
+
+  test("call_function") {
+    val session: SparkSession = spark
+    import session.implicits._
+    val testData = spark.range(5).repartition(1)
+    try {
+      session.sql("CREATE FUNCTION custom_sum AS 'test.org.apache.spark.sql.MyDoubleSum'")

Review Comment:
   I think it's OK not to test persistent functions in Spark Connect, as it seems hard to include the jar containing the UDF. The client-side implementation is quite simple: it constructs a small proto message, and the server side turns it into `UnresolvedFunction`. Making sure it works for builtin functions is good enough.
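   
   For illustration, a builtin-only check could look roughly like this (a sketch only; the exact helpers available in `ClientE2ETestSuite` may differ):
   ```scala
   import org.apache.spark.sql.functions.{call_function, col}
   
   test("call_function with a builtin function") {
     // avg over 0..4 is 2.0; no extra jars are needed for a builtin function.
     val df = spark.range(5).select(call_function("avg", col("id")))
     assert(df.collect().head.getDouble(0) == 2.0)
   }
   ```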





[GitHub] [spark] cloud-fan closed pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan closed pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function
URL: https://github.com/apache/spark/pull/41932




[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266233707


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   The Maven tests for `ClientE2ETestSuite` and `ReplE2ESuite` are OK, but another 68 Maven tests in the `connect-jvm-client` module failed; I will track them with a new ticket. @zhengruifeng 





[GitHub] [spark] cloud-fan commented on pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1648811039

   the failure is unrelated, merging to master/3.5, thanks!




[GitHub] [spark] zhengruifeng commented on pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1631831233

   will this also support Spark Connect?




[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267506524


##########
connector/connect/common/src/main/protobuf/spark/connect/expressions.proto:
##########
@@ -371,3 +372,11 @@ message JavaUDF {
   // (Required) Indicate if the Java user-defined function is an aggregate function
   bool aggregate = 3;
 }
+
+message CallFunction {
+  // (Required) Name of the SQL function.

Review Comment:
   ```suggestion
     // (Required) Unparsed name of the SQL function.
   ```





[GitHub] [spark] beliefer commented on pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1630165773

   ping @cloud-fan 




[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1261861475


##########
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveUDFSuite.scala:
##########
@@ -552,6 +552,16 @@ class HiveUDFSuite extends QueryTest with TestHiveSingleton with SQLTestUtils {
     }
   }
 
+  test("Invoke a persist hive function with call_function") {
+    val testData = spark.range(5).repartition(1)
+    withUserDefinedFunction("`default.custom_func`" -> true) {

Review Comment:
   This is not a persistent function. Can you check the other tests in this file? We need to use `CREATE FUNCTION` to create persistent functions.
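   
   For illustration, the shape being asked for (this mirrors the revised diff that appears later in this thread; `GenericUDAFAverage` is the Hive UDAF used there):
   ```scala
   test("Invoke a persistent Hive function with call_function") {
     val testData = spark.range(5).repartition(1)
     // isTemporary = false, so the helper drops it with a plain DROP FUNCTION afterwards.
     withUserDefinedFunction("custom_func" -> false) {
       sql(s"CREATE FUNCTION custom_func AS '${classOf[GenericUDAFAverage].getName}'")
       checkAnswer(testData.select(call_function("custom_func", $"id")), Row(2.0))
     }
   }
   ```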





[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266148725


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   run the following commands:
   ```
   build/sbt clean
   build/sbt "connect-client-jvm/test" -Phive
   ```
   
   there is one failed test:
   
   ```
   [info] - call_function *** FAILED *** (150 milliseconds)
   [info]   org.apache.spark.SparkException: [CANNOT_LOAD_FUNCTION_CLASS] Cannot load class test.org.apache.spark.sql.MyDoubleSum when registering the function `spark_catalog`.`default`.`custom_sum`, please make sure it is on the classpath.
   [info]   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   [info]   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   [info]   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   [info]   at org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:80)
   [info]   at org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:133)
   [info]   at org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:150)
   [info]   at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2813)
   [info]   at org.apache.spark.sql.Dataset.withResult(Dataset.scala:3252)
   [info]   at org.apache.spark.sql.Dataset.collect(Dataset.scala:2812)
   [info]   at org.apache.spark.sql.ClientE2ETestSuite.$anonfun$new$139(ClientE2ETestSuite.scala:1175)
   [info]   at org.apache.spark.sql.connect.client.util.RemoteSparkSession.$anonfun$test$1(RemoteSparkSession.scala:246)
   ```
   
   @beliefer we should add `(LocalProject("sql") / Test / Keys.`package`).value` to 
   
   https://github.com/apache/spark/blob/228b5dbfd7688a8efa7135d9ec7b00b71e41a38a/project/SparkBuild.scala#L875-L878
   
   then the sql test jar will be built and packaged before testing.
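   
   A rough sketch of that change (the task name and the existing entry shown here are assumptions for illustration, not the actual contents of `SparkBuild.scala`):
   ```scala
   // Package the sql module's test jar (which contains test.org.apache.spark.sql.MyDoubleSum)
   // before the connect client E2E tests launch the server.
   buildTestDeps := {
     (LocalProject("assembly") / Compile / Keys.`package`).value // existing entry (assumed)
     (LocalProject("sql") / Test / Keys.`package`).value // new: build the sql test jar
   },
   ```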





[GitHub] [spark] zhengruifeng commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1272871206


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -1161,6 +1161,27 @@ class ClientE2ETestSuite extends RemoteSparkSession with SQLHelper with PrivateM
     val joined = ds1.joinWith(ds2, $"a.value._1" === $"b.value._2", "inner")
     checkSameResult(Seq((Some((2, 3)), Some((1, 2)))), joined)
   }
+
+  test("call_function") {
+    val session: SparkSession = spark
+    import session.implicits._
+    val testData = spark.range(5).repartition(1)
+    try {
+      session.sql("CREATE FUNCTION custom_sum AS 'test.org.apache.spark.sql.MyDoubleSum'")

Review Comment:
   +1, I think we don't need to include this change in this PR





[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1271801705


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -1161,6 +1161,27 @@ class ClientE2ETestSuite extends RemoteSparkSession with SQLHelper with PrivateM
     val joined = ds1.joinWith(ds2, $"a.value._1" === $"b.value._2", "inner")
     checkSameResult(Seq((Some((2, 3)), Some((1, 2)))), joined)
   }
+
+  test("call_function") {
+    val session: SparkSession = spark
+    import session.implicits._
+    val testData = spark.range(5).repartition(1)
+    try {
+      session.sql("CREATE FUNCTION custom_sum AS 'test.org.apache.spark.sql.MyDoubleSum'")

Review Comment:
   I have communicated with @zhengruifeng and he agrees with your opinion. Let's remove the test case for Connect.





[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266122606


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/functions.scala:
##########
@@ -7926,12 +7926,13 @@ object functions {
    * Call a builtin or temp function.

Review Comment:
   `Call a SQL function`. It supports any function.





[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1263218083


##########
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveUDFSuite.scala:
##########
@@ -552,6 +552,14 @@ class HiveUDFSuite extends QueryTest with TestHiveSingleton with SQLTestUtils {
     }
   }
 
+  test("Invoke a persist hive function with call_function") {
+    val testData = spark.range(5).repartition(1)
+    withUserDefinedFunction("custom_func" -> false) {
+      sql(s"CREATE FUNCTION custom_func AS '${classOf[GenericUDAFAverage].getName}'")
+      checkAnswer(testData.select(call_function("custom_func", $"id")), Row(2.0))

Review Comment:
   can we also test calling the function with the qualified name? `spark_catalog.default.custom_func`
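   
   For example, something like this in addition to the existing assertion (sketch, reusing the setup above):
   ```scala
   checkAnswer(
     testData.select(call_function("spark_catalog.default.custom_func", $"id")),
     Row(2.0))
   ```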





[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267940632


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   Both the Maven and sbt tests are OK now, thanks @beliefer 





[GitHub] [spark] beliefer commented on pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1631701550

   The CI failure looks unrelated.




[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266122784


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##########
@@ -1880,6 +1880,12 @@ class SparkConnectPlanner(val sessionHolder: SessionHolder) extends Logging {
         Some(
           CatalystDataToProtobuf(children.head, messageClassName, binaryFileDescSetOpt, options))
 
+      case "call_function" if fun.getArgumentsCount > 1 =>

Review Comment:
   ```suggestion
         case "call_function" if fun.getArgumentsCount >= 1 =>
   ```





[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267584892


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   I think this is truly unrelated to this PR, and I think the way `--jars` is being used in the code is incorrect now.
   
   When submitting the args as
   
   ```
   --jars spark-catalyst-xx.jar
   --jars spark-connect-client-jvm-xx.jar
   --jars spark-sql-xx.jar
   ```
   
   the final effective arg will be `--jars spark-sql-xx.jar`. If we enable debug logging, we will find that only the `Added JAR` logs for `spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar` and `spark-connect_2.12-3.5.0-SNAPSHOT.jar` are present.
   
   ```
   23/07/19 14:00:34 INFO SparkContext: Added JAR file:///Users/yangjie01/SourceCode/git/spark-mine-12/sql/core/target/spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar at spark://localhost:56841/jars/spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar with timestamp 1689746434318
   23/07/19 14:00:34 INFO SparkContext: Added JAR file:/Users/yangjie01/SourceCode/git/spark-mine-12/connector/connect/server/target/spark-connect_2.12-3.5.0-SNAPSHOT.jar at spark://localhost:56841/jars/spark-connect_2.12-3.5.0-SNAPSHOT.jar with timestamp 1689746434318
   ```
   
   and the configuration item `spark.jars` also includes only these two jars.
   
   ```
   Array((spark.app.name,org.apache.spark.sql.connect.SimpleSparkConnectService), (spark.jars,file:///Users/yangjie01/SourceCode/git/spark-mine-12/sql/core/target/spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar,file:/Users/yangjie01/SourceCode/git/spark-mine-12/connector/connect/server/target/spark-connect_2.12-3.5.0-SNAPSHOT.jar), ...
   ```
   
   We should correct the usage of `--jars` to `--jars spark-catalyst-xx.jar,spark-connect-client-jvm-xx.jar,spark-sql-xx.jar`; then the Maven test should pass.
   
   I think we can merge this PR first and then fix that issue separately. But @beliefer, if you prefer, you can also address it in this one :)
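   
   A minimal sketch of the combined form (illustrative only; the surrounding code in `SparkConnectServerUtils` may shape this differently):
   ```scala
   // Pass all test jars through a single --jars flag; repeating the flag makes
   // spark-submit keep only the last occurrence.
   def jarsArgs(jars: Seq[java.io.File]): Seq[String] =
     if (jars.isEmpty) Seq.empty
     else Seq("--jars", jars.map(_.getCanonicalPath).mkString(","))
   ```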






[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266696588


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   @LuciferYang I tested with
   ```
   build/mvn clean install -DskipTests -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
   build/mvn test -pl connector/connect/client/jvm
   ```
   I reproduced this issue too. It seems to be unrelated to this PR.
   ```
   Run completed in 2 minutes, 10 seconds.
   Total number of tests run: 1101
   Suites: completed 24, aborted 0
   Tests: succeeded 1033, failed 68, canceled 0, ignored 1, pending 0
   *** 68 TESTS FAILED ***
   [INFO] ------------------------------------------------------------------------
   [INFO] BUILD FAILURE
   [INFO] ------------------------------------------------------------------------
   [INFO] Total time:  03:02 min
   [INFO] Finished at: 2023-07-18T20:20:38+08:00
   ```





[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1270137127


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -1161,6 +1161,27 @@ class ClientE2ETestSuite extends RemoteSparkSession with SQLHelper with PrivateM
     val joined = ds1.joinWith(ds2, $"a.value._1" === $"b.value._2", "inner")
     checkSameResult(Seq((Some((2, 3)), Some((1, 2)))), joined)
   }
+
+  test("call_function") {
+    val session: SparkSession = spark
+    import session.implicits._
+    val testData = spark.range(5).repartition(1)
+    try {
+      session.sql("CREATE FUNCTION custom_sum AS 'test.org.apache.spark.sql.MyDoubleSum'")

Review Comment:
   Is it really worth it? @zhengruifeng 





[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1263217835


##########
sql/core/src/main/scala/org/apache/spark/sql/functions.scala:
##########
@@ -8367,8 +8367,17 @@ object functions {
    * @since 3.5.0
    */
   @scala.annotation.varargs
-  def call_function(funcName: String, cols: Column*): Column =
-    withExpr { UnresolvedFunction(funcName, cols.map(_.expr), false) }
+  def call_function(funcName: String, cols: Column*): Column = {

Review Comment:
   can we update the API doc? we should highlight that the function name can be qualified using the SQL syntax.
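   
   A sketch of the kind of wording (it reuses the phrasing that shows up in the Connect-side diff later in this thread):
   ```scala
   /**
    * Call a SQL function.
    *
    * @param funcName
    *   function name that can be qualified using the SQL syntax, e.g. `spark_catalog.default.custom_func`
    * @param cols
    *   the expression parameters of the function
    * @since 3.5.0
    */
   ```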






[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266733928


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   I'm checking now.





[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266122931


##########
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##########
@@ -1880,6 +1880,12 @@ class SparkConnectPlanner(val sessionHolder: SessionHolder) extends Logging {
         Some(
           CatalystDataToProtobuf(children.head, messageClassName, binaryFileDescSetOpt, options))
 
+      case "call_function" if fun.getArgumentsCount > 1 =>

Review Comment:
   We should support no-arg functions as well.
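   
   For context, with the name passed as the first argument, even a no-arg SQL function arrives with exactly one argument, so the guard needs to be `>= 1`. An illustrative sketch:
   ```scala
   import org.apache.spark.sql.functions.call_function
   
   // current_database() takes no arguments; getArgumentsCount is 1 here (just the name).
   spark.range(1).select(call_function("current_database")).show()
   ```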





[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266406645


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   Re-checked Maven on master and with this PR:
   
   **Before**
   
   ```
   build/mvn clean -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
   build/mvn clean install -DskipTests -Phive
   build/mvn clean test -pl connector/connect/client/jvm
   ```
   
   ```
   Run completed in 2 minutes, 26 seconds.
   Total number of tests run: 1099
   Suites: completed 24, aborted 0
   Tests: succeeded 1099, failed 0, canceled 0, ignored 1, pending 0
   All tests passed.
   ```
   
   **After**
   
   ```
   gh pr checkout 41932
   build/mvn clean -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
   build/mvn clean install -DskipTests -Phive
   build/mvn clean test -pl connector/connect/client/jvm
   ```
   ```
   Run completed in 1 minute, 29 seconds.
   Total number of tests run: 1101
   Suites: completed 24, aborted 0
   Tests: succeeded 1033, failed 68, canceled 0, ignored 1, pending 0
   *** 68 TESTS FAILED ***
   ```
   
   A bit magical, but it seems this PR has caused other Maven test cases to fail. Could you double-check this? @beliefer 
   
   





[GitHub] [spark] zhengruifeng commented on pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1635298609

   @beliefer the branch cut is soon; shall we also support it in Spark Connect? Otherwise, the behaviors will be different.




[GitHub] [spark] zhengruifeng commented on pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1631846388

   `call_udf` and `callUDF` directly invoke `call_function`; are we also going to make them support qualified names?




[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266123243


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/functions.scala:
##########
@@ -7926,12 +7926,13 @@ object functions {
    * Call a builtin or temp function.
    *
    * @param funcName
-   *   function name
+   *   function name that can be qualified using the SQL syntax
    * @param cols
    *   the expression parameters of function
    * @since 3.5.0
    */
   @scala.annotation.varargs
-  def call_function(funcName: String, cols: Column*): Column = Column.fn(funcName, cols: _*)
+  def call_function(funcName: String, cols: Column*): Column =
+    Column.fn("call_function", lit(funcName) +: cols: _*)

Review Comment:
   shall we add a new proto message for it? Currently it may conflict with calling a temp function named `call_function`.





[GitHub] [spark] zhengruifeng commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266118099


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   @LuciferYang would you mind helping to check this part? I am not familiar with it.





[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266148725


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   run the following commands:
   ```
   build/sbt clean
   build/sbt "connect-client-jvm/test" -Phive
   ```
   
   there is one failed test:
   
   ```
   [info] - call_function *** FAILED *** (150 milliseconds)
   [info]   org.apache.spark.SparkException: [CANNOT_LOAD_FUNCTION_CLASS] Cannot load class test.org.apache.spark.sql.MyDoubleSum when registering the function `spark_catalog`.`default`.`custom_sum`, please make sure it is on the classpath.
   [info]   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.toSparkThrowable(GrpcExceptionConverter.scala:53)
   [info]   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$.convert(GrpcExceptionConverter.scala:30)
   [info]   at org.apache.spark.sql.connect.client.GrpcExceptionConverter$$anon$1.hasNext(GrpcExceptionConverter.scala:38)
   [info]   at org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:80)
   [info]   at org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:133)
   [info]   at org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:150)
   [info]   at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2813)
   [info]   at org.apache.spark.sql.Dataset.withResult(Dataset.scala:3252)
   [info]   at org.apache.spark.sql.Dataset.collect(Dataset.scala:2812)
   [info]   at org.apache.spark.sql.ClientE2ETestSuite.$anonfun$new$139(ClientE2ETestSuite.scala:1175)
   [info]   at org.apache.spark.sql.connect.client.util.RemoteSparkSession.$anonfun$test$1(RemoteSparkSession.scala:246)
   ```
   
   @beliefer we should add `(LocalProject("sql") / Test / Keys.`package`).value` to 
   
   https://github.com/apache/spark/blob/228b5dbfd7688a8efa7135d9ec7b00b71e41a38a/project/SparkBuild.scala#L875-L878
   
   then the sql test jar will be built and packaged before testing.
   
   For Maven, let me do more checking.





[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267892365


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -1161,6 +1161,27 @@ class ClientE2ETestSuite extends RemoteSparkSession with SQLHelper with PrivateM
     val joined = ds1.joinWith(ds2, $"a.value._1" === $"b.value._2", "inner")
     checkSameResult(Seq((Some((2, 3)), Some((1, 2)))), joined)
   }
+
+  test("call_function") {
+    val session: SparkSession = spark
+    import session.implicits._
+    val testData = spark.range(5).repartition(1)
+    try {
+      session.sql("CREATE FUNCTION custom_sum AS 'test.org.apache.spark.sql.MyDoubleSum'")

Review Comment:
   I have fixed the issue. Please wait for the CI





[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1260546990


##########
sql/core/src/test/scala/org/apache/spark/sql/DataFrameFunctionsSuite.scala:
##########
@@ -5918,6 +5918,9 @@ class DataFrameFunctionsSuite extends QueryTest with SharedSparkSession {
 
   test("call_function") {
     checkAnswer(testData2.select(call_function("avg", $"a")), testData2.selectExpr("avg(a)"))
+
+    spark.udf.register("default.test_avg", (i: Int) => { i + 2 })

Review Comment:
   Can we add a test in `HiveUDFSuite` to make sure we can invoke a persistent Hive function?





[GitHub] [spark] beliefer commented on pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1631897783

   > `call_udf` and `callUDF` directly invoke `call_function`, are we going to also make them support qualified name?
   
   Considering the end users, I think we should keep the behavior of `call_udf` unchanged.




[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1261919427


##########
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveUDFSuite.scala:
##########
@@ -552,6 +552,16 @@ class HiveUDFSuite extends QueryTest with TestHiveSingleton with SQLTestUtils {
     }
   }
 
+  test("Invoke a persist hive function with call_function") {
+    val testData = spark.range(5).repartition(1)
+    withUserDefinedFunction("`default.custom_func`" -> true) {

Review Comment:
   Got it.





[GitHub] [spark] beliefer commented on pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1635368066

   > @beliefer branch cut is soon, shall we also support it in Spark Connect? Otherwise, the behaviors will be different
   
   It's better to support it too.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266174289


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/functions.scala:
##########
@@ -7926,12 +7926,13 @@ object functions {
    * Call a builtin or temp function.
    *
    * @param funcName
-   *   function name
+   *   function name that can be qualified using the SQL syntax
    * @param cols
    *   the expression parameters of function
    * @since 3.5.0
    */
   @scala.annotation.varargs
-  def call_function(funcName: String, cols: Column*): Column = Column.fn(funcName, cols: _*)
+  def call_function(funcName: String, cols: Column*): Column =
+    Column.fn("call_function", lit(funcName) +: cols: _*)

Review Comment:
   Yeah. Good suggestion.
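   For context, a hedged usage sketch of what this client-side change enables (the persistent function name is hypothetical):
   
   ```scala
   import org.apache.spark.sql.functions.{call_function, col}
   
   val df = spark.range(5)
   
   // The qualified name travels to the server as a string-literal argument of the
   // call_function SQL function and is parsed there.
   df.select(call_function("default.custom_avg", col("id")))
   
   // Builtin functions keep working through the same path.
   df.select(call_function("avg", col("id")))
   ```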



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267584892


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   I think this is truly unrelated to this PR, and I think the way `--jars` is being used in the code is incorrect now.
   
   When submitting the args as
   
   ```
   --jars spark-catalyst-xx.jar
   --jars spark-connect-client-jvm-xx.jar
   --jars spark-sql-xx.jar
   ```
   
   the final effective arg will be `--jars spark-sql-xx.jar`. If we enable debug logging, we will find that only the "Added JAR" logs for "spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar" and "spark-connect_2.12-3.5.0-SNAPSHOT.jar" are present.
   
   ```
   23/07/19 14:00:34 INFO SparkContext: Added JAR file:///Users/yangjie01/SourceCode/git/spark-mine-12/sql/core/target/spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar at spark://localhost:56841/jars/spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar with timestamp 1689746434318
   23/07/19 14:00:34 INFO SparkContext: Added JAR file:/Users/yangjie01/SourceCode/git/spark-mine-12/connector/connect/server/target/spark-connect_2.12-3.5.0-SNAPSHOT.jar at spark://localhost:56841/jars/spark-connect_2.12-3.5.0-SNAPSHOT.jar with timestamp 1689746434318
   ```
   
   and the configuration item “spark.jars” also only includes these two jars.
   
   ```
   Array((spark.app.name,org.apache.spark.sql.connect.SimpleSparkConnectService), (spark.jars,file:///Users/yangjie01/SourceCode/git/spark-mine-12/sql/core/target/spark-sql_2.12-3.5.0-SNAPSHOT-tests.jar,file:/Users/yangjie01/SourceCode/git/spark-mine-12/connector/connect/server/target/spark-connect_2.12-3.5.0-SNAPSHOT.jar), ...
   ```
   
   We should correct the usage to a single `--jars spark-catalyst-xx.jar,spark-connect-client-jvm-xx.jar,spark-sql-xx.jar` argument; then the Maven test should pass.
   
   I think we can merge this PR first and then fix this issue separately. But @beliefer, if you prefer, you can also address it in this one :)
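   A rough sketch of the suggested fix (variable names here are illustrative, not the actual ones in RemoteSparkSession.scala):
   
   ```scala
   // Illustrative only: later --jars flags override earlier ones in spark-submit,
   // so collect every extra test jar first and emit one comma-separated flag.
   val extraTestJars: Seq[java.io.File] = Seq(
     catalystTestJar,          // assumed handles to the jars mentioned above
     connectClientTestJar,
     sqlTestJar)
   
   val jarConfigs: Seq[String] =
     if (extraTestJars.isEmpty) Seq.empty
     else Seq("--jars", extraTestJars.map(_.getCanonicalPath).mkString(","))
   ```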
   
   
   
   
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267606644


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   @LuciferYang Thank you for the investigation. I will take care of it.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266712240


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   > @LuciferYang I tested with
   > 
   > ```
   > build/mvn clean install -DskipTests -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
   > build/mvn test -pl connector/connect/client/jvm
   > ```
   > 
   > I reproduced this issue too. It seems unrelated to this PR.
   > 
   > ```
   > Run completed in 2 minutes, 10 seconds.
   > Total number of tests run: 1101
   > Suites: completed 24, aborted 0
   > Tests: succeeded 1033, failed 68, canceled 0, ignored 1, pending 0
   > *** 68 TESTS FAILED ***
   > [INFO] ------------------------------------------------------------------------
   > [INFO] BUILD FAILURE
   > [INFO] ------------------------------------------------------------------------
   > [INFO] Total time:  03:02 min
   > [INFO] Finished at: 2023-07-18T20:20:38+08:00
   > ```
   
   master branch?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1261861049


##########
sql/core/src/main/scala/org/apache/spark/sql/functions.scala:
##########
@@ -8368,8 +8368,13 @@ object functions {
    * @since 3.5.0
    */
   @scala.annotation.varargs
-  def call_function(funcName: String, cols: Column*): Column =
-    withExpr { UnresolvedFunction(funcName, cols.map(_.expr), false) }
+  def call_function(funcName: String, cols: Column*): Column = withExpr {

Review Comment:
   shall we add a private method that takes a `Seq[String]`, so that we can call it if a method does not want to support qualified names?
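   A minimal sketch of that suggestion, assuming the parser is taken from the active session (the exact wiring in the PR may differ):
   
   ```scala
   // Public API: parse the (possibly quoted and/or qualified) name into parts.
   @scala.annotation.varargs
   def call_function(funcName: String, cols: Column*): Column = {
     val parser = SparkSession.getActiveSession
       .map(_.sessionState.sqlParser)
       .getOrElse(new SparkSqlParser())
     call_function(parser.parseMultipartIdentifier(funcName), cols: _*)
   }
   
   // Private overload for callers such as call_udf that must not interpret dots
   // as qualifiers: the name parts are taken as given.
   private def call_function(nameParts: Seq[String], cols: Column*): Column =
     withExpr { UnresolvedFunction(nameParts, cols.map(_.expr), isDistinct = false) }
   ```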



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266953465


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   To clarify, in my local testing:
   - master branch: All tests passed.
   - with this PR: 68 TESTS FAILED



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266323060


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   hmm... please wait a moment, let me check this again.
   
   
   
    



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266121784


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/functions.scala:
##########
@@ -7926,12 +7926,13 @@ object functions {
    * Call a builtin or temp function.
    *
    * @param funcName
-   *   function name
+   *   function name that can be qualified using the SQL syntax

Review Comment:
   ```suggestion
      *   function name that follows the SQL identifier syntax (can be quoted, can be qualified)
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266158593


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   @LuciferYang Thank you for your reminder. I will add it.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] zhengruifeng commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266247126


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   @LuciferYang  Thank you!
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267967007


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   Thank you for the double check. @LuciferYang 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1260637541


##########
sql/core/src/test/scala/org/apache/spark/sql/DataFrameFunctionsSuite.scala:
##########
@@ -5918,6 +5918,9 @@ class DataFrameFunctionsSuite extends QueryTest with SharedSparkSession {
 
   test("call_function") {
     checkAnswer(testData2.select(call_function("avg", $"a")), testData2.selectExpr("avg(a)"))
+
+    spark.udf.register("default.test_avg", (i: Int) => { i + 2 })

Review Comment:
   OK



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] beliefer commented on pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1648867502

   @cloud-fan @zhengruifeng @LuciferYang Thank you all!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] beliefer commented on pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on PR #41932:
URL: https://github.com/apache/spark/pull/41932#issuecomment-1643189573

   The CI failure is unrelated to this PR.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267506278


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -1161,6 +1161,27 @@ class ClientE2ETestSuite extends RemoteSparkSession with SQLHelper with PrivateM
     val joined = ds1.joinWith(ds2, $"a.value._1" === $"b.value._2", "inner")
     checkSameResult(Seq((Some((2, 3)), Some((1, 2)))), joined)
   }
+
+  test("call_function") {
+    val session: SparkSession = spark
+    import session.implicits._
+    val testData = spark.range(5).repartition(1)
+    try {
+      session.sql("CREATE FUNCTION custom_sum AS 'test.org.apache.spark.sql.MyDoubleSum'")

Review Comment:
   I think it's OK to not test persistent functions in Spark Connect, as it seems hard to include the jar containing the UDF. The client-side implementation is quite simple: it constructs a small proto message, and the server side turns it into `UnresolvedFunction`. Making sure it works for a builtin function is good enough.
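   A hedged sketch of a builtin-only test along those lines (assuming `org.apache.spark.sql.functions._` is in scope; the assertion style is simplified):
   
   ```scala
   test("call_function with a builtin function") {
     // Exercises only the client-side proto construction; no extra jar needed.
     val result = spark.range(5)
       .select(call_function("sum", col("id")))
       .collect()
     assert(result.head.getLong(0) == 10L)
   }
   ```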



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267507346


##########
sql/core/src/main/scala/org/apache/spark/sql/functions.scala:
##########
@@ -8357,18 +8357,27 @@ object functions {
    */
   @scala.annotation.varargs
   def call_udf(udfName: String, cols: Column*): Column =
-    call_function(udfName, cols: _*)
+    call_function(Seq(udfName), cols: _*)
 
   /**
-   * Call a builtin or temp function.
+   * Call a SQL function.
    *
-   * @param funcName function name
+   * @param funcName function name that can be qualified using the SQL syntax

Review Comment:
   let's make sure the docs are consistent in all places.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] cloud-fan commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1267507172


##########
python/pyspark/sql/functions.py:
##########
@@ -14395,15 +14395,15 @@ def call_udf(udfName: str, *cols: "ColumnOrName") -> Column:
 
 
 @try_remote_functions
-def call_function(udfName: str, *cols: "ColumnOrName") -> Column:
+def call_function(funcName: str, *cols: "ColumnOrName") -> Column:
     """
     Call a builtin or temp function.

Review Comment:
   can we update the doc in all places?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][PYTHON][CONNECT][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266406645


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   Re-checked Maven on master and with this PR:
   
   **Before**
   
   ```
   build/mvn clean -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
   build/mvn clean install -DskipTests -Phive
   build/mvn clean test -pl connector/connect/client/jvm
   ```
   
   ```
   Run completed in 2 minutes, 26 seconds.
   Total number of tests run: 1099
   Suites: completed 24, aborted 0
   Tests: succeeded 1099, failed 0, canceled 0, ignored 1, pending 0
   All tests passed.
   ```
   
   **After**
   
   ```
   gh pr checkout 41932
   build/mvn clean -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
   build/mvn clean install -DskipTests -Phive
   build/mvn clean test -pl connector/connect/client/jvm
   ```
   
   ```
   Run completed in 1 minute, 29 seconds.
   Total number of tests run: 1101
   Suites: completed 24, aborted 0
   Tests: succeeded 1033, failed 68, canceled 0, ignored 1, pending 0
   *** 68 TESTS FAILED ***
   ```
   
   A bit magical, but it seems this PR has caused other Maven test cases to fail. Could you double check this? @beliefer
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] beliefer commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "beliefer (via GitHub)" <gi...@apache.org>.
beliefer commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1264878826


##########
sql/core/src/main/scala/org/apache/spark/sql/functions.scala:
##########
@@ -8367,8 +8367,17 @@ object functions {
    * @since 3.5.0
    */
   @scala.annotation.varargs
-  def call_function(funcName: String, cols: Column*): Column =
-    withExpr { UnresolvedFunction(funcName, cols.map(_.expr), false) }
+  def call_function(funcName: String, cols: Column*): Column = {

Review Comment:
   Done.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] LuciferYang commented on a diff in pull request #41932: [SPARK-44131][SQL][FOLLOWUP] Support qualified function name for call_function

Posted by "LuciferYang (via GitHub)" <gi...@apache.org>.
LuciferYang commented on code in PR #41932:
URL: https://github.com/apache/spark/pull/41932#discussion_r1266123212


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/util/RemoteSparkSession.scala:
##########
@@ -139,7 +139,18 @@ object SparkConnectServerUtils {
       .map(clientTestJar => Seq("--jars", clientTestJar.getCanonicalPath))
       .getOrElse(Seq.empty)
 
-    writerV2Configs ++ hiveTestConfigs ++ udfTestConfigs

Review Comment:
   ok



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org