Posted to commits@spark.apache.org by we...@apache.org on 2020/03/31 17:37:25 UTC
[spark] branch branch-3.0 updated: [SPARK-31010][SQL][FOLLOW-UP]
Add Java UDF suggestion in error message of untyped Scala UDF
This is an automated email from the ASF dual-hosted git repository.
wenchen pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.0 by this push:
new 207344d [SPARK-31010][SQL][FOLLOW-UP] Add Java UDF suggestion in error message of untyped Scala UDF
207344d is described below
commit 207344d0da86496b377c2c5f5ad613c6d02f4c33
Author: yi.wu <yi...@databricks.com>
AuthorDate: Tue Mar 31 17:35:26 2020 +0000
[SPARK-31010][SQL][FOLLOW-UP] Add Java UDF suggestion in error message of untyped Scala UDF
### What changes were proposed in this pull request?
Added a Java UDF suggestion to the error message of the untyped Scala UDF.
### Why are the changes needed?
To help users migrate their use cases from the deprecated untyped Scala UDF to other supported UDF APIs.
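To illustrate the pitfall that motivated the deprecation, here is a hedged sketch (it assumes a live `SparkSession`, so it only runs inside a Spark application or shell): the untyped variant erases the input type, so Spark cannot tell the closure takes a primitive `Int` and may pass null straight through, which the closure then sees as the Java default value.

```scala
import org.apache.spark.sql.functions.udf
import org.apache.spark.sql.types.IntegerType

// Deprecated untyped API: the input type is not captured, so for a null
// input the closure observes 0 (the default value of a primitive Int).
// In Spark 3.0 this call now fails with the error message shown below
// unless the legacy config is enabled.
val untyped = udf((x: Int) => x, IntegerType)

// Typed API: the input type is captured via an encoder, so Spark can
// handle null inputs correctly instead of silently substituting 0.
val typed = udf((x: Int) => x)
```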
### Does this PR introduce any user-facing change?
No. It hasn't been released yet.
### How was this patch tested?
Pass Jenkins.
Closes #28070 from Ngone51/spark_31010.
Authored-by: yi.wu <yi...@databricks.com>
Signed-off-by: Wenchen Fan <we...@databricks.com>
(cherry picked from commit 590b9a0132b68d9523e663997def957b2e46dfb1)
Signed-off-by: Wenchen Fan <we...@databricks.com>
---
sql/core/src/main/scala/org/apache/spark/sql/functions.scala | 10 +++++++---
1 file changed, 7 insertions(+), 3 deletions(-)
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/functions.scala b/sql/core/src/main/scala/org/apache/spark/sql/functions.scala
index fd4e77f..782be98 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/functions.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/functions.scala
@@ -4841,9 +4841,13 @@ object functions {
"information. Spark may blindly pass null to the Scala closure with primitive-type " +
"argument, and the closure will see the default value of the Java type for the null " +
"argument, e.g. `udf((x: Int) => x, IntegerType)`, the result is 0 for null input. " +
- "You could use typed Scala UDF APIs (e.g. `udf((x: Int) => x)`) to avoid this problem, " +
- s"or set ${SQLConf.LEGACY_ALLOW_UNTYPED_SCALA_UDF.key} to true and use this API with " +
- s"caution."
+ "To get rid of this error, you could:\n" +
+ "1. use typed Scala UDF APIs, e.g. `udf((x: Int) => x)`\n" +
+ "2. use Java UDF APIs, e.g. `udf(new UDF1[String, Integer] { " +
+ "override def call(s: String): Integer = s.length() }, IntegerType)`, " +
+ "if input types are all non primitive\n" +
+ s"3. set ${SQLConf.LEGACY_ALLOW_UNTYPED_SCALA_UDF.key} to true and " +
+ s"use this API with caution"
throw new AnalysisException(errorMsg)
}
SparkUserDefinedFunction(f, dataType, inputEncoders = Nil)
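For reference, the two migration paths suggested by the new error message can be sketched as follows (a hedged example assuming a live `SparkSession` named `spark`; the null handling inside `call` is illustrative, not part of the committed change):

```scala
import org.apache.spark.sql.api.java.UDF1
import org.apache.spark.sql.functions.udf
import org.apache.spark.sql.types.IntegerType

// Option 2 from the message: a Java UDF whose input type is a boxed
// (non-primitive) class, so null inputs arrive as null rather than
// degrading to a primitive default.
val strLen = udf(new UDF1[String, Integer] {
  override def call(s: String): Integer =
    if (s == null) null else s.length
}, IntegerType)

// Option 3 from the message: opt back into the old untyped behavior,
// with caution (config key as named by SQLConf.LEGACY_ALLOW_UNTYPED_SCALA_UDF):
// spark.conf.set("spark.sql.legacy.allowUntypedScalaUDF", "true")
```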
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org