Posted to reviews@spark.apache.org by "amaliujia (via GitHub)" <gi...@apache.org> on 2023/07/31 20:02:14 UTC

[GitHub] [spark] amaliujia commented on a diff in pull request #42245: [SPARK-29497][CONNECT] Throw error when UDF is not deserializable.

amaliujia commented on code in PR #42245:
URL: https://github.com/apache/spark/pull/42245#discussion_r1279822239


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/expressions/UserDefinedFunction.scala:
##########
@@ -144,6 +146,25 @@ case class ScalarUserDefinedFunction private[sql] (
 }
 
 object ScalarUserDefinedFunction {
+  private val LAMBDA_DESERIALIZATION_ERR_MSG: String =
+    "cannot assign instance of java.lang.invoke.SerializedLambda to field"
+
+  private def checkDeserializable(bytes: Array[Byte]): Unit = {
+    try {
+      SparkSerDeUtils.deserialize(bytes, SparkClassUtils.getContextOrSparkClassLoader)
+    } catch {
+      case e: ClassCastException if e.getMessage.contains(LAMBDA_DESERIALIZATION_ERR_MSG) =>
+        throw new SparkException(
+          "UDF cannot be executed on a Spark cluster, it cannot be deserialized." +
+            "This is very likely to be caused by the UDF having a self-reference, " +
+            "this is not supported by java serialization.")

Review Comment:
   ```suggestion
           throw new SparkException(
             "UDF cannot be executed on a Spark cluster: it cannot be deserialized." +
               "This is very likely to be caused by the UDF having a self-reference. " +
               "This is not supported by java serialization.")
   ```
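[Editor's note] For context, the diff above adds an eager round-trip check so a non-deserializable UDF fails on the client instead of at task execution time on the cluster. Below is a minimal standalone sketch of that technique. `SparkSerDeUtils` and `SparkClassUtils` are Spark-internal helpers, so plain `java.io` serialization stands in for them here; the object and method names are illustrative, not the PR's actual code.

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

object DeserializationCheck {
  // Stand-in for SparkSerDeUtils.serialize: plain Java serialization to bytes.
  def serialize(obj: AnyRef): Array[Byte] = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    try oos.writeObject(obj) finally oos.close()
    bos.toByteArray
  }

  // Mirrors the diff's checkDeserializable: attempt the round-trip up front.
  // A self-referencing lambda surfaces here as a ClassCastException mentioning
  // java.lang.invoke.SerializedLambda, rather than failing later on an executor.
  def checkDeserializable(bytes: Array[Byte]): Unit = {
    val ois = new ObjectInputStream(new ByteArrayInputStream(bytes))
    try ois.readObject() finally ois.close()
  }

  def main(args: Array[String]): Unit = {
    val f: Int => Int = x => x + 1 // an ordinary, serializable Scala lambda
    val bytes = serialize(f)
    try {
      checkDeserializable(bytes)
      println("round-trip ok")
    } catch {
      case e: ClassCastException if e.getMessage.contains("SerializedLambda") =>
        println("UDF cannot be deserialized: " + e.getMessage)
    }
  }
}
```

The point of doing the check client-side is that the error carries an actionable message (the self-reference hint) instead of an opaque executor-side stack trace.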



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

