Posted to commits@spark.apache.org by gu...@apache.org on 2023/02/27 00:29:10 UTC
[spark] branch branch-3.4 updated: [SPARK-42569][CONNECT][FOLLOW-UP] Throw unsupported exceptions for persist
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new f6a58343833 [SPARK-42569][CONNECT][FOLLOW-UP] Throw unsupported exceptions for persist
f6a58343833 is described below
commit f6a5834383338b882bc5da4790c2d3c2f99cb17f
Author: Rui Wang <ru...@databricks.com>
AuthorDate: Mon Feb 27 09:28:42 2023 +0900
[SPARK-42569][CONNECT][FOLLOW-UP] Throw unsupported exceptions for persist
### What changes were proposed in this pull request?
A follow-up to https://github.com/apache/spark/pull/40164 that also throws an unsupported operation exception for `persist`. For now it is acceptable to depend on `StorageLevel` from the core module, but in the future it should be refactored and moved to a common module.
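The pattern this commit applies can be sketched as follows. This is a minimal, hypothetical stand-in (not the actual Spark Connect `Dataset`): a client-side stub method that throws `UnsupportedOperationException` so callers get an explicit failure instead of silently wrong behavior.

```scala
// Hypothetical sketch of the stub pattern used in this commit:
// an unsupported API throws explicitly rather than doing nothing.
object PersistStubDemo {
  class StubDataset {
    // Mirrors the commit: persist is not yet supported on the Connect client.
    def persist(): this.type =
      throw new UnsupportedOperationException("persist is not implemented.")
  }

  def main(args: Array[String]): Unit = {
    val ds = new StubDataset
    try {
      ds.persist()
    } catch {
      case e: UnsupportedOperationException =>
        // The caller sees a clear signal that the API is unsupported.
        println(s"caught: ${e.getMessage}")
    }
  }
}
```

A caller that relied on `persist` being a no-op would previously have continued silently; with this pattern the incompatibility surfaces immediately.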
### Why are the changes needed?
Throwing an explicit exception is a better way to indicate an unsupported API.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
N/A
Closes #40172 from amaliujia/unsupported_op_2.
Authored-by: Rui Wang <ru...@databricks.com>
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
(cherry picked from commit 08675f2922e4e018e25083760a0ac7413229bc43)
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
.../client/jvm/src/main/scala/org/apache/spark/sql/Dataset.scala | 9 +++++++++
1 file changed, 9 insertions(+)
diff --git a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Dataset.scala b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Dataset.scala
index 0266120d0ed..dcc770dfe55 100644
--- a/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Dataset.scala
+++ b/connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Dataset.scala
@@ -27,6 +27,7 @@ import org.apache.spark.sql.catalyst.expressions.RowOrdering
import org.apache.spark.sql.connect.client.SparkResult
import org.apache.spark.sql.connect.common.DataTypeProtoConverter
import org.apache.spark.sql.types.{Metadata, StructType}
+import org.apache.spark.storage.StorageLevel
import org.apache.spark.util.Utils
/**
@@ -2584,6 +2585,14 @@ class Dataset[T] private[sql] (val sparkSession: SparkSession, private[sql] val
new DataFrameWriterV2[T](table, this)
}
+ def persist(): this.type = {
+ throw new UnsupportedOperationException("persist is not implemented.")
+ }
+
+ def persist(newLevel: StorageLevel): this.type = {
+ throw new UnsupportedOperationException("persist is not implemented.")
+ }
+
def unpersist(blocking: Boolean): this.type = {
throw new UnsupportedOperationException("unpersist() is not implemented.")
}