Posted to reviews@spark.apache.org by "cashmand (via GitHub)" <gi...@apache.org> on 2024/03/06 18:34:45 UTC

[PR] [SPARK-45827] Move data type checks to CreatableRelationProvider [spark]

cashmand opened a new pull request, #45409:
URL: https://github.com/apache/spark/pull/45409

   <!--
   Thanks for sending a pull request!  Here are some tips for you:
     1. If this is your first time, please read our contributor guidelines: https://spark.apache.org/contributing.html
     2. Ensure you have added or run the appropriate tests for your PR: https://spark.apache.org/developer-tools.html
     3. If the PR is unfinished, add '[WIP]' in your PR title, e.g., '[WIP][SPARK-XXXX] Your PR title ...'.
     4. Be sure to keep the PR description updated to reflect all changes.
     5. Please write your PR title to summarize what this PR proposes.
     6. If possible, provide a concise example to reproduce the issue for a faster review.
     7. If you want to add a new configuration, please read the guideline first for naming configurations in
        'core/src/main/scala/org/apache/spark/internal/config/ConfigEntry.scala'.
     8. If you want to add or modify an error type or message, please read the guideline first in
        'common/utils/src/main/resources/error/README.md'.
   -->
   
   ### What changes were proposed in this pull request?
   
   In DataSource.scala, there are checks to prevent writing Variant and Interval types to a `CreatableRelationProvider`. This PR unifies these checks into a method on `CreatableRelationProvider` that data sources can override to specify a different set of supported data types.
   
   ### Why are the changes needed?
   
   Allows data sources to specify what types they support, while providing a sensible default for most data sources.
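   The override pattern can be sketched in isolation like this. Note this is a toy ADT for illustration only, not Spark's actual `DataType` hierarchy or the `CreatableRelationProvider` trait from this PR:

   ```scala
   // Toy stand-ins for Spark's types, for illustration only.
   sealed trait DataType
   case object IntegerType extends DataType
   case object VariantType extends DataType

   trait TypeCheckingProvider {
     // Sensible default shipped by the trait: reject Variant, accept the rest.
     def supportsDataType(dt: DataType): Boolean = dt match {
       case VariantType => false
       case _           => true
     }
   }

   // A hypothetical source that overrides the default to also accept Variant.
   object PermissiveSource extends TypeCheckingProvider {
     override def supportsDataType(dt: DataType): Boolean = true
   }
   ```

   Sources that are happy with the default inherit it unchanged; only sources with a different supported-type set need to override.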
   
   ### Does this PR introduce _any_ user-facing change?
   
   The error message for Variant and Interval is now shared and is a bit more generic. The intent is to otherwise not have any user-facing change.
   
   ### How was this patch tested?
   
   Unit tests added.
   
   ### Was this patch authored or co-authored using generative AI tooling?
   
   No.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


Re: [PR] [SPARK-45827] Move data type checks to CreatableRelationProvider [spark]

Posted by "cashmand (via GitHub)" <gi...@apache.org>.
cashmand commented on PR #45409:
URL: https://github.com/apache/spark/pull/45409#issuecomment-1982352373

   @cloud-fan Thanks for the review! I've updated with your feedback.




Re: [PR] [SPARK-45827] Move data type checks to CreatableRelationProvider [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #45409:
URL: https://github.com/apache/spark/pull/45409#discussion_r1515582120


##########
sql/core/src/main/scala/org/apache/spark/sql/sources/interfaces.scala:
##########
@@ -175,6 +175,25 @@ trait CreatableRelationProvider {
       mode: SaveMode,
       parameters: Map[String, String],
       data: DataFrame): BaseRelation
+
+  /**
+   * Check if the relation supports the given data type.
+   *
+   * @param dt Data type to check
+   * @return True if the data type is supported

Review Comment:
   let's add `@since 4.0.0`





Re: [PR] [SPARK-45827] Move data type checks to CreatableRelationProvider [spark]

Posted by "cashmand (via GitHub)" <gi...@apache.org>.
cashmand commented on code in PR #45409:
URL: https://github.com/apache/spark/pull/45409#discussion_r1515515440


##########
sql/core/src/main/scala/org/apache/spark/sql/sources/interfaces.scala:
##########
@@ -175,6 +175,32 @@ trait CreatableRelationProvider {
       mode: SaveMode,
       parameters: Map[String, String],
       data: DataFrame): BaseRelation
+
+  /**
+   * Check if the relation supports the given data type.
+   *
+   * @param dt Data type to check
+   * @return True if the data type is supported
+   */
+  def supportsDataType(
+      dt: DataType
+  ): Boolean = {
+    dt match {
+      case ArrayType(e, _) => supportsDataType(e)
+      case MapType(k, v, _) =>
+        supportsDataType(k) && supportsDataType(v)
+      case StructType(fields) => fields.forall(f => supportsDataType(f.dataType))
+      case udt: UserDefinedType[_] => supportsDataType(udt.sqlType)
+      case _: AnsiIntervalType | CalendarIntervalType | VariantType => false
+      case BinaryType | BooleanType | ByteType | CalendarIntervalType | CharType(_) | DateType |
+           DayTimeIntervalType(_, _) | _ : DecimalType |  DoubleType | FloatType |

Review Comment:
   Oh, that was an accident. I'll remove from this list. I can have it only list supported types.





Re: [PR] [SPARK-45827] Move data type checks to CreatableRelationProvider [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #45409:
URL: https://github.com/apache/spark/pull/45409#discussion_r1515426129


##########
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/SaveIntoDataSourceCommandSuite.scala:
##########
@@ -71,6 +71,39 @@ class SaveIntoDataSourceCommandSuite extends QueryTest with SharedSparkSession {
 
     FakeV1DataSource.data = null
   }
+
+  test("Data type support") {
+
+    val dataSource = DataSource(
+      sparkSession = spark,
+      className = "jdbc",
+      partitionColumns = Nil,
+      options = Map())
+
+    val df = spark.range(1).selectExpr(
+        "cast('a' as binary) a", "true b", "cast(1 as byte) c", "1.23 d")
+    dataSource.planForWriting(SaveMode.ErrorIfExists, df.logicalPlan)
+
+    withSQLConf("spark.databricks.variant.enabled" -> "true") {
+      // Variant and Interval types are disallowed by default.
+      val unsupportedTypes = Seq(
+          "parse_json('1') v",
+          "array(parse_json('1'))",
+          "struct(1, parse_json('1')) s",
+          "map(1, parse_json('1')) s",
+          "INTERVAL '1' MONTH i",
+          "make_ym_interval(1, 2) ym",
+          "make_dt_interval(1, 2, 3, 4) dt")
+
+      unsupportedTypes.foreach { expr =>
+        val df = spark.range(1).selectExpr(expr)
+        val e = intercept[AnalysisException] {
+          dataSource.planForWriting(SaveMode.ErrorIfExists, df.logicalPlan)
+        }
+        assert(e.getMessage.contains("UNSUPPORTED_DATA_TYPE_FOR_DATASOURCE"))

Review Comment:
   let's use `checkError` for it.





Re: [PR] [SPARK-45827][SQL] Move data type checks to CreatableRelationProvider [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #45409:
URL: https://github.com/apache/spark/pull/45409#issuecomment-1983973892

   thanks, merging to master!




Re: [PR] [SPARK-45827] Move data type checks to CreatableRelationProvider [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #45409:
URL: https://github.com/apache/spark/pull/45409#discussion_r1515423962


##########
sql/core/src/main/scala/org/apache/spark/sql/sources/interfaces.scala:
##########
@@ -175,6 +175,32 @@ trait CreatableRelationProvider {
       mode: SaveMode,
       parameters: Map[String, String],
       data: DataFrame): BaseRelation
+
+  /**
+   * Check if the relation supports the given data type.
+   *
+   * @param dt Data type to check
+   * @return True if the data type is supported
+   */
+  def supportsDataType(
+      dt: DataType
+  ): Boolean = {
+    dt match {
+      case ArrayType(e, _) => supportsDataType(e)
+      case MapType(k, v, _) =>
+        supportsDataType(k) && supportsDataType(v)
+      case StructType(fields) => fields.forall(f => supportsDataType(f.dataType))
+      case udt: UserDefinedType[_] => supportsDataType(udt.sqlType)
+      case _: AnsiIntervalType | CalendarIntervalType | VariantType => false
+      case BinaryType | BooleanType | ByteType | CalendarIntervalType | CharType(_) | DateType |
+           DayTimeIntervalType(_, _) | _ : DecimalType |  DoubleType | FloatType |

Review Comment:
   It's confusing to have the interval types in both the true and false case matches. Shall we stick with the allowlist approach and only list the supported types?
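   The allowlist shape being suggested can be sketched self-contained. Note this uses a toy ADT, not Spark's `DataType` classes, and the set of "supported" leaf types here is purely illustrative:

   ```scala
   // Toy stand-ins for Spark's types, for illustration only.
   sealed trait DataType
   case object IntegerType extends DataType
   case object StringType extends DataType
   case object VariantType extends DataType
   final case class ArrayType(element: DataType) extends DataType
   final case class StructType(fields: Seq[DataType]) extends DataType

   def supportsDataType(dt: DataType): Boolean = dt match {
     // Recurse into container types.
     case ArrayType(e)       => supportsDataType(e)
     case StructType(fields) => fields.forall(supportsDataType)
     // Allowlist of supported leaf types...
     case IntegerType | StringType => true
     // ...and everything not listed is unsupported, with no second enumeration.
     case _ => false
   }
   ```

   With a single allowlist plus a catch-all `case _ => false`, no type can appear on both sides of the match, which avoids the ambiguity flagged above.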





Re: [PR] [SPARK-45827][SQL] Move data type checks to CreatableRelationProvider [spark]

Posted by "cashmand (via GitHub)" <gi...@apache.org>.
cashmand commented on code in PR #45409:
URL: https://github.com/apache/spark/pull/45409#discussion_r1516216789


##########
sql/core/src/main/scala/org/apache/spark/sql/sources/interfaces.scala:
##########
@@ -175,6 +175,25 @@ trait CreatableRelationProvider {
       mode: SaveMode,
       parameters: Map[String, String],
       data: DataFrame): BaseRelation
+
+  /**
+   * Check if the relation supports the given data type.
+   *
+   * @param dt Data type to check
+   * @return True if the data type is supported

Review Comment:
   Done.





Re: [PR] [SPARK-45827][SQL] Move data type checks to CreatableRelationProvider [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan closed pull request #45409: [SPARK-45827][SQL] Move data type checks to CreatableRelationProvider
URL: https://github.com/apache/spark/pull/45409




Re: [PR] [SPARK-45827] Move data type checks to CreatableRelationProvider [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #45409:
URL: https://github.com/apache/spark/pull/45409#discussion_r1515421569


##########
sql/core/src/main/scala/org/apache/spark/sql/sources/interfaces.scala:
##########
@@ -175,6 +175,32 @@ trait CreatableRelationProvider {
       mode: SaveMode,
       parameters: Map[String, String],
       data: DataFrame): BaseRelation
+
+  /**
+   * Check if the relation supports the given data type.
+   *
+   * @param dt Data type to check
+   * @return True if the data type is supported
+   */
+  def supportsDataType(
+      dt: DataType
+  ): Boolean = {
+    dt match {
+      case ArrayType(e, _) => supportsDataType(e)
+      case MapType(k, v, _) =>
+        supportsDataType(k) && supportsDataType(v)

Review Comment:
   ```suggestion
         case MapType(k, v, _) => supportsDataType(k) && supportsDataType(v)
   ```





Re: [PR] [SPARK-45827] Move data type checks to CreatableRelationProvider [spark]

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #45409:
URL: https://github.com/apache/spark/pull/45409#discussion_r1515421350


##########
sql/core/src/main/scala/org/apache/spark/sql/sources/interfaces.scala:
##########
@@ -175,6 +175,32 @@ trait CreatableRelationProvider {
       mode: SaveMode,
       parameters: Map[String, String],
       data: DataFrame): BaseRelation
+
+  /**
+   * Check if the relation supports the given data type.
+   *
+   * @param dt Data type to check
+   * @return True if the data type is supported
+   */
+  def supportsDataType(
+      dt: DataType
+  ): Boolean = {

Review Comment:
   ```suggestion
     def supportsDataType(dt: DataType): Boolean = {
   ```


