Posted to reviews@spark.apache.org by "aokolnychyi (via GitHub)" <gi...@apache.org> on 2023/03/06 21:51:19 UTC

[GitHub] [spark] aokolnychyi opened a new pull request, #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

aokolnychyi opened a new pull request, #40308:
URL: https://github.com/apache/spark/pull/40308

   
   ### What changes were proposed in this pull request?
   
   This PR adds a rule to align UPDATE assignments with table attributes.
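
   To illustrate what the new rule produces (a minimal sketch using the `AssignmentUtils.alignAssignments` helper this PR adds; the two-column table is made up):
   
   ```scala
   import org.apache.spark.sql.catalyst.analysis.AssignmentUtils
   import org.apache.spark.sql.catalyst.expressions.{AttributeReference, Literal}
   import org.apache.spark.sql.catalyst.plans.logical.Assignment
   import org.apache.spark.sql.types.{IntegerType, LongType}
   
   // a table with columns c1 INT, c2 LONG and the statement UPDATE t SET c2 = 1
   val c1 = AttributeReference("c1", IntegerType)()
   val c2 = AttributeReference("c2", LongType)()
   
   val aligned = AssignmentUtils.alignAssignments(
     attrs = Seq(c1, c2),
     assignments = Seq(Assignment(c2, Literal(1L))))
   // aligned is (c1 = c1, c2 = 1): every column receives an expression, and
   // columns without a matching assignment keep their current value
   ```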
   
   ### Why are the changes needed?
   
   These changes are needed so that we can rewrite UPDATE statements into executable plans for tables that support row-level operations.
   
   ### Does this PR introduce _any_ user-facing change?
   
   No.
   
   ### How was this patch tested?
   
   This PR comes with tests.
   



[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1136474666


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   Okay, since there is no more feedback, I'll follow up with a PR to migrate INSERT validation to runtime checks. After that, I'll come back to this PR. That way, we will have consistent behavior in all operations.
   
   Thanks for the discussion, @cloud-fan @dongjoon-hyun @huaxingao @viirya!




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1129875106


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I agree it varies from database to database, but why do we have different validation behavior for regular inserts? I am a bit worried we would end up with the same problem as V1 sources, where validation differed between SQL and DataFrame writes. It felt to me that V2 commands should behave in the same way.
   
   It is not a blocker for this PR, but I'd be curious to understand our long-term plan. Do we plan to switch to runtime checks for regular inserts too?




[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1129825024


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   It's more common in other databases that, if you write values to a non-nullable column, a runtime null check is applied instead of rejecting the write because the value is nullable (we can treat it like a NOT NULL column constraint).
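   
   For illustration (a minimal sketch, not part of this PR; `AssertNotNull` is the existing Catalyst expression Spark uses for this kind of runtime check):
   
   ```scala
   import org.apache.spark.sql.catalyst.expressions.AttributeReference
   import org.apache.spark.sql.catalyst.expressions.objects.AssertNotNull
   import org.apache.spark.sql.types.IntegerType
   
   // instead of rejecting the write at analysis time because `value` is
   // nullable, wrap it so that a null fails when the row is produced
   val value = AttributeReference("value", IntegerType, nullable = true)()
   val checked = AssertNotNull(value) // non-nullable; throws at runtime on null
   ```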




[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1161840413


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   A few ideas to make the code more robust (a rough sketch of steps 2-3 follows below):
   1. I think it's better to operate on the resolved column expressions instead of converting them back to `Seq[String]`.
   2. Given the parser rule for the UPDATE command, the column expression can only be an `AttributeReference` or an access into (an array of) struct fields. We can group by `expr.references.head` to get a map from each `AttributeReference` to its key expressions and the corresponding update expressions.
   3. We validate the map from step 2: for each top-level column, the key expressions must have the same tree height (to avoid updating both 'a.b' and 'a.b.c') and must be different from each other.
   4. Now it's easy to build the new update expressions: for each top-level column, if it doesn't have a match in the map, use the actual column value as the update expression, else ... (same algorithm below)
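   
   A rough sketch of steps 2-3 (assumptions: every resolved key is an `AttributeReference` or nested field extractions over one; the helper names here are made up):
   
   ```scala
   import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression}
   import org.apache.spark.sql.catalyst.plans.logical.Assignment
   
   // step 2: group assignments by the single attribute each key references
   def groupByTopLevelColumn(
       assignments: Seq[Assignment]): Map[Attribute, Seq[Assignment]] =
     assignments.groupBy(_.key.references.head)
   
   def treeHeight(e: Expression): Int =
     if (e.children.isEmpty) 1 else 1 + e.children.map(treeHeight).max
   
   // step 3: keys under one column must have the same height and be distinct,
   // so that 'a.b' and 'a.b.c' (or 'a.b' twice) cannot both be updated
   def validate(groups: Map[Attribute, Seq[Assignment]]): Seq[String] =
     groups.toSeq.flatMap { case (col, as) =>
       val keys = as.map(_.key)
       val sameHeight = keys.map(treeHeight).distinct.size == 1
       val distinctKeys = keys.map(_.sql).distinct.size == keys.size
       if (sameHeight && distinctKeys) Nil
       else Seq(s"conflicting updates to column ${col.name}")
     }
   ```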




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1156731543


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I've submitted #40655 to migrate to runtime checks.




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1166214000


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
   (diff context identical to the hunk quoted in the earlier comment)

Review Comment:
   @cloud-fan, any ideas on how to avoid deconstructing the key into `Seq[String]` when applying a set of assignments to a top-level attribute? The problem is that we recurse top to bottom in `applyUpdates`, whereas `assignment.key` is a set of nested `GetStructField` calls with the outer expression referring to the leaf column.
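   
   For example (a hypothetical illustration using the `nested_struct_table` schema from the test suite):
   
   ```scala
   // For `UPDATE t SET s.n_s.dn_i = 1`, the resolved key nests outside-in,
   // so the outermost expression refers to the leaf field:
   //
   //   GetStructField(GetStructField(s, 1, Some("n_s")), 0, Some("dn_i"))
   //
   // applyUpdates, however, recurses over the schema top-down starting at `s`,
   // which is why the key is first flattened into Seq("s", "n_s", "dn_i")
   ```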




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1131921529


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I see value in both depending on the use case. What about making it configurable? If we just switch to runtime checks everywhere, it will be a substantial behavior change. We can add a new SQL property and default to the existing INSERT behavior of throwing an exception during the analysis phase.
   
   By the way, this PR doesn't target 3.4, so we will have time to build a proper runtime checking framework. I think that would be a substantial effort as we need to cover inner fields, arrays, and maps. There is no logic for that at the moment, if I am not mistaken.
   
   I do think consistency is important: UPDATE and INSERT should behave in the same way.
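   
   For instance (purely hypothetical; neither the property name nor the default is part of this PR, the shape just mirrors how `SQLConf` entries are usually defined):
   
   ```scala
   import org.apache.spark.sql.internal.SQLConf.buildConf
   
   val UPDATE_RUNTIME_NULL_CHECKS =
     buildConf("spark.sql.update.runtimeNullChecks.enabled") // hypothetical name
       .doc("When false, assigning a nullable value to a non-nullable column " +
         "fails at analysis time (matching the current INSERT behavior); " +
         "when true, a runtime null check is applied instead.")
       .booleanConf
       .createWithDefault(false)
   ```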




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127343402


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TableOutputResolver.scala:
##########
@@ -129,7 +129,7 @@ object TableOutputResolver {
     }
   }
 
-  private def checkNullability(
+  private[analysis] def checkNullability(

Review Comment:
   I want to reuse code from `TableOutputResolver` wherever possible, but adding assignment processing to it would make it even more complicated than it is today. That's why I decided to open up some of its methods instead.
   
   Let me know your thoughts.




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1128944608


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**

Review Comment:
   @cloud-fan, this doc gives a bit more detail about why this PR is a prerequisite for rewriting UPDATEs. Let me know if this makes sense!




[GitHub] [spark] johanl-db commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "johanl-db (via GitHub)" <gi...@apache.org>.
johanl-db commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1134165086


##########
sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlignUpdateAssignmentsSuite.scala:
##########
@@ -0,0 +1,786 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.command
+
+import java.util.Collections
+
+import org.mockito.ArgumentMatchers.any
+import org.mockito.Mockito.{mock, when}
+import org.mockito.invocation.InvocationOnMock
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.analysis.{AnalysisTest, Analyzer, FunctionRegistry, NoSuchTableException, ResolveSessionCatalog}
+import org.apache.spark.sql.catalyst.catalog.{InMemoryCatalog, SessionCatalog}
+import org.apache.spark.sql.catalyst.expressions.{ArrayTransform, AttributeReference, BooleanLiteral, Cast, CheckOverflowInTableInsert, CreateNamedStruct, EvalMode, GetStructField, IntegerLiteral, LambdaFunction, LongLiteral, MapFromArrays, StringLiteral}
+import org.apache.spark.sql.catalyst.expressions.objects.StaticInvoke
+import org.apache.spark.sql.catalyst.parser.CatalystSqlParser
+import org.apache.spark.sql.catalyst.plans.logical.{Assignment, LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.connector.catalog.{CatalogManager, CatalogNotFoundException, CatalogV2Util, Column, ColumnDefaultValue, Identifier, Table, TableCapability, TableCatalog}
+import org.apache.spark.sql.connector.expressions.{LiteralValue, Transform}
+import org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog
+import org.apache.spark.sql.internal.SQLConf
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{BooleanType, IntegerType, StructType}
+
+class AlignUpdateAssignmentsSuite extends AnalysisTest {
+
+  private val primitiveTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT", nullable = false)
+      .add("l", "LONG")
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val nestedStructTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT")
+      .add(
+        "s",
+        "STRUCT<n_i: INT NOT NULL, n_s: STRUCT<dn_i: INT NOT NULL, dn_l: LONG>>",
+        nullable = false)
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val mapArrayTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT")
+      .add("a", "ARRAY<STRUCT<ac_1: INT, ac_2: INT>>")
+      .add("m", "MAP<STRING, STRING>")
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val charVarcharTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("c", "CHAR(5)")
+      .add(
+        "s",
+        "STRUCT<n_i: INT, n_vc: VARCHAR(5)>",
+        nullable = false)
+      .add(
+        "a",
+        "ARRAY<STRUCT<n_i: INT, n_vc: VARCHAR(5)>>",
+        nullable = false)
+      .add(
+        "mk",
+        "MAP<STRUCT<n_i: INT, n_vc: VARCHAR(5)>, STRING>",
+        nullable = false)
+      .add(
+        "mv",
+        "MAP<STRING, STRUCT<n_i: INT, n_vc: VARCHAR(5)>>",
+        nullable = false)
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val acceptsAnySchemaTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT", nullable = false)
+      .add("l", "LONG")
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    when(t.capabilities()).thenReturn(Collections.singleton(TableCapability.ACCEPT_ANY_SCHEMA))
+    t
+  }
+
+  private val defaultValuesTable: Table = {
+    val t = mock(classOf[Table])
+    val iDefault = new ColumnDefaultValue("42", LiteralValue(42, IntegerType))
+    when(t.columns()).thenReturn(Array(
+      Column.create("b", BooleanType, true, null, null),
+      Column.create("i", IntegerType, true, null, iDefault, null)))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val v2Catalog: TableCatalog = {
+    val newCatalog = mock(classOf[TableCatalog])
+    when(newCatalog.loadTable(any())).thenAnswer((invocation: InvocationOnMock) => {
+      val ident = invocation.getArgument[Identifier](0)
+      ident.name match {
+        case "primitive_table" => primitiveTable
+        case "nested_struct_table" => nestedStructTable
+        case "map_array_table" => mapArrayTable
+        case "char_varchar_table" => charVarcharTable
+        case "accepts_any_schema_table" => acceptsAnySchemaTable
+        case "default_values_table" => defaultValuesTable
+        case name => throw new NoSuchTableException(Seq(name))
+      }
+    })
+    when(newCatalog.name()).thenReturn("cat")
+    newCatalog
+  }
+
+  private val v1SessionCatalog: SessionCatalog = new SessionCatalog(
+    new InMemoryCatalog(),
+    FunctionRegistry.builtin,
+    new SQLConf())
+
+  private val v2SessionCatalog: TableCatalog = new V2SessionCatalog(v1SessionCatalog)
+
+  private val catalogManager: CatalogManager = {
+    val manager = mock(classOf[CatalogManager])
+    when(manager.catalog(any())).thenAnswer((invocation: InvocationOnMock) => {
+      invocation.getArgument[String](0) match {
+        case "testcat" => v2Catalog
+        case CatalogManager.SESSION_CATALOG_NAME => v2SessionCatalog
+        case name => throw new CatalogNotFoundException(s"No such catalog: $name")
+      }
+    })
+    when(manager.currentCatalog).thenReturn(v2Catalog)
+    when(manager.currentNamespace).thenReturn(Array.empty[String])
+    when(manager.v1SessionCatalog).thenReturn(v1SessionCatalog)
+    when(manager.v2SessionCatalog).thenReturn(v2SessionCatalog)
+    manager
+  }
+
+  test("align assignments (primitive types)") {
+    val sql1 = "UPDATE primitive_table SET txt = 'new', i = 1"
+    parseAndAlignAssignments(sql1) match {
+      case Seq(
+          Assignment(i: AttributeReference, IntegerLiteral(1)),
+          Assignment(l: AttributeReference, lValue: AttributeReference),
+          Assignment(txt: AttributeReference, StringLiteral("new"))) =>
+
+        assert(i.name == "i")
+        assert(l.name == "l" && l == lValue)
+        assert(txt.name == "txt")
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+
+    val sql2 = "UPDATE primitive_table SET l = 10L"
+    parseAndAlignAssignments(sql2) match {
+      case Seq(
+          Assignment(i: AttributeReference, iValue: AttributeReference),
+          Assignment(l: AttributeReference, LongLiteral(10L)),
+          Assignment(txt: AttributeReference, txtValue: AttributeReference)) =>
+
+        assert(i.name == "i" && i == iValue)
+        assert(l.name == "l")
+        assert(txt.name == "txt" && txt == txtValue)
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+
+    val sql3 = "UPDATE primitive_table AS t SET t.txt = 'new', t.l = 10L, t.i = -1"
+    parseAndAlignAssignments(sql3) match {
+      case Seq(
+          Assignment(i: AttributeReference, IntegerLiteral(-1)),
+          Assignment(l: AttributeReference, LongLiteral(10L)),
+          Assignment(txt: AttributeReference, StringLiteral("new"))) =>
+
+        assert(i.name == "i")
+        assert(l.name == "l")
+        assert(txt.name == "txt")
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+  }
+
+  test("align assignments (structs)") {

Review Comment:
   There should be a test case for `UPDATE nested_struct_table SET s.n_i = 1` that ensures the struct `s.n_s` is preserved as a whole instead of recursing and generating assignments for each of its children.
   
   This is important if `s.n_s` contains null values: the assignments must be (`s.n_i = 1`, `s.n_s = s.n_s`), not (`s.n_i = 1`, `s.n_s.dn_i = s.n_s.dn_i`, `s.n_s.dn_l = s.n_s.dn_l`), so that `s.n_s` is still null after the update.
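   
   A sketch of such a case (assuming the suite's `parseAndAlignAssignments` helper and the `nested_struct_table` schema above; the exact shape of the aligned values may differ):
   
   ```scala
   val sql = "UPDATE nested_struct_table SET s.n_i = 1"
   parseAndAlignAssignments(sql) match {
     case Seq(
         Assignment(i: AttributeReference, iValue: AttributeReference),
         Assignment(s: AttributeReference, sValue: CreateNamedStruct),
         Assignment(txt: AttributeReference, txtValue: AttributeReference)) =>
   
       assert(i.name == "i" && i == iValue)
       assert(txt.name == "txt" && txt == txtValue)
       assert(s.name == "s")
       // n_s must be assigned as a whole (a single GetStructField over `s`),
       // not rebuilt field by field, so a null s.n_s stays null
       assert(sValue.valExprs(1).isInstanceOf[GetStructField])
   
     case assignments =>
       fail(s"Unexpected assignments: $assignments")
   }
   ```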
   




[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1163926486


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
   (diff context identical to the hunk quoted in the earlier comments)
+  private def applyUpdates(

Review Comment:
   yea this SGTM.




[GitHub] [spark] cloud-fan commented on pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #40308:
URL: https://github.com/apache/spark/pull/40308#issuecomment-1512632948

   thanks, merging to master!



[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1132827782


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).

Review Comment:
   Correct, the existing row-level APIs assume Spark is responsible for building an updated version of the row. That should work for Delta, Iceberg, Hudi, Hive ACID.
   
   Once there is another use case, we should be able to extend the framework to cover it.




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1132839365


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr
+
+        // if there is only one update and it is an exact match, return the assigned expression
+        case Seq(matchedUpdate) if isExactMatch(matchedUpdate, col) =>
+          applyUpdate(matchedUpdate.expr, col, addError, colPath :+ col.name)
+
+        // if there are matches only for children
+        case matchedUpdates if !hasExactMatch(matchedUpdates, col) =>
+          val newColPath = colPath :+ col.name
+          col.dataType match {
+            case colType: StructType =>
+              // build field expressions
+              val fieldExprs = colType.fields.zipWithIndex.map { case (field, ordinal) =>
+                GetStructField(col, ordinal, Some(field.name))
+              }
+
+              // recursively apply this method to nested fields
+              val updatedFieldExprs = applyUpdates(
+                matchedUpdates.map(update => update.copy(ref = update.ref.tail)),
+                colType.toAttributes,
+                fieldExprs,
+                addError,
+                newColPath)
+
+              // construct a new struct with updated field expressions
+              toNamedStruct(colType, updatedFieldExprs)
+
+            case otherType =>
+              addError(
+                "Updating nested fields is only supported for StructType but " +
+                s"${newColPath.quoted} is of type $otherType")
+              col
+          }
+
+        // if there are conflicting updates, throw an exception
+        // there are two illegal scenarios:
+        // - multiple updates to the same column
+        // - updates to a top-level struct and its nested fields (like a.b and a.b.c)
+        case matchedUpdates if hasExactMatch(matchedUpdates, col) =>
+          val conflictingColNames = matchedUpdates.map(update => (colPath ++ update.ref).quoted)
+          addError("Update conflicts for columns: " + conflictingColNames.distinct.mkString(", "))
+          col
+      }
+    }
+  }
+
+  private def toNamedStruct(structType: StructType, fieldExprs: Seq[Expression]): Expression = {
+    val namedStructExprs = structType.fields.zip(fieldExprs).flatMap { case (field, expr) =>
+      Seq(Literal(field.name), expr)
+    }
+    CreateNamedStruct(namedStructExprs)
+  }
+
+  private def hasExactMatch(updates: Seq[ColumnUpdate], col: NamedExpression): Boolean = {
+    updates.exists(isExactMatch(_, col))
+  }
+
+  private def isExactMatch(update: ColumnUpdate, col: NamedExpression): Boolean = {
+    update.ref match {
+      case Seq(namePart) if conf.resolver(namePart, col.name) => true
+      case _ => false
+    }
+  }
+
+  private def applyUpdate(
+      value: Expression,
+      col: Attribute,
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    (value.dataType, col.dataType) match {
+      // no need to reorder inner fields or cast if types are equal ignoring nullability
+      case (valueType, colType) if valueType.sameType(colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        validateAssignment(valueType, colType, addError, colPath)
+        value
+
+      case (valueType: StructType, colType: StructType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveStructType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: ArrayType, colType: ArrayType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveArrayType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: MapType, colType: MapType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveMapType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType, colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+
+        val colTypeHasCharVarchar = CharVarcharUtils.hasCharVarchar(colType)
+        val colTypeWithoutCharVarchar = if (colTypeHasCharVarchar) {
+          CharVarcharUtils.replaceCharVarcharWithString(colType)
+        } else {
+          colType
+        }
+
+        validateAssignment(valueType, colTypeWithoutCharVarchar, addError, colPath)
+
+        val casted = TableOutputResolver.cast(
+          value, colTypeWithoutCharVarchar,
+          conf, colPath.quoted)
+
+        if (conf.charVarcharAsString || !colTypeHasCharVarchar) {
+          casted
+        } else {
+          CharVarcharUtils.stringLengthCheck(casted, colType)
+        }
+    }
+  }
+
+  private def validateAssignment(
+      valueType: DataType,
+      expectedType: DataType,
+      addError: String => Unit,
+      colPath: Seq[String]): Unit = {
+
+    conf.storeAssignmentPolicy match {
+      case StoreAssignmentPolicy.STRICT | StoreAssignmentPolicy.ANSI =>
+        DataType.canWrite(
+          valueType, expectedType, byName = true, conf.resolver, colPath.quoted,
+          conf.storeAssignmentPolicy, addError)
+
+      case _ => // OK
+    }
+  }
+
+  /**
+   * Checks whether assignments are aligned and are compatible with table columns.
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to check
+   * @return true if the assignments are aligned
+   */
+  def aligned(attrs: Seq[Attribute], assignments: Seq[Assignment]): Boolean = {
+    if (attrs.size != assignments.size) {
+      return false
+    }
+
+    attrs.zip(assignments).forall { case (attr, assignment) =>
+      val key = assignment.key
+      val value = assignment.value
+
+      val attrType = CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType)
+
+      sameRef(toRef(key), toRef(attr)) &&
+        DataType.equalsIgnoreCompatibleNullability(value.dataType, attrType) &&
+        (attr.nullable || !value.nullable)
+    }
+  }
+
+  private def sameRef(ref: Seq[String], otherRef: Seq[String]): Boolean = {
+    ref.size == otherRef.size && ref.zip(otherRef).forall { case (namePart, otherNamePart) =>
+      conf.resolver(namePart, otherNamePart)
+    }
+  }
+
+  private def toRef(expr: Expression): Seq[String] = expr match {
+    case attr: AttributeReference =>
+      Seq(attr.name)
+    case Alias(child, _) =>
+      toRef(child)
+    case GetStructField(child, _, Some(name)) =>
+      toRef(child) :+ name
+    case other: ExtractValue =>

Review Comment:
   We should eventually. This PR doesn't support updating arrays or maps, though. I wanted to work on it later and unblock further row-level operation development for now.
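   
   To make the supported surface concrete, a minimal sketch (the catalog, table, and schema below are made up for illustration):
   
   ```scala
   // Hypothetical v2 table: cat.db.t (c1 INT, c2 STRUCT<n1: INT, n2: INT>)
   // Supported: updating a nested struct field. Alignment rewrites the
   // assignments to c1 = c1, c2 = named_struct('n1', c2.n1, 'n2', 1).
   spark.sql("UPDATE cat.db.t SET c2.n2 = 1")
   
   // Updates that reach through ARRAY or MAP columns are rejected during
   // alignment with an error like:
   //   "Updating nested fields is only supported for StructType but
   //    `a` is of type ArrayType"
   ```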





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127080574


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AlignRowLevelCommandAssignments.scala:
##########
@@ -0,0 +1,62 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+
+/**
+ * A rule that aligns assignments in row-level operations.
+ *
+ * Note that this rule must be run after resolving default values but before rewriting row-level
+ * commands into executable plans. This rule does not apply to tables that accept any schema.
+ * Such tables must inject their own rules to align assignments.
+ */
+object AlignRowLevelCommandAssignments extends Rule[LogicalPlan] {
+
+  override def apply(plan: LogicalPlan): LogicalPlan = plan resolveOperators {
+    case u: UpdateTable if u.resolved && !u.aligned && shouldAlign(u.table) =>
+      val newTable = u.table.transform {
+        case r: DataSourceV2Relation =>
+          validateStoreAssignmentPolicy()

Review Comment:
   I follow what we do for V2 inserts.
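   
   For context, the V2 insert behavior referenced here boils down to rejecting the LEGACY policy up front. A minimal sketch, assuming the enclosing object mixes in `SQLConfHelper`; treat the exact error helper name as an assumption:
   
   ```scala
   import org.apache.spark.sql.errors.QueryCompilationErrors
   import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
   
   // Sketch: V2 writes do not support the LEGACY store assignment policy,
   // so the alignment rule fails fast in the same way.
   private def validateStoreAssignmentPolicy(): Unit = {
     if (conf.storeAssignmentPolicy == StoreAssignmentPolicy.LEGACY) {
       throw QueryCompilationErrors.legacyStoreAssignmentPolicyError()
     }
   }
   ```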





[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1130585454


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I think a runtime null check is the right direction if we want to unify, but we can collect feedback from more people.
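   
   As a rough sketch of the two directions (using the existing `AssertNotNull` catalyst expression; the surrounding wiring is illustrative):
   
   ```scala
   import org.apache.spark.sql.catalyst.expressions.Expression
   import org.apache.spark.sql.catalyst.expressions.objects.AssertNotNull
   
   // Analysis-time check: reject the plan while aligning assignments, e.g. by
   // reporting an error when a nullable value targets a non-null column.
   
   // Runtime check: keep the plan and fail per row only when a NULL actually
   // reaches the non-null column.
   def runtimeNullCheck(value: Expression): Expression = AssertNotNull(value)
   ```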





[GitHub] [spark] aokolnychyi commented on pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on PR #40308:
URL: https://github.com/apache/spark/pull/40308#issuecomment-1512248162

   Failures in the streaming tests don't seem related.




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1166214000


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   @cloud-fan, any ideas on how to avoid deconstructing `Seq[String]` when applying a set of assignments to a top-level attribute? The problem is that we recurse top to bottom in `applyUpdates`, whereas `assignment.key` is a set of nested `GetStructField` calls with the outer expression referring to the leaf column.
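   
   To make the shape mismatch concrete, a self-contained sketch (types picked arbitrarily):
   
   ```scala
   import org.apache.spark.sql.catalyst.expressions.{AttributeReference, GetStructField}
   import org.apache.spark.sql.types.{IntegerType, StructField, StructType}
   
   // The resolved key for `c2.n2` is an extract call whose outermost node
   // names the leaf field:
   val c2 = AttributeReference("c2",
     StructType(Seq(StructField("n1", IntegerType), StructField("n2", IntegerType))))()
   val key = GetStructField(c2, 1, Some("n2"))
   
   // toRef walks child-first and appends, producing the top-down name parts
   // that applyUpdates can peel off one level at a time:
   // toRef(key) == Seq("c2", "n2")
   ```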





[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127909983


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   Can we highlight the differences? I thought we were already doing by-name assignments here.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127343402


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TableOutputResolver.scala:
##########
@@ -129,7 +129,7 @@ object TableOutputResolver {
     }
   }
 
-  private def checkNullability(
+  private[analysis] def checkNullability(

Review Comment:
   I want to reuse code from `TableOutputResolver` wherever possible. However, adding assignment processing directly there would make that class even more complicated than it is today. That's why I decided to open up some methods instead.
   
   Feedback is appreciated. I could add `applyUpdate` to `TableOutputResolver` too. No preference.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1132839365


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr
+
+        // if there is only one update and it is an exact match, return the assigned expression
+        case Seq(matchedUpdate) if isExactMatch(matchedUpdate, col) =>
+          applyUpdate(matchedUpdate.expr, col, addError, colPath :+ col.name)
+
+        // if there are matches only for children
+        case matchedUpdates if !hasExactMatch(matchedUpdates, col) =>
+          val newColPath = colPath :+ col.name
+          col.dataType match {
+            case colType: StructType =>
+              // build field expressions
+              val fieldExprs = colType.fields.zipWithIndex.map { case (field, ordinal) =>
+                GetStructField(col, ordinal, Some(field.name))
+              }
+
+              // recursively apply this method to nested fields
+              val updatedFieldExprs = applyUpdates(
+                matchedUpdates.map(update => update.copy(ref = update.ref.tail)),
+                colType.toAttributes,
+                fieldExprs,
+                addError,
+                newColPath)
+
+              // construct a new struct with updated field expressions
+              toNamedStruct(colType, updatedFieldExprs)
+
+            case otherType =>
+              addError(
+                "Updating nested fields is only supported for StructType but " +
+                s"${newColPath.quoted} is of type $otherType")
+              col
+          }
+
+        // if there are conflicting updates, throw an exception
+        // there are two illegal scenarios:
+        // - multiple updates to the same column
+        // - updates to a top-level struct and its nested fields (like a.b and a.b.c)
+        case matchedUpdates if hasExactMatch(matchedUpdates, col) =>
+          val conflictingColNames = matchedUpdates.map(update => (colPath ++ update.ref).quoted)
+          addError("Update conflicts for columns: " + conflictingColNames.distinct.mkString(", "))
+          col
+      }
+    }
+  }
+
+  private def toNamedStruct(structType: StructType, fieldExprs: Seq[Expression]): Expression = {
+    val namedStructExprs = structType.fields.zip(fieldExprs).flatMap { case (field, expr) =>
+      Seq(Literal(field.name), expr)
+    }
+    CreateNamedStruct(namedStructExprs)
+  }
+
+  private def hasExactMatch(updates: Seq[ColumnUpdate], col: NamedExpression): Boolean = {
+    updates.exists(isExactMatch(_, col))
+  }
+
+  private def isExactMatch(update: ColumnUpdate, col: NamedExpression): Boolean = {
+    update.ref match {
+      case Seq(namePart) if conf.resolver(namePart, col.name) => true
+      case _ => false
+    }
+  }
+
+  private def applyUpdate(
+      value: Expression,
+      col: Attribute,
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    (value.dataType, col.dataType) match {
+      // no need to reorder inner fields or cast if types are equal ignoring nullability
+      case (valueType, colType) if valueType.sameType(colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        validateAssignment(valueType, colType, addError, colPath)
+        value
+
+      case (valueType: StructType, colType: StructType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveStructType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: ArrayType, colType: ArrayType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveArrayType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: MapType, colType: MapType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveMapType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType, colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+
+        val colTypeHasCharVarchar = CharVarcharUtils.hasCharVarchar(colType)
+        val colTypeWithoutCharVarchar = if (colTypeHasCharVarchar) {
+          CharVarcharUtils.replaceCharVarcharWithString(colType)
+        } else {
+          colType
+        }
+
+        validateAssignment(valueType, colTypeWithoutCharVarchar, addError, colPath)
+
+        val casted = TableOutputResolver.cast(
+          value, colTypeWithoutCharVarchar,
+          conf, colPath.quoted)
+
+        if (conf.charVarcharAsString || !colTypeHasCharVarchar) {
+          casted
+        } else {
+          CharVarcharUtils.stringLengthCheck(casted, colType)
+        }
+    }
+  }
+
+  private def validateAssignment(
+      valueType: DataType,
+      expectedType: DataType,
+      addError: String => Unit,
+      colPath: Seq[String]): Unit = {
+
+    conf.storeAssignmentPolicy match {
+      case StoreAssignmentPolicy.STRICT | StoreAssignmentPolicy.ANSI =>
+        DataType.canWrite(
+          valueType, expectedType, byName = true, conf.resolver, colPath.quoted,
+          conf.storeAssignmentPolicy, addError)
+
+      case _ => // OK
+    }
+  }
+
+  /**
+   * Checks whether assignments are aligned and are compatible with table columns.
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to check
+   * @return true if the assignments are aligned
+   */
+  def aligned(attrs: Seq[Attribute], assignments: Seq[Assignment]): Boolean = {
+    if (attrs.size != assignments.size) {
+      return false
+    }
+
+    attrs.zip(assignments).forall { case (attr, assignment) =>
+      val key = assignment.key
+      val value = assignment.value
+
+      val attrType = CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType)
+
+      sameRef(toRef(key), toRef(attr)) &&
+        DataType.equalsIgnoreCompatibleNullability(value.dataType, attrType) &&
+        (attr.nullable || !value.nullable)
+    }
+  }
+
+  private def sameRef(ref: Seq[String], otherRef: Seq[String]): Boolean = {
+    ref.size == otherRef.size && ref.zip(otherRef).forall { case (namePart, otherNamePart) =>
+      conf.resolver(namePart, otherNamePart)
+    }
+  }
+
+  private def toRef(expr: Expression): Seq[String] = expr match {
+    case attr: AttributeReference =>
+      Seq(attr.name)
+    case Alias(child, _) =>
+      toRef(child)
+    case GetStructField(child, _, Some(name)) =>
+      toRef(child) :+ name
+    case other: ExtractValue =>

Review Comment:
   We should eventually. This PR doesn't support updating arrays or maps, though. I wanted to work on it later and unblock further row-level operation development. For now, I throw an exception and support only nested fields in structs.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1132829264


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).

Review Comment:
   Let me also know if you think we should only apply this to tables that implement `SupportsRowLevelOperations`.
   [Here](https://github.com/apache/spark/pull/40308#discussion_r1128945996) is the original question.
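   
   If the rule were gated on row-level support, the predicate could look roughly like this (a sketch under that assumption, reusing the `shouldAlign` name from `AlignRowLevelCommandAssignments`):
   
   ```scala
   import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
   import org.apache.spark.sql.connector.catalog.SupportsRowLevelOperations
   import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
   
   // Sketch: only align assignments for tables that can execute row-level
   // operations; tables that accept any schema inject their own rules.
   private def shouldAlign(table: LogicalPlan): Boolean = table.exists {
     case r: DataSourceV2Relation => r.table.isInstanceOf[SupportsRowLevelOperations]
     case _ => false
   }
   ```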





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1133467299


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   The only reason I would consider adding a config is to avoid an implicit behavior change, as we currently check field compatibility for INSERTs during analysis. What is the community policy on such things? Is it okay to just transition to runtime checks in 3.5?







[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1128943046


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   One notable difference is not using `AssertNotNull` and relying on the same framework we have for V2 tables. In particular, we check the assignment mode and whether the input and output attributes are compatible.
   
   The main contribution of this PR is `AssignmentUtils`, which aligns assignments with table attributes. It is a prerequisite for rewriting UPDATEs (right now, we only rewrite DELETEs). It also handles casts and char/varchar types.
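   
   A hypothetical end-to-end view of those checks, assuming a v2 table `cat.db.t (id INT NOT NULL, s STRING)` and the default ANSI store assignment policy:
   
   ```scala
   // Nullability is validated during analysis instead of via AssertNotNull:
   spark.sql("UPDATE cat.db.t SET id = NULL")   // analysis error: id is not nullable
   
   // DataType.canWrite rejects unsafe conversions under ANSI/STRICT:
   spark.sql("UPDATE cat.db.t SET id = 'x'")    // analysis error: string to int
   
   // A safe up-cast is aligned and cast implicitly:
   spark.sql("UPDATE cat.db.t SET id = CAST(1 AS SMALLINT)")
   ```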





[GitHub] [spark] huaxingao commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "huaxingao (via GitHub)" <gi...@apache.org>.
huaxingao commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1131767473


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I don't have a strong opinion on runtime error or analysis error, but I agree that we need to unify the behaviors.
   
   If I remember correctly, DB2 has a Query Transformation and Semantics layer. The null check is done there instead of at run time, but I don't know how this is done in other databases.





[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1132487845


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr
+
+        // if there is only one update and it is an exact match, return the assigned expression
+        case Seq(matchedUpdate) if isExactMatch(matchedUpdate, col) =>
+          applyUpdate(matchedUpdate.expr, col, addError, colPath :+ col.name)
+
+        // if there are matches only for children
+        case matchedUpdates if !hasExactMatch(matchedUpdates, col) =>
+          val newColPath = colPath :+ col.name
+          col.dataType match {
+            case colType: StructType =>
+              // build field expressions
+              val fieldExprs = colType.fields.zipWithIndex.map { case (field, ordinal) =>
+                GetStructField(col, ordinal, Some(field.name))
+              }
+
+              // recursively apply this method to nested fields
+              val updatedFieldExprs = applyUpdates(
+                matchedUpdates.map(update => update.copy(ref = update.ref.tail)),
+                colType.toAttributes,
+                fieldExprs,
+                addError,
+                newColPath)
+
+              // construct a new struct with updated field expressions
+              toNamedStruct(colType, updatedFieldExprs)
+
+            case otherType =>
+              addError(
+                "Updating nested fields is only supported for StructType but " +
+                s"${newColPath.quoted} is of type $otherType")
+              col
+          }
+
+        // if there are conflicting updates, throw an exception
+        // there are two illegal scenarios:
+        // - multiple updates to the same column
+        // - updates to a top-level struct and its nested fields (like a.b and a.b.c)
+        case matchedUpdates if hasExactMatch(matchedUpdates, col) =>
+          val conflictingColNames = matchedUpdates.map(update => (colPath ++ update.ref).quoted)
+          addError("Update conflicts for columns: " + conflictingColNames.distinct.mkString(", "))
+          col
+      }
+    }
+  }
+
+  private def toNamedStruct(structType: StructType, fieldExprs: Seq[Expression]): Expression = {
+    val namedStructExprs = structType.fields.zip(fieldExprs).flatMap { case (field, expr) =>
+      Seq(Literal(field.name), expr)
+    }
+    CreateNamedStruct(namedStructExprs)
+  }
+
+  private def hasExactMatch(updates: Seq[ColumnUpdate], col: NamedExpression): Boolean = {
+    updates.exists(isExactMatch(_, col))
+  }
+
+  private def isExactMatch(update: ColumnUpdate, col: NamedExpression): Boolean = {
+    update.ref match {
+      case Seq(namePart) if conf.resolver(namePart, col.name) => true
+      case _ => false
+    }
+  }
+
+  private def applyUpdate(
+      value: Expression,
+      col: Attribute,
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    (value.dataType, col.dataType) match {
+      // no need to reorder inner fields or cast if types are equal ignoring nullability
+      case (valueType, colType) if valueType.sameType(colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        validateAssignment(valueType, colType, addError, colPath)
+        value
+
+      case (valueType: StructType, colType: StructType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveStructType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: ArrayType, colType: ArrayType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveArrayType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: MapType, colType: MapType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveMapType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType, colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+
+        val colTypeHasCharVarchar = CharVarcharUtils.hasCharVarchar(colType)
+        val colTypeWithoutCharVarchar = if (colTypeHasCharVarchar) {
+          CharVarcharUtils.replaceCharVarcharWithString(colType)
+        } else {
+          colType
+        }
+
+        validateAssignment(valueType, colTypeWithoutCharVarchar, addError, colPath)
+
+        val casted = TableOutputResolver.cast(
+          value, colTypeWithoutCharVarchar,
+          conf, colPath.quoted)
+
+        if (conf.charVarcharAsString || !colTypeHasCharVarchar) {
+          casted
+        } else {
+          CharVarcharUtils.stringLengthCheck(casted, colType)
+        }
+    }
+  }
+
+  private def validateAssignment(
+      valueType: DataType,
+      expectedType: DataType,
+      addError: String => Unit,
+      colPath: Seq[String]): Unit = {
+
+    conf.storeAssignmentPolicy match {
+      case StoreAssignmentPolicy.STRICT | StoreAssignmentPolicy.ANSI =>
+        DataType.canWrite(
+          valueType, expectedType, byName = true, conf.resolver, colPath.quoted,
+          conf.storeAssignmentPolicy, addError)
+
+      case _ => // OK
+    }
+  }
+
+  /**
+   * Checks whether assignments are aligned and are compatible with table columns.
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to check
+   * @return true if the assignments are aligned
+   */
+  def aligned(attrs: Seq[Attribute], assignments: Seq[Assignment]): Boolean = {
+    if (attrs.size != assignments.size) {
+      return false
+    }
+
+    attrs.zip(assignments).forall { case (attr, assignment) =>
+      val key = assignment.key
+      val value = assignment.value
+
+      val attrType = CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType)
+
+      sameRef(toRef(key), toRef(attr)) &&
+        DataType.equalsIgnoreCompatibleNullability(value.dataType, attrType) &&
+        (attr.nullable || !value.nullable)
+    }
+  }
+
+  private def sameRef(ref: Seq[String], otherRef: Seq[String]): Boolean = {
+    ref.size == otherRef.size && ref.zip(otherRef).forall { case (namePart, otherNamePart) =>
+      conf.resolver(namePart, otherNamePart)
+    }
+  }
+
+  private def toRef(expr: Expression): Seq[String] = expr match {
+    case attr: AttributeReference =>
+      Seq(attr.name)
+    case Alias(child, _) =>
+      toRef(child)
+    case GetStructField(child, _, Some(name)) =>
+      toRef(child) :+ name
+    case other: ExtractValue =>

Review Comment:
   For ALTER COLUMN we support a special syntax to reference any inner field, for example, `array_col.element.field1`, `map_col.key.field2`, etc. Shall we support this syntax in UPDATE as well? The related code is in `StructType.findNestedField`.
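   
   For reference, a sketch contrasting the existing ALTER COLUMN references with a hypothetical UPDATE equivalent (names made up):
   
   ```scala
   // Existing: ALTER COLUMN addresses inner fields of arrays and maps via the
   // element/key/value path segments resolved by StructType.findNestedField.
   spark.sql("ALTER TABLE cat.db.t ALTER COLUMN array_col.element.field1 COMMENT 'doc'")
   
   // Open question: whether UPDATE should accept the same references, e.g.
   //   UPDATE cat.db.t SET array_col.element.field1 = 0
   // This PR rejects such references, since alignment only recurses into structs.
   ```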





[GitHub] [spark] viirya commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "viirya (via GitHub)" <gi...@apache.org>.
viirya commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1133355436


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns

Review Comment:
   nit: a consistent description
   
   ```suggestion
      * @param attrs table attributes
      * @param assignments assignments to align
      * @return aligned assignments that match table attributes
   ```



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   Hmm, I feel adding a config for this runtime error vs. analysis error behavior is a bit more than required. It sounds like an internal decision for databases; I'm not sure we need it to be configurable, and I don't see much benefit in such a config.

   Ideally the behavior should be unified, and I agree with that too. If there is already some inconsistency, I think we can address it incrementally.



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -319,6 +319,7 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
       ResolveRandomSeed ::
       ResolveBinaryArithmetic ::
       ResolveUnion ::
+      AlignRowLevelCommandAssignments ::

Review Comment:
   Maybe add a comment that the position of this rule in the batch cannot be changed for now.
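
   For instance, the note could look something like this (hypothetical wording, sketched from the rule's documented constraints, not the PR's actual comment):

   ```
   // Keep AlignRowLevelCommandAssignments at this position for now: it must run
   // after default values are resolved but before row-level commands are
   // rewritten into executable plans.
   AlignRowLevelCommandAssignments ::
   ```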



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr
+
+        // if there is only one update and it is an exact match, return the assigned expression
+        case Seq(matchedUpdate) if isExactMatch(matchedUpdate, col) =>
+          applyUpdate(matchedUpdate.expr, col, addError, colPath :+ col.name)
+
+        // if there are matches only for children
+        case matchedUpdates if !hasExactMatch(matchedUpdates, col) =>
+          val newColPath = colPath :+ col.name
+          col.dataType match {
+            case colType: StructType =>
+              // build field expressions
+              val fieldExprs = colType.fields.zipWithIndex.map { case (field, ordinal) =>
+                GetStructField(col, ordinal, Some(field.name))
+              }
+
+              // recursively apply this method to nested fields
+              val updatedFieldExprs = applyUpdates(
+                matchedUpdates.map(update => update.copy(ref = update.ref.tail)),
+                colType.toAttributes,
+                fieldExprs,
+                addError,
+                newColPath)
+
+              // construct a new struct with updated field expressions
+              toNamedStruct(colType, updatedFieldExprs)
+
+            case otherType =>
+              addError(
+                "Updating nested fields is only supported for StructType but " +
+                s"${newColPath.quoted} is of type $otherType")
+              col
+          }
+
+        // if there are conflicting updates, throw an exception
+        // there are two illegal scenarios:
+        // - multiple updates to the same column
+        // - updates to a top-level struct and its nested fields (like a.b and a.b.c)
+        case matchedUpdates if hasExactMatch(matchedUpdates, col) =>
+          val conflictingColNames = matchedUpdates.map(update => (colPath ++ update.ref).quoted)
+          addError("Update conflicts for columns: " + conflictingColNames.distinct.mkString(", "))
+          col
+      }
+    }
+  }
+
+  private def toNamedStruct(structType: StructType, fieldExprs: Seq[Expression]): Expression = {
+    val namedStructExprs = structType.fields.zip(fieldExprs).flatMap { case (field, expr) =>
+      Seq(Literal(field.name), expr)
+    }
+    CreateNamedStruct(namedStructExprs)
+  }
+
+  private def hasExactMatch(updates: Seq[ColumnUpdate], col: NamedExpression): Boolean = {
+    updates.exists(isExactMatch(_, col))
+  }
+
+  private def isExactMatch(update: ColumnUpdate, col: NamedExpression): Boolean = {
+    update.ref match {
+      case Seq(namePart) if conf.resolver(namePart, col.name) => true
+      case _ => false
+    }
+  }
+
+  private def applyUpdate(
+      value: Expression,
+      col: Attribute,
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    (value.dataType, col.dataType) match {
+      // no need to reorder inner fields or cast if types are equal ignoring nullability
+      case (valueType, colType) if valueType.sameType(colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        validateAssignment(valueType, colType, addError, colPath)
+        value
+
+      case (valueType: StructType, colType: StructType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveStructType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)

Review Comment:
   If it cannot be resolved, why does it return `col` instead of `value`?
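
   For context, a tiny self-contained sketch of the two possible fallbacks (hypothetical types, not Spark's `TableOutputResolver`): falling back to `col` silently keeps the column's current value, while falling back to `value` keeps the user's assignment in the plan. Either way, `addError` has already recorded the failure, so an exception is raised later.

   ```
   // Toy model of the fallback question (hypothetical types, illustration only).
   sealed trait Expr
   case class Col(name: String) extends Expr
   case class Value(sql: String) extends Expr

   // Pretend struct resolution failed; in the real code, errors were already
   // reported through addError before this fallback is taken.
   def resolveStructType(value: Expr): Option[Expr] = None

   def fallbackToCol(col: Col, value: Expr): Expr =
     resolveStructType(value).getOrElse(col)   // drops the assignment

   def fallbackToValue(col: Col, value: Expr): Expr =
     resolveStructType(value).getOrElse(value) // preserves the assignment

   object FallbackDemo extends App {
     println(fallbackToCol(Col("c2"), Value("named_struct(...)")))   // Col(c2)
     println(fallbackToValue(Col("c2"), Value("named_struct(...)"))) // Value(named_struct(...))
   }
   ```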





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1160348678


##########
sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlignUpdateAssignmentsSuite.scala:
##########
@@ -0,0 +1,781 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.command
+
+import java.util.Collections
+
+import org.mockito.ArgumentMatchers.any
+import org.mockito.Mockito.{mock, when}
+import org.mockito.invocation.InvocationOnMock
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.analysis.{AnalysisTest, Analyzer, FunctionRegistry, NoSuchTableException, ResolveSessionCatalog}
+import org.apache.spark.sql.catalyst.catalog.{InMemoryCatalog, SessionCatalog}
+import org.apache.spark.sql.catalyst.expressions.{ArrayTransform, AttributeReference, BooleanLiteral, Cast, CheckOverflowInTableInsert, CreateNamedStruct, EvalMode, GetStructField, IntegerLiteral, LambdaFunction, LongLiteral, MapFromArrays, StringLiteral}
+import org.apache.spark.sql.catalyst.expressions.objects.{AssertNotNull, StaticInvoke}
+import org.apache.spark.sql.catalyst.parser.CatalystSqlParser
+import org.apache.spark.sql.catalyst.plans.logical.{Assignment, LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.connector.catalog.{CatalogManager, CatalogNotFoundException, CatalogV2Util, Column, ColumnDefaultValue, Identifier, Table, TableCapability, TableCatalog}
+import org.apache.spark.sql.connector.expressions.{LiteralValue, Transform}
+import org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog
+import org.apache.spark.sql.internal.SQLConf
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{BooleanType, IntegerType, StructType}
+
+class AlignUpdateAssignmentsSuite extends AnalysisTest {
+
+  private val primitiveTable = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT", nullable = false)
+      .add("l", "LONG")
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val nestedStructTable = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT")
+      .add(
+        "s",
+        "STRUCT<n_i: INT NOT NULL, n_s: STRUCT<dn_i: INT NOT NULL, dn_l: LONG>>",
+        nullable = false)
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val mapArrayTable = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT")
+      .add("a", "ARRAY<STRUCT<ac_1: INT, ac_2: INT>>")
+      .add("m", "MAP<STRING, STRING>")
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val charVarcharTable = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("c", "CHAR(5)")
+      .add(
+        "s",
+        "STRUCT<n_i: INT, n_vc: VARCHAR(5)>",
+        nullable = false)
+      .add(
+        "a",
+        "ARRAY<STRUCT<n_i: INT, n_vc: VARCHAR(5)>>",
+        nullable = false)
+      .add(
+        "mk",
+        "MAP<STRUCT<n_i: INT, n_vc: VARCHAR(5)>, STRING>",
+        nullable = false)
+      .add(
+        "mv",
+        "MAP<STRING, STRUCT<n_i: INT, n_vc: VARCHAR(5)>>",
+        nullable = false)
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val acceptsAnySchemaTable = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT", nullable = false)
+      .add("l", "LONG")
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    when(t.capabilities()).thenReturn(Collections.singleton(TableCapability.ACCEPT_ANY_SCHEMA))
+    t
+  }
+
+  private val defaultValuesTable = {
+    val t = mock(classOf[Table])
+    val iDefault = new ColumnDefaultValue("42", LiteralValue(42, IntegerType))
+    when(t.columns()).thenReturn(Array(
+      Column.create("b", BooleanType, true, null, null),
+      Column.create("i", IntegerType, true, null, iDefault, null)))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val v2Catalog = {
+    val newCatalog = mock(classOf[TableCatalog])
+    when(newCatalog.loadTable(any())).thenAnswer((invocation: InvocationOnMock) => {
+      val ident = invocation.getArgument[Identifier](0)
+      ident.name match {
+        case "primitive_table" => primitiveTable
+        case "nested_struct_table" => nestedStructTable
+        case "map_array_table" => mapArrayTable
+        case "char_varchar_table" => charVarcharTable
+        case "accepts_any_schema_table" => acceptsAnySchemaTable
+        case "default_values_table" => defaultValuesTable
+        case name => throw new NoSuchTableException(Seq(name))
+      }
+    })
+    when(newCatalog.name()).thenReturn("cat")
+    newCatalog
+  }
+
+  private val v1SessionCatalog =
+    new SessionCatalog(new InMemoryCatalog(), FunctionRegistry.builtin, new SQLConf())
+
+  private val v2SessionCatalog = new V2SessionCatalog(v1SessionCatalog)
+
+  private val catalogManager = {
+    val manager = mock(classOf[CatalogManager])
+    when(manager.catalog(any())).thenAnswer((invocation: InvocationOnMock) => {
+      invocation.getArgument[String](0) match {
+        case "testcat" => v2Catalog
+        case CatalogManager.SESSION_CATALOG_NAME => v2SessionCatalog
+        case name => throw new CatalogNotFoundException(s"No such catalog: $name")
+      }
+    })
+    when(manager.currentCatalog).thenReturn(v2Catalog)
+    when(manager.currentNamespace).thenReturn(Array.empty[String])
+    when(manager.v1SessionCatalog).thenReturn(v1SessionCatalog)
+    when(manager.v2SessionCatalog).thenReturn(v2SessionCatalog)
+    manager
+  }
+
+  test("align assignments (primitive types)") {
+    val sql1 = "UPDATE primitive_table SET txt = 'new', i = 1"
+    parseAndAlignAssignments(sql1) match {
+      case Seq(
+          Assignment(i: AttributeReference, IntegerLiteral(1)),
+          Assignment(l: AttributeReference, lValue: AttributeReference),
+          Assignment(txt: AttributeReference, StringLiteral("new"))) =>
+
+        assert(i.name == "i")
+        assert(l.name == "l" && l == lValue)
+        assert(txt.name == "txt")
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+
+    val sql2 = "UPDATE primitive_table SET l = 10L"
+    parseAndAlignAssignments(sql2) match {
+      case Seq(
+          Assignment(i: AttributeReference, iValue: AttributeReference),
+          Assignment(l: AttributeReference, LongLiteral(10L)),
+          Assignment(txt: AttributeReference, txtValue: AttributeReference)) =>
+
+        assert(i.name == "i" && i == iValue)
+        assert(l.name == "l")
+        assert(txt.name == "txt" && txt == txtValue)
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+
+    val sql3 = "UPDATE primitive_table AS t SET t.txt = 'new', t.l = 10L, t.i = -1"
+    parseAndAlignAssignments(sql3) match {
+      case Seq(
+          Assignment(i: AttributeReference, IntegerLiteral(-1)),
+          Assignment(l: AttributeReference, LongLiteral(10L)),
+          Assignment(txt: AttributeReference, StringLiteral("new"))) =>
+
+        assert(i.name == "i")
+        assert(l.name == "l")
+        assert(txt.name == "txt")
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+  }
+
+  test("align assignments (structs)") {
+    val sql1 =
+      "UPDATE nested_struct_table " +
+      "SET s = named_struct('n_s', named_struct('dn_i', 1, 'dn_l', 100L), 'n_i', 1)"
+    parseAndAlignAssignments(sql1) match {
+      case Seq(
+          Assignment(i: AttributeReference, iValue: AttributeReference),
+          Assignment(s: AttributeReference, sValue: CreateNamedStruct),
+          Assignment(txt: AttributeReference, txtValue: AttributeReference)) =>
+
+        assert(i.name == "i" && i == iValue)
+
+        assert(s.name == "s")
+        sValue.children match {
+          case Seq(
+              StringLiteral("n_i"), GetStructField(_, _, Some("n_i")),
+              StringLiteral("n_s"), nsValue: CreateNamedStruct) =>
+
+            nsValue.children match {
+              case Seq(
+                  StringLiteral("dn_i"), GetStructField(_, _, Some("dn_i")),
+                  StringLiteral("dn_l"), GetStructField(_, _, Some("dn_l"))) =>
+                // OK
+
+              case nsValueChildren =>
+                fail(s"Unexpected children for 's.n_s': $nsValueChildren")
+            }
+
+          case sValueChildren =>
+            fail(s"Unexpected children for 's': $sValueChildren")
+        }
+
+        assert(txt.name == "txt" && txt == txtValue)
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+
+    val sql2 = "UPDATE nested_struct_table SET s.n_s = named_struct('dn_i', 1, 'dn_l', 1L)"
+    parseAndAlignAssignments(sql2) match {
+      case Seq(
+          Assignment(i: AttributeReference, iValue: AttributeReference),
+          Assignment(s: AttributeReference, sValue: CreateNamedStruct),
+          Assignment(txt: AttributeReference, txtValue: AttributeReference)) =>
+
+        assert(i.name == "i" && i == iValue)
+
+        assert(s.name == "s")
+        sValue.children match {
+          case Seq(
+              StringLiteral("n_i"), GetStructField(_, _, Some("n_i")),
+              StringLiteral("n_s"), nsValue: CreateNamedStruct) =>
+
+            nsValue.children match {
+              case Seq(
+                  StringLiteral("dn_i"), IntegerLiteral(1),
+                  StringLiteral("dn_l"), LongLiteral(1L)) =>
+                // OK
+
+              case nsValueChildren =>
+                fail(s"Unexpected children for 's.n_s': $nsValueChildren")
+            }
+
+          case sValueChildren =>
+            fail(s"Unexpected children for 's': $sValueChildren")
+        }
+
+        assert(txt.name == "txt" && txt == txtValue)
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+
+    val sql3 = "UPDATE nested_struct_table SET s.n_s = named_struct('dn_l', 1L, 'dn_i', 1)"
+    parseAndAlignAssignments(sql3) match {
+      case Seq(
+          Assignment(i: AttributeReference, iValue: AttributeReference),
+          Assignment(s: AttributeReference, sValue: CreateNamedStruct),
+          Assignment(txt: AttributeReference, txtValue: AttributeReference)) =>
+
+        assert(i.name == "i" && i == iValue)
+
+        assert(s.name == "s")
+        sValue.children match {
+          case Seq(
+              StringLiteral("n_i"), GetStructField(_, _, Some("n_i")),
+              StringLiteral("n_s"), nsValue: CreateNamedStruct) =>
+
+            nsValue.children match {
+              case Seq(
+                  StringLiteral("dn_i"), GetStructField(_, _, Some("dn_i")),
+                  StringLiteral("dn_l"), GetStructField(_, _, Some("dn_l"))) =>
+                // OK
+
+              case nsValueChildren =>
+                fail(s"Unexpected children for 's.n_s': $nsValueChildren")
+            }
+
+          case sValueChildren =>
+            fail(s"Unexpected children for 's': $sValueChildren")
+        }
+
+        assert(txt.name == "txt" && txt == txtValue)
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+
+    val sql4 = "UPDATE nested_struct_table SET s.n_i = 1"

Review Comment:
   @johanl-db, here is the test we talked about. If you have time to contribute any other tests or to check that the alignment logic works for Delta, that would be great!
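
   For anyone reading along, here is a sketch of the alignment `sql4` above should produce, following the javadoc's description (illustrative, not the suite's actual assertions):

   ```
   // UPDATE nested_struct_table SET s.n_i = 1
   // should align to assignments equivalent to:
   //
   //   i   = i
   //   s   = named_struct('n_i', 1, 'n_s', s.n_s)
   //   txt = txt
   //
   // i.e. the struct `s` is rebuilt with `n_i` replaced and `n_s` preserved.
   ```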





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1160427447


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr
+
+        // if there is only one update and it is an exact match, return the assigned expression
+        case Seq(matchedUpdate) if isExactMatch(matchedUpdate, col) =>
+          applyUpdate(matchedUpdate.expr, col, addError, colPath :+ col.name)
+
+        // if there are matches only for children
+        case matchedUpdates if !hasExactMatch(matchedUpdates, col) =>
+          val newColPath = colPath :+ col.name
+          col.dataType match {
+            case colType: StructType =>
+              // build field expressions
+              val fieldExprs = colType.fields.zipWithIndex.map { case (field, ordinal) =>
+                GetStructField(col, ordinal, Some(field.name))
+              }
+
+              // recursively apply this method to nested fields
+              val updatedFieldExprs = applyUpdates(
+                matchedUpdates.map(update => update.copy(ref = update.ref.tail)),
+                colType.toAttributes,
+                fieldExprs,
+                addError,
+                newColPath)
+
+              // construct a new struct with updated field expressions
+              toNamedStruct(colType, updatedFieldExprs)
+
+            case otherType =>
+              addError(
+                "Updating nested fields is only supported for StructType but " +
+                s"${newColPath.quoted} is of type $otherType")
+              col
+          }
+
+        // if there are conflicting updates, throw an exception
+        // there are two illegal scenarios:
+        // - multiple updates to the same column
+        // - updates to a top-level struct and its nested fields (like a.b and a.b.c)
+        case matchedUpdates if hasExactMatch(matchedUpdates, col) =>
+          val conflictingColNames = matchedUpdates.map(update => (colPath ++ update.ref).quoted)
+          addError("Update conflicts for columns: " + conflictingColNames.distinct.mkString(", "))
+          col
+      }
+    }
+  }
+
+  private def toNamedStruct(structType: StructType, fieldExprs: Seq[Expression]): Expression = {
+    val namedStructExprs = structType.fields.zip(fieldExprs).flatMap { case (field, expr) =>
+      Seq(Literal(field.name), expr)
+    }
+    CreateNamedStruct(namedStructExprs)
+  }
+
+  private def hasExactMatch(updates: Seq[ColumnUpdate], col: NamedExpression): Boolean = {
+    updates.exists(isExactMatch(_, col))
+  }
+
+  private def isExactMatch(update: ColumnUpdate, col: NamedExpression): Boolean = {
+    update.ref match {
+      case Seq(namePart) if conf.resolver(namePart, col.name) => true
+      case _ => false
+    }
+  }
+
+  private def applyUpdate(
+      value: Expression,
+      col: Attribute,
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    (value.dataType, col.dataType) match {
+      // no need to reorder inner fields or cast if types are equal ignoring nullability
+      case (valueType, colType) if valueType.sameType(colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        validateAssignment(valueType, colType, addError, colPath)
+        value
+
+      case (valueType: StructType, colType: StructType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveStructType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)

Review Comment:
   Fixed. Let me resolve this to simplify reviews.





[GitHub] [spark] aokolnychyi commented on pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on PR #40308:
URL: https://github.com/apache/spark/pull/40308#issuecomment-1500441091

   Failures don't seem to be related.




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1167348702


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,167 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, CreateNamedStruct, Expression, GetStructField, Literal}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This allows Spark to
+   * construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = attrs.map { attr =>
+      applyAssignments(
+        col = restoreActualType(attr),
+        colExpr = attr,
+        assignments,
+        addError = err => errors += err,
+        colPath = Seq(attr.name))
+    }
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyAssignments(
+      col: Attribute,
+      colExpr: Expression,
+      assignments: Seq[Assignment],
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    val (exactAssignments, otherAssignments) = assignments.partition { assignment =>
+      assignment.key.semanticEquals(colExpr)
+    }
+
+    val fieldAssignments = otherAssignments.filter { assignment =>
+      assignment.key.exists(_.semanticEquals(colExpr))
+    }
+
+    if (exactAssignments.size > 1) {

Review Comment:
   @cloud-fan, I've changed the approach to avoid deconstructing references. However, I decided to keep validating while recursing instead of doing it in a separate step as we discussed. When I tried to implement that idea, it turned out to be pretty involved with lots of edge cases. For instance, we can't have multiple assignments per top-level key, yet keys can reference top-level fields many times: `a.b.c` and `a.b.d` are allowed, but not `a.b` and `a.b.c`. It felt easier to validate while recursing, just like `TableOutputResolver` does (see the sketch below).
   
   Let me know what you think.
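
   To make those edge cases concrete, an illustrative sketch (assumes an active SparkSession `spark` and a hypothetical table `t` with a column `a STRUCT<b: STRUCT<c: INT, d: INT>>`):

   ```
   // Allowed: distinct leaf fields under the same nested struct.
   spark.sql("UPDATE t SET a.b.c = 1, a.b.d = 2")

   // Rejected by alignment: `a.b` and `a.b.c` overlap, so the assignments conflict.
   spark.sql("UPDATE t SET a.b = named_struct('c', 1, 'd', 2), a.b.c = 3")
   ```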





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1161935288


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   > it's better to operate on the resolved column expressions
   
   I agree, let's see if we can avoid the conversion to references.
   
   > We validate the map we got in step 2: for each top-level column, its expressions must be of the same tree height (to avoid updating both 'a.b' and 'a.b.c')
   
   Could you elaborate a bit on how you see the tree-height check working? For example, would we add a separate method for computing expression height? And what about cases where different expression heights are OK, like 'a.b.n1' and 'a.c' where a, b, and c are all structs? (See the sketch below.)
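
   To illustrate that case, a sketch (assumes an active SparkSession `spark` and a hypothetical table `t` with `a STRUCT<b: STRUCT<n1: INT, n2: INT>, c: STRUCT<m1: INT>>`):

   ```
   // The keys have different depths ('a.b.n1' vs 'a.c') but do not overlap,
   // so a check based purely on tree height would wrongly reject this:
   spark.sql("UPDATE t SET a.b.n1 = 1, a.c = named_struct('m1', 2)")
   ```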





[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1166650749


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   I'm thinking about something like this:
   ```
   def alignAssignments(
       assignments: Seq[Assignment],
       attrs: Seq[Attribute]): Seq[Assignment] = {
     // use ExpressionSet to check assignments have no duplication
     ...
     attrs.map { attr =>
       Assignment(attr, applyUpdates(assignments, attr))
     }
   }
   
   def applyUpdates(
       assignments: Seq[Assignment],
       col: Expression): Expression = {
     val (exactAssignments, others) = assignments.partition { assignment =>
       assignment.key.semanticEquals(col)
     }
     val relatedAssignments = others.filter { assignment =>
       assignment.key.exists(_.semanticEquals(col))
     }
     assert(exactAssignments.length <= 1)
     if (exactAssignments.nonEmpty) {
       if (relatedAssignments.nonEmpty) fail...
       exactAssignments.head.value
     } else {
       col.dataType match {
         case StructType(fields) =>
           // rebuild the struct, recursing into each field
           CreateNamedStruct(fields.zipWithIndex.flatMap { case (field, ordinal) =>
             Seq(
               Literal(field.name),
               applyUpdates(relatedAssignments, GetStructField(col, ordinal, Some(field.name))))
           })

         case _ =>
           assert(relatedAssignments.isEmpty)
           col
       }
     }
   }
   ```





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1163493525


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AlignRowLevelCommandAssignments.scala:
##########
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+
+/**
+ * A rule that aligns assignments in row-level operations.
+ *
+ * Note that this rule must be run after resolving default values but before rewriting row-level
+ * commands into executable plans. This rule does not apply to tables that accept any schema.
+ * Such tables must inject their own rules to align assignments.
+ */
+object AlignRowLevelCommandAssignments extends Rule[LogicalPlan] {
+
+  override def apply(plan: LogicalPlan): LogicalPlan = plan resolveOperators {
+    case u: UpdateTable if !u.skipSchemaResolution && u.resolved && !u.aligned =>

Review Comment:
   I plan to ignore such statements when rewriting UPDATEs into executable plans, like we do today for DELETE. This would allow data sources to inject their own handling.





[GitHub] [spark] viirya commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "viirya (via GitHub)" <gi...@apache.org>.
viirya commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1163426987


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AlignRowLevelCommandAssignments.scala:
##########
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+
+/**
+ * A rule that aligns assignments in row-level operations.
+ *
+ * Note that this rule must be run after resolving default values but before rewriting row-level
+ * commands into executable plans. This rule does not apply to tables that accept any schema.
+ * Such tables must inject their own rules to align assignments.
+ */
+object AlignRowLevelCommandAssignments extends Rule[LogicalPlan] {
+
+  override def apply(plan: LogicalPlan): LogicalPlan = plan resolveOperators {
+    case u: UpdateTable if !u.skipSchemaResolution && u.resolved && !u.aligned =>

Review Comment:
   What happens if the table doesn't implement `SupportsRowLevelOperations`?





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1132829264


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).

Review Comment:
   Let me also know if you think we should only apply this to implementations of `SupportsRowLevelOperations`.
   [Here](https://github.com/apache/spark/pull/40308#discussion_r1128945996) is the original question.





[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1132469074


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).

Review Comment:
   hmm, do we expect a data source that can directly update an inner field? For such data sources, this is a regression.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1128943046


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   One notable difference is not using `AssertNotNull` and instead relying on the same framework we have for V2 tables. In particular, we check the store assignment policy and whether the incoming and target attributes are compatible. As a consequence, we get analysis errors rather than runtime errors.
   
   However, the main contribution of this PR is `AssignmentUtils`, which aligns assignments with table attributes. It is a prerequisite for rewriting UPDATEs; right now, we only rewrite DELETEs. The new utility also handles casts and char/varchar types, which the logic removed here already did.
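   
   To make the alignment concrete (a sketch; `t` is a hypothetical table in a catalog that supports row-level operations):
   
   ```scala
   // Given t(c1 INT, c2 STRUCT<n1: INT, n2: STRING>), this statement
   spark.sql("UPDATE t SET c2.n2 = 'x' WHERE c1 = 0")
   // is aligned so that every column gets exactly one expression,
   // in table order:
   //   c1 = c1
   //   c2 = named_struct('n1', c2.n1, 'n2', 'x')
   
   // Incompatible assignments now fail at analysis time, e.g. under the
   // ANSI store assignment policy:
   spark.sql("UPDATE t SET c1 = 'str'")  // AnalysisException, not a runtime error
   ```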





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127348254


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryCompilationErrors.scala:
##########
@@ -2057,6 +2057,17 @@ private[sql] object QueryCompilationErrors extends QueryErrorsBase {
         "errors" -> errors.mkString("\n- ")))
   }
 
+  def invalidRowLevelOperationAssignments(
+      assignments: Seq[Assignment],
+      errors: Seq[String]): Throwable = {
+
+    new AnalysisException(
+      errorClass = "DATATYPE_MISMATCH.INVALID_ROW_LEVEL_OPERATION_ASSIGNMENTS",

Review Comment:
   I am using `DATATYPE_MISMATCH` as it seems appropriate.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127079791


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AlignRowLevelCommandAssignments.scala:
##########
@@ -0,0 +1,62 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+
+/**
+ * A rule that aligns assignments in row-level operations.
+ *
+ * Note that this rule must be run after resolving default values but before rewriting row-level

Review Comment:
   We need to think about a reliable way to check if default values have been resolved. Right now, it simply relies on the order of rules, which is fragile. Ideas are welcome.
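   
   For example, one option could be an explicit guard along these lines (purely hypothetical, this helper does not exist in the PR):
   
   ```scala
   import org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute
   import org.apache.spark.sql.catalyst.plans.logical.Assignment
   
   // Hypothetical: treat the command as not ready for alignment while any
   // assignment value still contains an unresolved DEFAULT reference.
   def defaultsResolved(assignments: Seq[Assignment]): Boolean =
     assignments.forall { assignment =>
       !assignment.value.exists {
         case attr: UnresolvedAttribute => attr.name.equalsIgnoreCase("DEFAULT")
         case _ => false
       }
     }
   ```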





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1160344747


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -319,6 +319,7 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
       ResolveRandomSeed ::
       ResolveBinaryArithmetic ::
       ResolveUnion ::
+      AlignRowLevelCommandAssignments ::

Review Comment:
   Added a comment above.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1160347542


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr
+
+        // if there is only one update and it is an exact match, return the assigned expression
+        case Seq(matchedUpdate) if isExactMatch(matchedUpdate, col) =>
+          applyUpdate(matchedUpdate.expr, col, addError, colPath :+ col.name)
+
+        // if there are matches only for children
+        case matchedUpdates if !hasExactMatch(matchedUpdates, col) =>
+          val newColPath = colPath :+ col.name
+          col.dataType match {
+            case colType: StructType =>
+              // build field expressions
+              val fieldExprs = colType.fields.zipWithIndex.map { case (field, ordinal) =>
+                GetStructField(col, ordinal, Some(field.name))
+              }
+
+              // recursively apply this method to nested fields
+              val updatedFieldExprs = applyUpdates(
+                matchedUpdates.map(update => update.copy(ref = update.ref.tail)),
+                colType.toAttributes,
+                fieldExprs,
+                addError,
+                newColPath)
+
+              // construct a new struct with updated field expressions
+              toNamedStruct(colType, updatedFieldExprs)
+
+            case otherType =>
+              addError(
+                "Updating nested fields is only supported for StructType but " +
+                s"${newColPath.quoted} is of type $otherType")
+              col
+          }
+
+        // if there are conflicting updates, throw an exception
+        // there are two illegal scenarios:
+        // - multiple updates to the same column
+        // - updates to a top-level struct and its nested fields (like a.b and a.b.c)
+        case matchedUpdates if hasExactMatch(matchedUpdates, col) =>
+          val conflictingColNames = matchedUpdates.map(update => (colPath ++ update.ref).quoted)
+          addError("Update conflicts for columns: " + conflictingColNames.distinct.mkString(", "))
+          col
+      }
+    }
+  }
+
+  private def toNamedStruct(structType: StructType, fieldExprs: Seq[Expression]): Expression = {
+    val namedStructExprs = structType.fields.zip(fieldExprs).flatMap { case (field, expr) =>
+      Seq(Literal(field.name), expr)
+    }
+    CreateNamedStruct(namedStructExprs)
+  }
+
+  private def hasExactMatch(updates: Seq[ColumnUpdate], col: NamedExpression): Boolean = {
+    updates.exists(isExactMatch(_, col))
+  }
+
+  private def isExactMatch(update: ColumnUpdate, col: NamedExpression): Boolean = {
+    update.ref match {
+      case Seq(namePart) if conf.resolver(namePart, col.name) => true
+      case _ => false
+    }
+  }
+
+  private def applyUpdate(
+      value: Expression,
+      col: Attribute,
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    (value.dataType, col.dataType) match {
+      // no need to reorder inner fields or cast if types are equal ignoring nullability
+      case (valueType, colType) if valueType.sameType(colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        validateAssignment(valueType, colType, addError, colPath)
+        value
+
+      case (valueType: StructType, colType: StructType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveStructType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: ArrayType, colType: ArrayType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveArrayType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: MapType, colType: MapType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveMapType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType, colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+
+        val colTypeHasCharVarchar = CharVarcharUtils.hasCharVarchar(colType)
+        val colTypeWithoutCharVarchar = if (colTypeHasCharVarchar) {
+          CharVarcharUtils.replaceCharVarcharWithString(colType)
+        } else {
+          colType
+        }
+
+        validateAssignment(valueType, colTypeWithoutCharVarchar, addError, colPath)
+
+        val casted = TableOutputResolver.cast(
+          value, colTypeWithoutCharVarchar,
+          conf, colPath.quoted)
+
+        if (conf.charVarcharAsString || !colTypeHasCharVarchar) {
+          casted
+        } else {
+          CharVarcharUtils.stringLengthCheck(casted, colType)
+        }
+    }
+  }
+
+  private def validateAssignment(
+      valueType: DataType,
+      expectedType: DataType,
+      addError: String => Unit,
+      colPath: Seq[String]): Unit = {
+
+    conf.storeAssignmentPolicy match {
+      case StoreAssignmentPolicy.STRICT | StoreAssignmentPolicy.ANSI =>
+        DataType.canWrite(
+          valueType, expectedType, byName = true, conf.resolver, colPath.quoted,
+          conf.storeAssignmentPolicy, addError)
+
+      case _ => // OK
+    }
+  }
+
+  /**
+   * Checks whether assignments are aligned and are compatible with table columns.
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to check
+   * @return true if the assignments are aligned
+   */
+  def aligned(attrs: Seq[Attribute], assignments: Seq[Assignment]): Boolean = {
+    if (attrs.size != assignments.size) {
+      return false
+    }
+
+    attrs.zip(assignments).forall { case (attr, assignment) =>
+      val key = assignment.key
+      val value = assignment.value
+
+      val attrType = CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType)
+
+      sameRef(toRef(key), toRef(attr)) &&
+        DataType.equalsIgnoreCompatibleNullability(value.dataType, attrType) &&
+        (attr.nullable || !value.nullable)
+    }
+  }
+
+  private def sameRef(ref: Seq[String], otherRef: Seq[String]): Boolean = {
+    ref.size == otherRef.size && ref.zip(otherRef).forall { case (namePart, otherNamePart) =>
+      conf.resolver(namePart, otherNamePart)
+    }
+  }
+
+  private def toRef(expr: Expression): Seq[String] = expr match {
+    case attr: AttributeReference =>
+      Seq(attr.name)
+    case Alias(child, _) =>
+      toRef(child)
+    case GetStructField(child, _, Some(name)) =>
+      toRef(child) :+ name
+    case other: ExtractValue =>

Review Comment:
   Actually, I can add support for those expressions here but fail temporarily in the rewrite logic.
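   
   A rough sketch of how that branch in `toRef` could look (hypothetical; the final error class and message may differ):
   
   ```scala
   // Hypothetical fallback for keys we cannot align yet, such as
   // array element or map value updates.
   case other: ExtractValue =>
     throw new AnalysisException(
       s"Updating a field referenced via '${other.sql}' is not supported yet")
   ```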





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1160349040


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns

Review Comment:
   Done.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1128943386


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr

Review Comment:
   Note: I am skipping char/varchar length checks when a field is assigned to itself (i.e., it did not change).
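   
   For example (hypothetical table, not a test from this PR):
   
   ```scala
   // t(c1 CHAR(3), c2 INT): c1 has no assignment, so it is carried over
   // as-is and no length check is generated for it.
   spark.sql("UPDATE t SET c2 = 1")
   
   // An explicit assignment to c1 still gets the usual length check:
   spark.sql("UPDATE t SET c1 = 'abcd'")  // violates the CHAR(3) length check
   ```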





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1128943046


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   One notable difference is not using `AssertNotNull` and instead relying on the same framework we have for V2 tables. In particular, we check the store assignment policy and whether the incoming and target attributes are compatible. As a consequence, we get analysis errors rather than runtime errors.
   
   However, the main contribution of this PR is `AssignmentUtils`, which aligns assignments with table attributes. It is a prerequisite for rewriting UPDATEs; right now, we only rewrite DELETEs. The new utility also handles casts and char/varchar types.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127079791


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AlignRowLevelCommandAssignments.scala:
##########
@@ -0,0 +1,62 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+
+/**
+ * A rule that aligns assignments in row-level operations.
+ *
+ * Note that this rule must be run after resolving default values but before rewriting row-level

Review Comment:
   We may need to think about a reliable way to check if default values have been resolved. Right now, it simply relies on the order of rules, which is fragile.



##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AlignRowLevelCommandAssignments.scala:
##########
@@ -0,0 +1,62 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+
+/**
+ * A rule that aligns assignments in row-level operations.
+ *
+ * Note that this rule must be run after resolving default values but before rewriting row-level

Review Comment:
   We need to think about a reliable way to check if default values have been resolved. Right now, it simply relies on the order of rules, which is fragile.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127343402


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TableOutputResolver.scala:
##########
@@ -129,7 +129,7 @@ object TableOutputResolver {
     }
   }
 
-  private def checkNullability(
+  private[analysis] def checkNullability(

Review Comment:
   I want to reuse code from `TableOutputResolver` wherever possible. However, adding assignment processing directly there would make that class even more complicated than it is today. That's why I decided to open up some methods instead.
   
   Feedback is appreciated. I could add `applyUpdate` to `TableOutputResolver` too.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1167348702


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,167 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, CreateNamedStruct, Expression, GetStructField, Literal}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This allows Spark to
+   * construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = attrs.map { attr =>
+      applyAssignments(
+        col = restoreActualType(attr),
+        colExpr = attr,
+        assignments,
+        addError = err => errors += err,
+        colPath = Seq(attr.name))
+    }
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyAssignments(
+      col: Attribute,
+      colExpr: Expression,
+      assignments: Seq[Assignment],
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    val (exactAssignments, otherAssignments) = assignments.partition { assignment =>
+      assignment.key.semanticEquals(colExpr)
+    }
+
+    val fieldAssignments = otherAssignments.filter { assignment =>
+      assignment.key.exists(_.semanticEquals(colExpr))
+    }
+
+    if (exactAssignments.size > 1) {

Review Comment:
   @cloud-fan, I've changed the approach to avoid deconstructing references. However, I decided to keep the validation while recursing rather than doing it in a separate step, as we discussed. When I tried to implement that idea, it turned out to be pretty involved with lots of edge cases. For instance, we can't have multiple assignments per top-level key, yet keys can reference top-level fields many times: `a.b.c` and `a.b.d` are allowed, but `a.b` and `a.b.c` are not.
   
   It felt easier to validate while recursing, similar to `TableOutputResolver`.
   
   Let me know what you think.
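   
   To make those edge cases concrete (SQL sketches against a hypothetical table t with column a STRUCT<b: STRUCT<c: INT, d: INT>>):
   
   ```scala
   // Allowed: two distinct leaf fields under the same struct.
   spark.sql("UPDATE t SET a.b.c = 1, a.b.d = 2")
   
   // Rejected as conflicting: a struct and one of its nested fields.
   spark.sql("UPDATE t SET a.b = named_struct('c', 1, 'd', 2), a.b.c = 3")
   
   // Rejected: multiple assignments to the same key.
   spark.sql("UPDATE t SET a.b.c = 1, a.b.c = 2")
   ```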





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1167348702


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,167 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, CreateNamedStruct, Expression, GetStructField, Literal}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This allows Spark to
+   * construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = attrs.map { attr =>
+      applyAssignments(
+        col = restoreActualType(attr),
+        colExpr = attr,
+        assignments,
+        addError = err => errors += err,
+        colPath = Seq(attr.name))
+    }
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyAssignments(
+      col: Attribute,
+      colExpr: Expression,
+      assignments: Seq[Assignment],
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    val (exactAssignments, otherAssignments) = assignments.partition { assignment =>
+      assignment.key.semanticEquals(colExpr)
+    }
+
+    val fieldAssignments = otherAssignments.filter { assignment =>
+      assignment.key.exists(_.semanticEquals(colExpr))
+    }
+
+    if (exactAssignments.size > 1) {

Review Comment:
   @cloud-fan, I've changed the approach to avoid deconstructing references. However, I decided to keep the validation while recursing rather than doing it in a separate step, as we discussed. When I tried to implement that idea, it turned out to be pretty involved with lots of edge cases. For instance, we can't have multiple assignments per top-level key, yet keys can reference top-level fields many times: `a.b.c` and `a.b.d` are allowed, but `a.b` and `a.b.c` are not. It felt easier to validate while recursing, similar to `TableOutputResolver`.
   
   Let me know what you think.





[GitHub] [spark] cloud-fan closed pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan closed pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes
URL: https://github.com/apache/spark/pull/40308




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127081206


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   This resolution was substantially different from what we do in normal writes or in data sources that actually support row-level operations. I am migrating to logic that is close to the by-name resolution used for V2 tables.
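   
   As a quick illustration of the by-name behavior (hypothetical table, not a test from this PR):
   
   ```scala
   // t(c1 BIGINT): the INT literal is widened with an implicit cast,
   // just like in a by-name v2 INSERT.
   spark.sql("UPDATE t SET c1 = 1")
   
   // Under the ANSI store assignment policy this is rejected during
   // analysis because STRING cannot be stored into BIGINT:
   spark.sql("UPDATE t SET c1 = 'one'")
   ```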





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127081206


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   This resolution was substantially different from what we do in normal writes or in data sources that actually support row-level operations. I am migrating to logic that is close to the by-name resolution used for v2 tables.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127342306


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr
+
+        // if there is only one update and it is an exact match, return the assigned expression
+        case Seq(matchedUpdate) if isExactMatch(matchedUpdate, col) =>
+          applyUpdate(matchedUpdate.expr, col, addError, colPath :+ col.name)
+
+        // if there are matches only for children
+        case matchedUpdates if !hasExactMatch(matchedUpdates, col) =>
+          val newColPath = colPath :+ col.name
+          col.dataType match {
+            case colType: StructType =>
+              // build field expressions
+              val fieldExprs = colType.fields.zipWithIndex.map { case (field, ordinal) =>
+                GetStructField(col, ordinal, Some(field.name))
+              }
+
+              // recursively apply this method to nested fields
+              val updatedFieldExprs = applyUpdates(
+                matchedUpdates.map(update => update.copy(ref = update.ref.tail)),
+                colType.toAttributes,
+                fieldExprs,
+                addError,
+                newColPath)
+
+              // construct a new struct with updated field expressions
+              toNamedStruct(colType, updatedFieldExprs)
+
+            case otherType =>
+              addError(
+                "Updating nested fields is only supported for StructType but " +
+                s"${newColPath.quoted} is of type $otherType")
+              col
+          }
+
+        // if there are conflicting updates, throw an exception
+        // there are two illegal scenarios:
+        // - multiple updates to the same column
+        // - updates to a top-level struct and its nested fields (like a.b and a.b.c)
+        case matchedUpdates if hasExactMatch(matchedUpdates, col) =>
+          val conflictingColNames = matchedUpdates.map(update => (colPath ++ update.ref).quoted)
+          addError("Update conflicts for columns: " + conflictingColNames.distinct.mkString(", "))
+          col
+      }
+    }
+  }
+
+  private def toNamedStruct(structType: StructType, fieldExprs: Seq[Expression]): Expression = {
+    val namedStructExprs = structType.fields.zip(fieldExprs).flatMap { case (field, expr) =>
+      Seq(Literal(field.name), expr)
+    }
+    CreateNamedStruct(namedStructExprs)
+  }
+
+  private def hasExactMatch(updates: Seq[ColumnUpdate], col: NamedExpression): Boolean = {
+    updates.exists(isExactMatch(_, col))
+  }
+
+  private def isExactMatch(update: ColumnUpdate, col: NamedExpression): Boolean = {
+    update.ref match {
+      case Seq(namePart) if conf.resolver(namePart, col.name) => true
+      case _ => false
+    }
+  }
+
+  private def applyUpdate(
+      value: Expression,
+      col: Attribute,
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    (value.dataType, col.dataType) match {
+      // no need to reorder inner fields or cast if types are equal ignoring nullability
+      case (valueType, colType) if valueType.sameType(colType) =>

Review Comment:
   Any reason why this may not be safe? Without this, output expressions may contain quite a bit of redundant GetStructField/CreateNamedStruct code that is not always folded by the optimizer.
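   
   For reference, `sameType` compares types ignoring nullability (a small sketch with made-up types, assuming catalyst-internal access since `sameType` is `private[sql]`):
   ```
   import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

   val colType = StructType(Seq(StructField("n1", IntegerType, nullable = false)))
   val valueType = StructType(Seq(StructField("n1", IntegerType, nullable = true)))

   // This pair takes the pass-through branch (no struct rebuild) even though
   // nullability differs, which is why the branch still calls checkNullability.
   assert(colType.sameType(valueType))
   ```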




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1129904923


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   Also, the current logic does not seem to cover nullability of inner fields (e.g. when we assign a struct).
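   
   A minimal sketch of the inner-field case (made-up types, assuming catalyst-internal access since `equalsIgnoreCompatibleNullability` is `private[sql]`):
   ```
   import org.apache.spark.sql.types.{DataType, IntegerType, StructField, StructType}

   // Writing struct<n1: int NULL> into struct<n1: int NOT NULL> must be rejected,
   // which requires checking nullability of inner fields, not just the top level.
   val source = StructType(Seq(StructField("n1", IntegerType, nullable = true)))
   val target = StructType(Seq(StructField("n1", IntegerType, nullable = false)))

   assert(!DataType.equalsIgnoreCompatibleNullability(source, target))
   assert(DataType.equalsIgnoreCompatibleNullability(target, source)) // the reverse is fine
   ```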




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1134549074


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I don't feel strongly about this. It looks like we agree about unifying the behavior, which I think is important. I'll wait a bit for others to comment; if there is no other input, I'll migrate INSERTs to runtime checks.




[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1132477504


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).

Review Comment:
   Thinking about this more, I think this is required by the row-level operation framework, so we have no choice. Data sources can skip it (return true from `skipSchemaResolution`) and use a more advanced implementation if they can.
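   
   A minimal sketch of how the gating could look (the rule name and placement are illustrative, not the PR's exact code):
   ```
   import org.apache.spark.sql.catalyst.analysis.AssignmentUtils
   import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, UpdateTable}
   import org.apache.spark.sql.catalyst.rules.Rule

   object AlignUpdateAssignments extends Rule[LogicalPlan] {
     def apply(plan: LogicalPlan): LogicalPlan = plan.resolveOperators {
       // Tables that opt out via skipSchemaResolution keep their own handling.
       case u: UpdateTable if u.resolved && !u.skipSchemaResolution =>
         u.copy(assignments = AssignmentUtils.alignAssignments(u.table.output, u.assignments))
     }
   }
   ```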




[GitHub] [spark] viirya commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "viirya (via GitHub)" <gi...@apache.org>.
viirya commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1133502235


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I just don't feel that moving an analysis error to a runtime error is a breaking behavior change. Failure scenarios are still failure scenarios, though the timing is different. I'm open to having a config for this if others think it is necessary.




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1161935288


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   > it's better to operate on the resolved column expressions
   
   I agree, let's see if we can avoid the conversion to references.
   
   > We validate the map we got in step 2: for each top-level column, its expressions must be of the same tree height (to avoid updating both 'a.b' and 'a.b.c')
   
   Could you elaborate a bit on how you see the tree height check? Like adding a separate method for computing expression height? What about cases when it is OK to have different expression heights, like 'a.b.n1' and 'a.c' where a, b, c are all structs?
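   
   To make the counterexample concrete (made-up schema):
   ```
   import org.apache.spark.sql.catalyst.expressions.{AttributeReference, GetStructField}
   import org.apache.spark.sql.types.{IntegerType, StructField, StructType}

   // a: struct<b: struct<n1: int>, c: int>
   val aType = StructType(Seq(
     StructField("b", StructType(Seq(StructField("n1", IntegerType)))),
     StructField("c", IntegerType)))
   val a = AttributeReference("a", aType)()

   // SET a.b.n1 = 1, a.c = 2 is a valid pair of keys, yet the trees differ in height:
   val abn1 = GetStructField(GetStructField(a, 0, Some("b")), 0, Some("n1")) // depth 3
   val ac = GetStructField(a, 1, Some("c"))                                  // depth 2
   ```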




[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1162922025


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   Ah, you are right, we can't simply check the tree height. I think a better way is to use an `ExpressionSet` to make sure these column expressions have no duplication.
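   
   A rough sketch of that check (names are illustrative); note it only catches exact duplicates, so a parent/child overlap such as `a.b` vs `a.b.c` still has to be detected during the recursion:
   ```
   import org.apache.spark.sql.catalyst.expressions.ExpressionSet
   import org.apache.spark.sql.catalyst.plans.logical.Assignment

   // Resolved assignment keys are deterministic extract chains, so an
   // ExpressionSet compares them semantically (keys differing only in
   // case should collapse under case-insensitive analysis).
   def checkNoDuplicateKeys(assignments: Seq[Assignment]): Unit = {
     val keys = assignments.map(_.key)
     assert(ExpressionSet(keys).size == keys.size, "conflicting updates to the same column")
   }
   ```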




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1166214000


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   @cloud-fan, any ideas on how to avoid deconstructing keys into `Seq[String]` when applying a set of assignments to a top-level attribute? The problem is that we recurse top to bottom in `applyUpdates`, whereas `assignment.key` is a chain of nested `GetStructField` calls whose outermost expression refers to the leaf column.
   
   I can see ways to perform the validation without converting keys to `Seq[String]`, but I don't see an easy way to avoid that in `applyUpdates`.
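   
   The shape mismatch, sketched (the helper mirrors the PR's `toRef` and is shown only for illustration):
   ```
   import org.apache.spark.sql.catalyst.expressions.{AttributeReference, Expression, GetStructField}

   // For a key `a.b.c`, the resolved expression nests leaf-outward:
   //   GetStructField(GetStructField(a, "b"), "c")   -- the outermost node is the leaf c
   // while top-down recursion wants root-first name parts: Seq("a", "b", "c"),
   // so the key is flattened before recursing:
   def toNameParts(expr: Expression): Seq[String] = expr match {
     case attr: AttributeReference => Seq(attr.name)
     case GetStructField(child, _, Some(name)) => toNameParts(child) :+ name
     case _ => Nil // other extract forms elided in this sketch
   }
   ```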




[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1166650749


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   I'm thinking about this
   ```
   def alignAssignments(
       assignments: Seq[Assignment],
       attrs: Seq[Attribute]): Seq[Assignment] = {
     // use ExpressionSet to check assignments have no duplication
     ...
     attrs.map { attr =>
       Assignment(attr, applyUpdates(assignments, attr))
     }
   }
   
   def applyUpdates(
       assignments: Seq[Assignment],
       col: Expression): Expression = {
     val (exactAssignments, others) = assignments.partition { assignment =>
       assignment.key.semanticEquals(col)
     }
     val relatedAssignments = others.filter { assignment =>
       assignment.key.exists(_.semanticEquals(col))
     }
     assert(exactAssignments.length <= 1)
     if (exactAssignments.nonEmpty) {
       if (relatedAssignments.nonEmpty) fail...
       exactAssignments.head.value
     } else {
       if (relatedAssignments.isEmpty) {
         col
       } else {
         assert(col.dataType.isInstanceOf[StructType])
         CreateNamedStruct(col.dataType.asInstanceOf[StructType].fields.toSeq.flatMap { field =>
           Literal(field.name) :: applyUpdates(relatedAssignments, ExtractValue(col, Literal(field.name), conf.resolver)) :: Nil
         })
       }
     }
       }
     }
   }
   ```
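   
   One note on the sketch: `CreateNamedStruct` expects alternating name/value children, and the `semanticEquals`/`exists` checks only work on fully resolved keys, which is why this has to run after resolution.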




[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1132482968


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr
+
+        // if there is only one update and it is an exact match, return the assigned expression
+        case Seq(matchedUpdate) if isExactMatch(matchedUpdate, col) =>
+          applyUpdate(matchedUpdate.expr, col, addError, colPath :+ col.name)
+
+        // if there are matches only for children
+        case matchedUpdates if !hasExactMatch(matchedUpdates, col) =>
+          val newColPath = colPath :+ col.name
+          col.dataType match {
+            case colType: StructType =>
+              // build field expressions
+              val fieldExprs = colType.fields.zipWithIndex.map { case (field, ordinal) =>
+                GetStructField(col, ordinal, Some(field.name))
+              }
+
+              // recursively apply this method to nested fields
+              val updatedFieldExprs = applyUpdates(
+                matchedUpdates.map(update => update.copy(ref = update.ref.tail)),
+                colType.toAttributes,
+                fieldExprs,
+                addError,
+                newColPath)
+
+              // construct a new struct with updated field expressions
+              toNamedStruct(colType, updatedFieldExprs)
+
+            case otherType =>
+              addError(
+                "Updating nested fields is only supported for StructType but " +
+                s"${newColPath.quoted} is of type $otherType")
+              col
+          }
+
+        // if there are conflicting updates, throw an exception
+        // there are two illegal scenarios:
+        // - multiple updates to the same column
+        // - updates to a top-level struct and its nested fields (like a.b and a.b.c)
+        case matchedUpdates if hasExactMatch(matchedUpdates, col) =>
+          val conflictingColNames = matchedUpdates.map(update => (colPath ++ update.ref).quoted)
+          addError("Update conflicts for columns: " + conflictingColNames.distinct.mkString(", "))
+          col
+      }
+    }
+  }
+
+  private def toNamedStruct(structType: StructType, fieldExprs: Seq[Expression]): Expression = {
+    val namedStructExprs = structType.fields.zip(fieldExprs).flatMap { case (field, expr) =>
+      Seq(Literal(field.name), expr)
+    }
+    CreateNamedStruct(namedStructExprs)
+  }
+
+  private def hasExactMatch(updates: Seq[ColumnUpdate], col: NamedExpression): Boolean = {
+    updates.exists(isExactMatch(_, col))
+  }
+
+  private def isExactMatch(update: ColumnUpdate, col: NamedExpression): Boolean = {
+    update.ref match {
+      case Seq(namePart) if conf.resolver(namePart, col.name) => true
+      case _ => false
+    }
+  }
+
+  private def applyUpdate(
+      value: Expression,
+      col: Attribute,
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    (value.dataType, col.dataType) match {
+      // no need to reorder inner fields or cast if types are equal ignoring nullability
+      case (valueType, colType) if valueType.sameType(colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        validateAssignment(valueType, colType, addError, colPath)
+        value
+
+      case (valueType: StructType, colType: StructType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveStructType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: ArrayType, colType: ArrayType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveArrayType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType: MapType, colType: MapType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveMapType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)
+
+      case (valueType, colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+
+        val colTypeHasCharVarchar = CharVarcharUtils.hasCharVarchar(colType)
+        val colTypeWithoutCharVarchar = if (colTypeHasCharVarchar) {
+          CharVarcharUtils.replaceCharVarcharWithString(colType)
+        } else {
+          colType
+        }
+
+        validateAssignment(valueType, colTypeWithoutCharVarchar, addError, colPath)
+
+        val casted = TableOutputResolver.cast(
+          value, colTypeWithoutCharVarchar,
+          conf, colPath.quoted)
+
+        if (conf.charVarcharAsString || !colTypeHasCharVarchar) {
+          casted
+        } else {
+          CharVarcharUtils.stringLengthCheck(casted, colType)
+        }
+    }
+  }
+
+  private def validateAssignment(
+      valueType: DataType,
+      expectedType: DataType,
+      addError: String => Unit,
+      colPath: Seq[String]): Unit = {
+
+    conf.storeAssignmentPolicy match {
+      case StoreAssignmentPolicy.STRICT | StoreAssignmentPolicy.ANSI =>
+        DataType.canWrite(
+          valueType, expectedType, byName = true, conf.resolver, colPath.quoted,
+          conf.storeAssignmentPolicy, addError)
+
+      case _ => // OK
+    }
+  }
+
+  /**
+   * Checks whether assignments are aligned and are compatible with table columns.
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to check
+   * @return true if the assignments are aligned
+   */
+  def aligned(attrs: Seq[Attribute], assignments: Seq[Assignment]): Boolean = {
+    if (attrs.size != assignments.size) {
+      return false
+    }
+
+    attrs.zip(assignments).forall { case (attr, assignment) =>
+      val key = assignment.key
+      val value = assignment.value
+
+      val attrType = CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType)
+
+      sameRef(toRef(key), toRef(attr)) &&
+        DataType.equalsIgnoreCompatibleNullability(value.dataType, attrType) &&
+        (attr.nullable || !value.nullable)
+    }
+  }
+
+  private def sameRef(ref: Seq[String], otherRef: Seq[String]): Boolean = {
+    ref.size == otherRef.size && ref.zip(otherRef).forall { case (namePart, otherNamePart) =>
+      conf.resolver(namePart, otherNamePart)
+    }
+  }
+
+  private def toRef(expr: Expression): Seq[String] = expr match {
+    case attr: AttributeReference =>
+      Seq(attr.name)
+    case Alias(child, _) =>
+      toRef(child)
+    case GetStructField(child, _, Some(name)) =>
+      toRef(child) :+ name
+    case other: ExtractValue =>

Review Comment:
   how about `GetArrayStructFields`?
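   
   A hedged sketch of how the extraction could cover it (`GetArrayStructFields` is the resolved form of field access through an array of structs; the error branch is illustrative):
   ```
   import org.apache.spark.sql.catalyst.expressions.{AttributeReference, Expression, GetArrayStructFields, GetStructField}

   def toRef(expr: Expression): Seq[String] = expr match {
     case attr: AttributeReference =>
       Seq(attr.name)
     case GetStructField(child, _, Some(name)) =>
       toRef(child) :+ name
     case GetArrayStructFields(child, field, _, _, _) =>
       toRef(child) :+ field.name
     case other =>
       throw new UnsupportedOperationException(s"Cannot convert to a reference: $other")
   }
   ```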





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127340319


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(

Review Comment:
   The logic in this method tries to follow the by-name resolution we have for writes to V2 tables.
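   
   To illustrate, a minimal sketch of the alignment behavior with made-up column names (not code from this PR):
   
   ```scala
   import org.apache.spark.sql.catalyst.expressions.{AttributeReference, Literal}
   import org.apache.spark.sql.catalyst.plans.logical.Assignment
   import org.apache.spark.sql.types.IntegerType
   
   // Table columns c1, c2 and the statement UPDATE t SET c2 = 1.
   val c1 = AttributeReference("c1", IntegerType)()
   val c2 = AttributeReference("c2", IntegerType)()
   val input = Seq(Assignment(c2, Literal(1)))
   
   // Aligned output: one assignment per table column, in table order,
   // with untouched columns assigned to their current values.
   val expected = Seq(Assignment(c1, c1), Assignment(c2, Literal(1)))
   ```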





[GitHub] [spark] aokolnychyi commented on pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on PR #40308:
URL: https://github.com/apache/spark/pull/40308#issuecomment-1513265266

   Thanks for reviewing, @cloud-fan @huaxingao @dongjoon-hyun @viirya @johanl-db!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1169050872


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveRowLevelCommandAssignments.scala:
##########
@@ -0,0 +1,103 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.catalyst.expressions.{AttributeReference, Cast}
+import org.apache.spark.sql.catalyst.expressions.objects.AssertNotNull
+import org.apache.spark.sql.catalyst.plans.logical.{Assignment, LogicalPlan, MergeIntoTable, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.trees.TreePattern.COMMAND
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.SupportsRowLevelOperations
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+
+/**
+ * A rule that resolves assignments in row-level commands.
+ *
+ * Note that this rule must be run before rewriting row-level commands into executable plans.
+ * This rule does not apply to tables that accept any schema. Such tables must inject their own
+ * rules to resolve assignments.
+ */
+object ResolveRowLevelCommandAssignments extends Rule[LogicalPlan] {
+
+  override def apply(plan: LogicalPlan): LogicalPlan = plan.resolveOperatorsWithPruning(
+    _.containsPattern(COMMAND), ruleId) {
+    case u: UpdateTable if !u.skipSchemaResolution && u.resolved &&
+        supportsRowLevelOperations(u.table) && !u.aligned =>
+      validateStoreAssignmentPolicy()
+      val newTable = u.table.transform {
+        case r: DataSourceV2Relation =>
+          r.copy(output = r.output.map(CharVarcharUtils.cleanAttrMetadata))
+      }
+      val newAssignments = AssignmentUtils.alignAssignments(u.table.output, u.assignments)
+      u.copy(table = newTable, assignments = newAssignments)
+
+    case u: UpdateTable if !u.skipSchemaResolution && u.resolved && !u.aligned =>
+      resolveAssignments(u)
+
+    case m: MergeIntoTable if !m.skipSchemaResolution && m.resolved =>
+      resolveAssignments(m)
+  }
+
+  private def validateStoreAssignmentPolicy(): Unit = {
+    // SPARK-28730: LEGACY store assignment policy is disallowed in data source v2
+    if (conf.storeAssignmentPolicy == StoreAssignmentPolicy.LEGACY) {
+      throw QueryCompilationErrors.legacyStoreAssignmentPolicyError()
+    }
+  }
+
+  private def supportsRowLevelOperations(table: LogicalPlan): Boolean = {
+    EliminateSubqueryAliases(table) match {
+      case DataSourceV2Relation(_: SupportsRowLevelOperations, _, _, _, _) => true
+      case _ => false
+    }
+  }
+
+  private def resolveAssignments(p: LogicalPlan): LogicalPlan = {

Review Comment:
   Copied from `ResolveOutputRelation` to preserve the existing behavior for data sources that rely on custom implementations.






[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127081206


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   This resolution was substantially different from what we do for normal writes or for data sources that actually support row-level operations.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1160345085


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3334,9 +3337,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
           v2Write
         }
 
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I have removed this resolution, but the logic is the same: runtime null checks, char/varchar length checks, etc.
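   
   For context, a rough sketch of what such a runtime null check looks like at the expression level (`withRuntimeNullCheck` is a made-up helper; the real logic lives in `TableOutputResolver` and this rule):
   
   ```scala
   import org.apache.spark.sql.catalyst.expressions.{Cast, Expression}
   import org.apache.spark.sql.catalyst.expressions.objects.AssertNotNull
   import org.apache.spark.sql.types.DataType
   
   // Wrap a value assigned to a non-nullable column so the check fails per row
   // at runtime instead of rejecting the whole statement at analysis time.
   def withRuntimeNullCheck(
       value: Expression,
       targetType: DataType,
       targetNullable: Boolean): Expression = {
     val casted = if (value.dataType.sameType(targetType)) value else Cast(value, targetType)
     if (!targetNullable && casted.nullable) AssertNotNull(casted) else casted
   }
   ```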





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1160350074


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,275 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, CreateNamedStruct, Expression, ExtractValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{ArrayType, DataType, MapType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table columns
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(
+      updates: Seq[ColumnUpdate],
+      cols: Seq[Attribute],
+      colExprs: Seq[Expression],
+      addError: String => Unit,
+      colPath: Seq[String] = Nil): Seq[Expression] = {
+
+    // iterate through columns at the current level and find matching updates
+    cols.zip(colExprs).map { case (col, colExpr) =>
+      // find matches for this column or any of its children
+      val prefixMatchedUpdates = updates.filter(update => conf.resolver(update.ref.head, col.name))
+      prefixMatchedUpdates match {
+        // if there is no exact match and no match for children, return the column expr as is
+        case matchedUpdates if matchedUpdates.isEmpty =>
+          colExpr
+
+        // if there is only one update and it is an exact match, return the assigned expression
+        case Seq(matchedUpdate) if isExactMatch(matchedUpdate, col) =>
+          applyUpdate(matchedUpdate.expr, col, addError, colPath :+ col.name)
+
+        // if there are matches only for children
+        case matchedUpdates if !hasExactMatch(matchedUpdates, col) =>
+          val newColPath = colPath :+ col.name
+          col.dataType match {
+            case colType: StructType =>
+              // build field expressions
+              val fieldExprs = colType.fields.zipWithIndex.map { case (field, ordinal) =>
+                GetStructField(col, ordinal, Some(field.name))
+              }
+
+              // recursively apply this method to nested fields
+              val updatedFieldExprs = applyUpdates(
+                matchedUpdates.map(update => update.copy(ref = update.ref.tail)),
+                colType.toAttributes,
+                fieldExprs,
+                addError,
+                newColPath)
+
+              // construct a new struct with updated field expressions
+              toNamedStruct(colType, updatedFieldExprs)
+
+            case otherType =>
+              addError(
+                "Updating nested fields is only supported for StructType but " +
+                s"${newColPath.quoted} is of type $otherType")
+              col
+          }
+
+        // if there are conflicting updates, throw an exception
+        // there are two illegal scenarios:
+        // - multiple updates to the same column
+        // - updates to a top-level struct and its nested fields (like a.b and a.b.c)
+        case matchedUpdates if hasExactMatch(matchedUpdates, col) =>
+          val conflictingColNames = matchedUpdates.map(update => (colPath ++ update.ref).quoted)
+          addError("Update conflicts for columns: " + conflictingColNames.distinct.mkString(", "))
+          col
+      }
+    }
+  }
+
+  private def toNamedStruct(structType: StructType, fieldExprs: Seq[Expression]): Expression = {
+    val namedStructExprs = structType.fields.zip(fieldExprs).flatMap { case (field, expr) =>
+      Seq(Literal(field.name), expr)
+    }
+    CreateNamedStruct(namedStructExprs)
+  }
+
+  private def hasExactMatch(updates: Seq[ColumnUpdate], col: NamedExpression): Boolean = {
+    updates.exists(isExactMatch(_, col))
+  }
+
+  private def isExactMatch(update: ColumnUpdate, col: NamedExpression): Boolean = {
+    update.ref match {
+      case Seq(namePart) if conf.resolver(namePart, col.name) => true
+      case _ => false
+    }
+  }
+
+  private def applyUpdate(
+      value: Expression,
+      col: Attribute,
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    (value.dataType, col.dataType) match {
+      // no need to reorder inner fields or cast if types are equal ignoring nullability
+      case (valueType, colType) if valueType.sameType(colType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        validateAssignment(valueType, colType, addError, colPath)
+        value
+
+      case (valueType: StructType, colType: StructType) =>
+        TableOutputResolver.checkNullability(value, col, conf, addError, colPath)
+        val resolvedValue = TableOutputResolver.resolveStructType(
+          value, valueType, colType,
+          conf, addError, colPath)
+        resolvedValue.getOrElse(col)

Review Comment:
   It doesn't really matter, but this makes more sense; let me change it.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1167348702


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,167 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, CreateNamedStruct, Expression, GetStructField, Literal}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This allows Spark to
+   * construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = attrs.map { attr =>
+      applyAssignments(
+        col = restoreActualType(attr),
+        colExpr = attr,
+        assignments,
+        addError = err => errors += err,
+        colPath = Seq(attr.name))
+    }
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyAssignments(
+      col: Attribute,
+      colExpr: Expression,
+      assignments: Seq[Assignment],
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    val (exactAssignments, otherAssignments) = assignments.partition { assignment =>
+      assignment.key.semanticEquals(colExpr)
+    }
+
+    val fieldAssignments = otherAssignments.filter { assignment =>
+      assignment.key.exists(_.semanticEquals(colExpr))
+    }
+
+    if (exactAssignments.size > 1) {

Review Comment:
   @cloud-fan, I've changed the approach to avoid deconstructing references. However, I decided to keep the validation while recursing rather than doing it in a separate step as we discussed. When I tried to implement that idea, it turned out to be pretty involved, with lots of edge cases. For instance, we can't have multiple assignments to the same key, yet keys may reference the same top-level field many times: `a.b.c` and `a.b.d` are allowed, but not `a.b` together with `a.b.c`. It felt easier to validate while recursing, just like we do in `TableOutputResolver`.
   
   Let me know what you think.
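   
   A few examples of those rules, against a hypothetical table `t(i INT, s STRUCT<b: STRUCT<c: INT, d: INT>>)`:
   
   ```scala
   // Allowed: sibling nested fields can be updated independently.
   val ok = "UPDATE t SET s.b.c = 1, s.b.d = 2"
   
   // Rejected: two assignments to the same key.
   val dupKey = "UPDATE t SET i = 1, i = 2"
   
   // Rejected: a struct and one of its nested fields in the same statement.
   val overlap = "UPDATE t SET s.b = named_struct('c', 0, 'd', 0), s.b.c = 1"
   ```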





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1167349771


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/v2Commands.scala:
##########
@@ -778,6 +781,7 @@ case class Assignment(key: Expression, value: Expression) extends Expression
   override def dataType: DataType = throw new UnresolvedException("nullable")
   override def left: Expression = key
   override def right: Expression = value
+  override def sql: String = s"${key.sql} = ${value.sql}"

Review Comment:
   Added this for better error messages.
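   
   For example (values are illustrative):
   
   ```scala
   import org.apache.spark.sql.catalyst.expressions.{AttributeReference, Literal}
   import org.apache.spark.sql.catalyst.plans.logical.Assignment
   import org.apache.spark.sql.types.IntegerType
   
   val c = AttributeReference("c", IntegerType)()
   // Renders roughly as `c` = 1 in error messages instead of a full tree dump.
   Assignment(c, Literal(1)).sql
   ```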





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1167352539


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AlignRowLevelCommandAssignments.scala:
##########
@@ -0,0 +1,62 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.SupportsRowLevelOperations
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+
+/**
+ * A rule that aligns assignments with table attributes in row-level operations.
+ *
+ * Note that this rule must be run after resolving default values but before rewriting row-level
+ * commands into executable plans. This rule does not apply to tables that accept any schema.
+ * Such tables must inject their own rules to align assignments.
+ */
+object AlignRowLevelCommandAssignments extends Rule[LogicalPlan] {
+
+  override def apply(plan: LogicalPlan): LogicalPlan = plan resolveOperators {
+    case u: UpdateTable if u.resolved && requiresAlignment(u.table) && !u.aligned =>
+      val newTable = u.table.transform {
+        case r: DataSourceV2Relation =>
+          validateStoreAssignmentPolicy()
+          r.copy(output = r.output.map(CharVarcharUtils.cleanAttrMetadata))
+      }
+      val newAssignments = AssignmentUtils.alignAssignments(u.table.output, u.assignments)
+      u.copy(table = newTable, assignments = newAssignments)
+  }
+
+  private def validateStoreAssignmentPolicy(): Unit = {
+    // SPARK-28730: LEGACY store assignment policy is disallowed in data source v2
+    if (conf.storeAssignmentPolicy == StoreAssignmentPolicy.LEGACY) {
+      throw QueryCompilationErrors.legacyStoreAssignmentPolicyError()
+    }
+  }
+
+  private def requiresAlignment(table: LogicalPlan): Boolean = {
+    EliminateSubqueryAliases(table) match {
+      case r: NamedRelation if r.skipSchemaResolution => false
+      case DataSourceV2Relation(_: SupportsRowLevelOperations, _, _, _, _) => true

Review Comment:
   @viirya, I decided not to align assignments if tables don't extend `SupportsRowLevelOperations`. That way, data sources using their own implementations won't be affected. They can still use `AssignmentUtils`.





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1127343402


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TableOutputResolver.scala:
##########
@@ -129,7 +129,7 @@ object TableOutputResolver {
     }
   }
 
-  private def checkNullability(
+  private[analysis] def checkNullability(

Review Comment:
   I want to reuse code from `TableOutputResolver` wherever possible. However, adding assignment processing directly there would make that class even more complicated than it is today. That's why I decided to open up some methods instead.
   
   Feedback is appreciated.





[GitHub] [spark] aokolnychyi commented on pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on PR #40308:
URL: https://github.com/apache/spark/pull/40308#issuecomment-1457537193

   cc @huaxingao @cloud-fan @dongjoon-hyun @sunchao @viirya @gengliangwang 




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1128945996


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AlignRowLevelCommandAssignments.scala:
##########
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+
+/**
+ * A rule that aligns assignments in row-level operations.
+ *
+ * Note that this rule must be run after resolving default values but before rewriting row-level
+ * commands into executable plans. This rule does not apply to tables that accept any schema.
+ * Such tables must inject their own rules to align assignments.
+ */
+object AlignRowLevelCommandAssignments extends Rule[LogicalPlan] {
+
+  override def apply(plan: LogicalPlan): LogicalPlan = plan resolveOperators {
+    case u: UpdateTable if !u.skipSchemaResolution && u.resolved && !u.aligned =>

Review Comment:
   If that feels safer, we can apply this rule only when `table` implements `SupportsRowLevelOperations`?







[GitHub] [spark] cloud-fan commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1129825024


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   It's more common in other databases that, if you write values to a non-nullable column, a runtime null check is applied instead of the write being rejected because the value is nullable.
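   
   For illustration, assuming a hypothetical table `t(i INT NOT NULL, l BIGINT)` and an existing `spark` session:
   
   ```scala
   // Analyzes fine even though l is nullable; rows where l IS NULL make the
   // statement fail at runtime instead of being rejected during analysis.
   spark.sql("UPDATE t SET i = CAST(l AS INT)")
   ```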





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1136474892


##########
sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlignUpdateAssignmentsSuite.scala:
##########
@@ -0,0 +1,786 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.execution.command
+
+import java.util.Collections
+
+import org.mockito.ArgumentMatchers.any
+import org.mockito.Mockito.{mock, when}
+import org.mockito.invocation.InvocationOnMock
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.analysis.{AnalysisTest, Analyzer, FunctionRegistry, NoSuchTableException, ResolveSessionCatalog}
+import org.apache.spark.sql.catalyst.catalog.{InMemoryCatalog, SessionCatalog}
+import org.apache.spark.sql.catalyst.expressions.{ArrayTransform, AttributeReference, BooleanLiteral, Cast, CheckOverflowInTableInsert, CreateNamedStruct, EvalMode, GetStructField, IntegerLiteral, LambdaFunction, LongLiteral, MapFromArrays, StringLiteral}
+import org.apache.spark.sql.catalyst.expressions.objects.StaticInvoke
+import org.apache.spark.sql.catalyst.parser.CatalystSqlParser
+import org.apache.spark.sql.catalyst.plans.logical.{Assignment, LogicalPlan, UpdateTable}
+import org.apache.spark.sql.catalyst.rules.Rule
+import org.apache.spark.sql.connector.catalog.{CatalogManager, CatalogNotFoundException, CatalogV2Util, Column, ColumnDefaultValue, Identifier, Table, TableCapability, TableCatalog}
+import org.apache.spark.sql.connector.expressions.{LiteralValue, Transform}
+import org.apache.spark.sql.execution.datasources.v2.V2SessionCatalog
+import org.apache.spark.sql.internal.SQLConf
+import org.apache.spark.sql.internal.SQLConf.StoreAssignmentPolicy
+import org.apache.spark.sql.types.{BooleanType, IntegerType, StructType}
+
+class AlignUpdateAssignmentsSuite extends AnalysisTest {
+
+  private val primitiveTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT", nullable = false)
+      .add("l", "LONG")
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val nestedStructTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT")
+      .add(
+        "s",
+        "STRUCT<n_i: INT NOT NULL, n_s: STRUCT<dn_i: INT NOT NULL, dn_l: LONG>>",
+        nullable = false)
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val mapArrayTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT")
+      .add("a", "ARRAY<STRUCT<ac_1: INT, ac_2: INT>>")
+      .add("m", "MAP<STRING, STRING>")
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val charVarcharTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("c", "CHAR(5)")
+      .add(
+        "s",
+        "STRUCT<n_i: INT, n_vc: VARCHAR(5)>",
+        nullable = false)
+      .add(
+        "a",
+        "ARRAY<STRUCT<n_i: INT, n_vc: VARCHAR(5)>>",
+        nullable = false)
+      .add(
+        "mk",
+        "MAP<STRUCT<n_i: INT, n_vc: VARCHAR(5)>, STRING>",
+        nullable = false)
+      .add(
+        "mv",
+        "MAP<STRING, STRUCT<n_i: INT, n_vc: VARCHAR(5)>>",
+        nullable = false)
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val acceptsAnySchemaTable: Table = {
+    val t = mock(classOf[Table])
+    val schema = new StructType()
+      .add("i", "INT", nullable = false)
+      .add("l", "LONG")
+      .add("txt", "STRING")
+    when(t.columns()).thenReturn(CatalogV2Util.structTypeToV2Columns(schema))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    when(t.capabilities()).thenReturn(Collections.singleton(TableCapability.ACCEPT_ANY_SCHEMA))
+    t
+  }
+
+  private val defaultValuesTable: Table = {
+    val t = mock(classOf[Table])
+    val iDefault = new ColumnDefaultValue("42", LiteralValue(42, IntegerType))
+    when(t.columns()).thenReturn(Array(
+      Column.create("b", BooleanType, true, null, null),
+      Column.create("i", IntegerType, true, null, iDefault, null)))
+    when(t.partitioning()).thenReturn(Array.empty[Transform])
+    t
+  }
+
+  private val v2Catalog: TableCatalog = {
+    val newCatalog = mock(classOf[TableCatalog])
+    when(newCatalog.loadTable(any())).thenAnswer((invocation: InvocationOnMock) => {
+      val ident = invocation.getArgument[Identifier](0)
+      ident.name match {
+        case "primitive_table" => primitiveTable
+        case "nested_struct_table" => nestedStructTable
+        case "map_array_table" => mapArrayTable
+        case "char_varchar_table" => charVarcharTable
+        case "accepts_any_schema_table" => acceptsAnySchemaTable
+        case "default_values_table" => defaultValuesTable
+        case name => throw new NoSuchTableException(Seq(name))
+      }
+    })
+    when(newCatalog.name()).thenReturn("cat")
+    newCatalog
+  }
+
+  private val v1SessionCatalog: SessionCatalog = new SessionCatalog(
+    new InMemoryCatalog(),
+    FunctionRegistry.builtin,
+    new SQLConf())
+
+  private val v2SessionCatalog: TableCatalog = new V2SessionCatalog(v1SessionCatalog)
+
+  private val catalogManager: CatalogManager = {
+    val manager = mock(classOf[CatalogManager])
+    when(manager.catalog(any())).thenAnswer((invocation: InvocationOnMock) => {
+      invocation.getArgument[String](0) match {
+        case "testcat" => v2Catalog
+        case CatalogManager.SESSION_CATALOG_NAME => v2SessionCatalog
+        case name => throw new CatalogNotFoundException(s"No such catalog: $name")
+      }
+    })
+    when(manager.currentCatalog).thenReturn(v2Catalog)
+    when(manager.currentNamespace).thenReturn(Array.empty[String])
+    when(manager.v1SessionCatalog).thenReturn(v1SessionCatalog)
+    when(manager.v2SessionCatalog).thenReturn(v2SessionCatalog)
+    manager
+  }
+
+  test("align assignments (primitive types)") {
+    val sql1 = "UPDATE primitive_table SET txt = 'new', i = 1"
+    parseAndAlignAssignments(sql1) match {
+      case Seq(
+          Assignment(i: AttributeReference, IntegerLiteral(1)),
+          Assignment(l: AttributeReference, lValue: AttributeReference),
+          Assignment(txt: AttributeReference, StringLiteral("new"))) =>
+
+        assert(i.name == "i")
+        assert(l.name == "l" && l == lValue)
+        assert(txt.name == "txt")
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+
+    val sql2 = "UPDATE primitive_table SET l = 10L"
+    parseAndAlignAssignments(sql2) match {
+      case Seq(
+          Assignment(i: AttributeReference, iValue: AttributeReference),
+          Assignment(l: AttributeReference, LongLiteral(10L)),
+          Assignment(txt: AttributeReference, txtValue: AttributeReference)) =>
+
+        assert(i.name == "i" && i == iValue)
+        assert(l.name == "l")
+        assert(txt.name == "txt" && txt == txtValue)
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+
+    val sql3 = "UPDATE primitive_table AS t SET t.txt = 'new', t.l = 10L, t.i = -1"
+    parseAndAlignAssignments(sql3) match {
+      case Seq(
+          Assignment(i: AttributeReference, IntegerLiteral(-1)),
+          Assignment(l: AttributeReference, LongLiteral(10L)),
+          Assignment(txt: AttributeReference, StringLiteral("new"))) =>
+
+        assert(i.name == "i")
+        assert(l.name == "l")
+        assert(txt.name == "txt")
+
+      case assignments =>
+        fail(s"Unexpected assignments: $assignments")
+    }
+  }
+
+  test("align assignments (structs)") {

Review Comment:
   Will make sure there is a test case for this.
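   
   Something along these lines, following the suite's existing pattern (assertions are illustrative):
   
   ```scala
   val sql = "UPDATE nested_struct_table SET s.n_i = 1"
   parseAndAlignAssignments(sql) match {
     case Seq(
         Assignment(i: AttributeReference, iValue: AttributeReference),
         Assignment(s: AttributeReference, sValue: CreateNamedStruct),
         Assignment(txt: AttributeReference, txtValue: AttributeReference)) =>
   
       assert(i.name == "i" && i == iValue)
       // the rebuilt struct should set n_i to 1 and carry n_s over unchanged
       assert(s.name == "s")
       assert(txt.name == "txt" && txt == txtValue)
   
     case assignments =>
       fail(s"Unexpected assignments: $assignments")
   }
   ```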





[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1131921529


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I see value in both depending on the use case. What about making it configurable? If we just switch to runtime checks everywhere, it will be a substantial behavior change. We can add a new SQL property and default to the existing behavior of throwing an exception during the analysis phase.
   
   By the way, I am not targeting 3.4 in this PR, so we will have time to build a proper runtime checking framework. I think that would be a substantial effort, as we need to cover inner fields. There is no logic for that at the moment, if I am not mistaken.
   
   I do think consistency would be important. UPDATE and INSERT should behave in the same way.
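   
   A rough sketch of what such a property could look like (name and values are hypothetical, not an actual Spark conf):
   
   ```scala
   val UPDATE_NULL_CHECK_MODE =
     buildConf("spark.sql.update.nullCheckMode") // hypothetical key
       .doc("Whether writes to non-nullable columns are rejected during analysis " +
         "or checked per row at runtime.")
       .stringConf
       .checkValues(Set("ANALYSIS", "RUNTIME"))
       .createWithDefault("ANALYSIS")
   ```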





[GitHub] [spark] dongjoon-hyun commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1131340809


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -3344,43 +3345,6 @@ class Analyzer(override val catalogManager: CatalogManager) extends RuleExecutor
         } else {
           v2Write
         }
-
-      case u: UpdateTable if !u.skipSchemaResolution && u.resolved =>

Review Comment:
   I'm also +1 for @cloud-fan's direction.





[GitHub] [spark] cloud-fan commented on pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #40308:
URL: https://github.com/apache/spark/pull/40308#issuecomment-1512630753

   thanks, merging to master!




[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1163197480


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   We could probably build an `ExpressionSet` of the update keys per top-level attribute and check that the intersection across all the `ExpressionSet`s is empty. Let me know if that's similar to what you thought.
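
   For illustration, a minimal sketch of the duplicate-key half of this idea; `hasDuplicateKeys` is a hypothetical helper, and it assumes the assignment keys are already resolved:

   ```scala
   import org.apache.spark.sql.catalyst.expressions.ExpressionSet
   import org.apache.spark.sql.catalyst.plans.logical.Assignment

   // `ExpressionSet` deduplicates semantically equal expressions, so a set
   // smaller than the input means at least two assignments target the same key.
   def hasDuplicateKeys(assignments: Seq[Assignment]): Boolean = {
     val keys = assignments.map(_.key)
     ExpressionSet(keys).size < keys.size
   }
   ```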



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1163122671


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   Using `ExpressionSet` to detect duplicate assignments such as `a.b.c` and `a.b.c` would be easy. What about cases like `a.b` and `a.b.c`, where we assign a value to a struct and to one of its fields at the same time? Are you thinking of recursively adding all subparts of each column key to the `ExpressionSet`? For instance, for `a.b.c.d` we would need to add `a.b`, `a.b.c`, and `a.b.c.d` to the `ExpressionSet`?
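
   For illustration, a name-based sketch of the prefix variant of this idea; `refsConflict` is a hypothetical helper, and it assumes keys have already been normalized into name paths such as `Seq("a", "b", "c")` with case sensitivity handled upstream:

   ```scala
   // Two name paths conflict when one is a prefix of the other (`a.b` vs
   // `a.b.c`) or when they are identical; `a.b.c` vs `a.b.d` does not conflict.
   def refsConflict(refs: Seq[Seq[String]]): Boolean = {
     refs.combinations(2).exists { case Seq(x, y) =>
       val n = math.min(x.length, y.length)
       x.take(n) == y.take(n)
     }
   }

   // refsConflict(Seq(Seq("a", "b", "c"), Seq("a", "b", "d")))  // false
   // refsConflict(Seq(Seq("a", "b"), Seq("a", "b", "c")))       // true
   ```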



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1167348702


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,167 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, CreateNamedStruct, Expression, GetStructField, Literal}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This allows Spark to
+   * construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = attrs.map { attr =>
+      applyAssignments(
+        col = restoreActualType(attr),
+        colExpr = attr,
+        assignments,
+        addError = err => errors += err,
+        colPath = Seq(attr.name))
+    }
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyAssignments(
+      col: Attribute,
+      colExpr: Expression,
+      assignments: Seq[Assignment],
+      addError: String => Unit,
+      colPath: Seq[String]): Expression = {
+
+    val (exactAssignments, otherAssignments) = assignments.partition { assignment =>
+      assignment.key.semanticEquals(colExpr)
+    }
+
+    val fieldAssignments = otherAssignments.filter { assignment =>
+      assignment.key.exists(_.semanticEquals(colExpr))
+    }
+
+    if (exactAssignments.size > 1) {

Review Comment:
   @cloud-fan, I've changed the approach to avoid deconstructing references. However, I decided to keep validating assignments while recursing rather than doing it in a separate step, as we discussed. When I tried to implement that idea, it turned out to be pretty involved, with lots of edge cases. For instance, we can't have multiple assignments per top-level key, but keys can reference top-level fields many times: `a.b.c` and `a.b.d` are allowed, but not `a.b` and `a.b.c`. It felt easier to validate while recursing, just like we do in `TableOutputResolver`.
   
   Let me know what you think.
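
   For illustration, a condensed sketch of that validation rule, assuming the `exactAssignments`/`fieldAssignments` split from the diff above; the error messages are placeholders, not the actual ones raised by `QueryCompilationErrors`:

   ```scala
   import org.apache.spark.sql.catalyst.plans.logical.Assignment

   def validateSplit(
       exactAssignments: Seq[Assignment],
       fieldAssignments: Seq[Assignment],
       colPath: Seq[String],
       addError: String => Unit): Unit = {
     if (exactAssignments.size > 1) {
       // e.g. UPDATE t SET a.b = s1, a.b = s2
       addError(s"Multiple assignments for '${colPath.mkString(".")}'")
     } else if (exactAssignments.size == 1 && fieldAssignments.nonEmpty) {
       // e.g. UPDATE t SET a.b = s, a.b.c = 1 (a struct and its field at once)
       addError(s"Conflicting assignments for '${colPath.mkString(".")}'")
     }
     // no exact assignment plus several field assignments (a.b.c, a.b.d) is
     // fine: the recursion visits each nested field independently
   }
   ```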



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] aokolnychyi commented on a diff in pull request #40308: [SPARK-42151][SQL] Align UPDATE assignments with table attributes

Posted by "aokolnychyi (via GitHub)" <gi...@apache.org>.
aokolnychyi commented on code in PR #40308:
URL: https://github.com/apache/spark/pull/40308#discussion_r1167348939


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/AssignmentUtils.scala:
##########
@@ -0,0 +1,204 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.analysis
+
+import scala.collection.mutable
+
+import org.apache.spark.sql.AnalysisException
+import org.apache.spark.sql.catalyst.SQLConfHelper
+import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference, CreateNamedStruct, Expression, GetArrayItem, GetArrayStructFields, GetMapValue, GetStructField, Literal, NamedExpression}
+import org.apache.spark.sql.catalyst.plans.logical.Assignment
+import org.apache.spark.sql.catalyst.util.CharVarcharUtils
+import org.apache.spark.sql.connector.catalog.CatalogV2Implicits._
+import org.apache.spark.sql.errors.QueryCompilationErrors
+import org.apache.spark.sql.types.{DataType, StructType}
+
+object AssignmentUtils extends SQLConfHelper with CastSupport {
+
+  private case class ColumnUpdate(ref: Seq[String], expr: Expression)
+
+  /**
+   * Aligns assignments to match table columns.
+   * <p>
+   * This method processes and reorders given assignments so that each target column gets
+   * an expression it should be set to. If a column does not have a matching assignment,
+   * it will be set to its current value. For example, if one passes table attributes c1, c2
+   * and an assignment c2 = 1, this method will return c1 = c1, c2 = 1. This alignment is
+   * required to construct an updated version of a row.
+   * <p>
+   * This method also handles updates to nested columns. If there is an assignment to a particular
+   * nested field, this method will construct a new struct with one field updated preserving other
+   * fields that have not been modified. For example, if one passes table attributes c1, c2
+   * where c2 is a struct with fields n1 and n2 and an assignment c2.n2 = 1, this method will
+   * return c1 = c1, c2 = struct(c2.n1, 1).
+   *
+   * @param attrs table attributes
+   * @param assignments assignments to align
+   * @return aligned assignments that match table attributes
+   */
+  def alignAssignments(
+      attrs: Seq[Attribute],
+      assignments: Seq[Assignment]): Seq[Assignment] = {
+
+    val errors = new mutable.ArrayBuffer[String]()
+
+    val output = applyUpdates(
+      updates = assignments.map(toColumnUpdate),
+      cols = attrs.map(restoreActualType),
+      colExprs = attrs,
+      addError = err => errors += err)
+
+    if (errors.nonEmpty) {
+      throw QueryCompilationErrors.invalidRowLevelOperationAssignments(assignments, errors.toSeq)
+    }
+
+    attrs.zip(output).map { case (attr, expr) => Assignment(attr, expr) }
+  }
+
+  private def toColumnUpdate(assignment: Assignment): ColumnUpdate = {
+    ColumnUpdate(toRef(assignment.key), assignment.value)
+  }
+
+  private def restoreActualType(attr: Attribute): Attribute = {
+    attr.withDataType(CharVarcharUtils.getRawType(attr.metadata).getOrElse(attr.dataType))
+  }
+
+  private def applyUpdates(

Review Comment:
   Perfect, I forgot about `exists`. Thanks!
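
   For context, `exists` here is the Catalyst tree traversal used in the diff above; a sketch of how that line reads, with `assignment` and `colExpr` taken from the diff:

   ```scala
   // `exists` walks the whole expression tree, so this is true when the
   // assignment key contains the column expression anywhere, e.g. the key
   // a.b.c contains the attribute a.
   val isFieldAssignment = assignment.key.exists(_.semanticEquals(colExpr))
   ```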



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org