Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/11/25 07:07:38 UTC

[GitHub] [spark] grundprinzip commented on a diff in pull request #38793: [SPARK-41256][CONNECT] Implement DataFrame.withColumn(s)

grundprinzip commented on code in PR #38793:
URL: https://github.com/apache/spark/pull/38793#discussion_r1032061331


##########
connector/connect/src/main/protobuf/spark/connect/relations.proto:
##########
@@ -457,3 +458,16 @@ message RenameColumnsByNameToNameMap {
   // duplicated B are not allowed.
   map<string, string> rename_columns_map = 2;
 }
+
+// Adds columns or replaces existing columns that have the same names.
+message WithColumns {
+  // (Required) The input relation.
+  Relation input = 1;
+
+  // (Required)
+  //
+  // Given a column name, apply the corresponding expression to that column. If the
+  // column name exists in the input relation, the column is replaced; if it does
+  // not exist, the column is added.
+  map<string, Expression> cols_map = 2;

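For context, a minimal sketch in plain PySpark of the client calls this message is meant to carry (data and column names are illustrative; `withColumns` takes a dict as of Spark 3.3):

```
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "name"])

# "id" already exists, so it is replaced; "name_upper" is new, so it is added.
df2 = (df
       .withColumn("id", F.col("id") + 1)
       .withColumn("name_upper", F.upper(F.col("name"))))

# withColumns takes a dict of name -> Column, which is why the ordering and
# duplicate-key semantics of the proto field above matter across clients.
df3 = df.withColumns({"id": F.col("id") + 1,
                      "name_upper": F.upper(F.col("name"))})
```
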
Review Comment:
   +1 on using a repeated tuple here. The behavior should be defined by the API, not by the language implementation. The only reason we have consistent behavior in Spark today is that Python converts to Scala before even calling the method, so there is effectively only one client implementation.
   
   My suggestion would be to change this to:
   
   ```
   repeated Expression.Alias col_map = 2;
   ```
   
   The Alias has a reference to an arbitrary expression and a name, which is exactly what we want.
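   
   For illustration, here is a minimal sketch of how the full message could look with that change (the field name follows the suggestion above; the numbering and comments are assumptions, not a final design):
   
   ```
   // Adds columns or replaces existing columns that have the same names.
   message WithColumns {
     // (Required) The input relation.
     Relation input = 1;
   
     // (Required) One Alias per output column: the aliased expression is the
     // column value and the alias name is the column name. If the name already
     // exists in the input relation the column is replaced, otherwise it is added.
     repeated Expression.Alias col_map = 2;
   }
   ```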
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

