Posted to issues@spark.apache.org by "Nicholas Chammas (Jira)" <ji...@apache.org> on 2021/12/14 20:38:00 UTC

[jira] [Commented] (SPARK-24853) Support Column type for withColumn and withColumnRenamed apis

    [ https://issues.apache.org/jira/browse/SPARK-24853?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17459467#comment-17459467 ] 

Nicholas Chammas commented on SPARK-24853:
------------------------------------------

[~hyukjin.kwon] - Are you still opposed to this proposed improvement? If not, I'd like to work on it.

> Support Column type for withColumn and withColumnRenamed apis
> -------------------------------------------------------------
>
>                 Key: SPARK-24853
>                 URL: https://issues.apache.org/jira/browse/SPARK-24853
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.2, 3.2.0
>            Reporter: nirav patel
>            Priority: Minor
>
> Can we add overloaded versions of withColumn and withColumnRenamed that accept a Column type instead of a String? That way I could specify a fully qualified name when there are duplicate column names, e.g. if I have two columns with the same name as a result of a join and want to rename one of them, this new API would let me do it.
>  
> This would be similar to Drop api which supports both String and Column type.
>  
> def withColumn(colName: Column, col: Column): DataFrame
> Returns a new Dataset by adding a column or replacing the existing column that has the same name.
>  
> def withColumnRenamed(existingName: Column, newName: Column): DataFrame
> Returns a new Dataset with a column renamed.
>  
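For illustration, here is a minimal sketch of the use case described above. It assumes the names `left`, `right`, and `right_id` for illustration only, and the `withColumnRenamed(Column, String)` overload shown in the last comment is the *proposed* API, not one that exists in Spark today:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()

// A self-join producing two columns that are both named "id".
val left   = spark.range(3).toDF("id")
val right  = spark.range(3).toDF("id")
val joined = left.join(right, left("id") === right("id"))

// joined.withColumnRenamed("id", "right_id") is ambiguous: the String "id"
// cannot say which of the two "id" columns is meant. A Column reference can:
// val renamed = joined.withColumnRenamed(right("id"), "right_id")  // proposed overload, hypothetical
```

This mirrors how `drop` already resolves the ambiguity, since it accepts both a String name and a Column reference.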



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org