Posted to issues@spark.apache.org by "Nicholas Chammas (Jira)" <ji...@apache.org> on 2021/11/02 14:39:00 UTC

[jira] [Updated] (SPARK-24853) Support Column type for withColumn and withColumnRenamed APIs

     [ https://issues.apache.org/jira/browse/SPARK-24853?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nicholas Chammas updated SPARK-24853:
-------------------------------------
    Priority: Minor  (was: Major)

> Support Column type for withColumn and withColumnRenamed APIs
> -------------------------------------------------------------
>
>                 Key: SPARK-24853
>                 URL: https://issues.apache.org/jira/browse/SPARK-24853
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.2
>            Reporter: nirav patel
>            Priority: Minor
>
> Can we add an overloaded version of withColumn or withColumnRenamed that accepts a Column instead of a String? That way I can specify a fully qualified name when there are duplicate column names, e.g. if I have two columns with the same name as the result of a join and I want to rename one of them, I could do it with this new API.
>  
> This would be similar to the drop API, which supports both String and Column arguments.
>  
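> For concreteness, here is a minimal sketch of the ambiguity and the current workaround, assuming a local SparkSession; the dataset and column names below are illustrative, not from the original report:
>  
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.functions.col
>  
> val spark = SparkSession.builder()
>   .appName("duplicate-column-sketch")
>   .master("local[*]")
>   .getOrCreate()
> import spark.implicits._
>  
> val left  = Seq((1, "a")).toDF("id", "name")
> val right = Seq((1, "b")).toDF("id", "name")
>  
> // Joining on "id" leaves two columns both literally named "name",
> // so the String-based withColumnRenamed("name", ...) cannot say
> // which of the two is meant.
> val joined = left.join(right, "id")
>  
> // Current workaround: disambiguate through the originating
> // DataFrame and re-select with aliases.
> val disambiguated = joined.select(
>   col("id"),
>   left("name").as("left_name"),
>   right("name").as("right_name")
> )
> disambiguated.show()
>  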
> def withColumn(colName: Column, col: Column): DataFrame
> Returns a new Dataset by adding a column or replacing the existing column that has the same name.
>  
> def withColumnRenamed(existingName: Column, newName: Column): DataFrame
> Returns a new Dataset with a column renamed.
>  
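> A hypothetical call site under the proposed overload (this method does not exist in Spark today; left and right are the DataFrames from the sketch above):
>  
> // Hypothetical: rename exactly the left-hand "name" by passing a
> // Column reference, per the proposed withColumnRenamed(Column, Column).
> val renamed = left.join(right, "id")
>   .withColumnRenamed(left("name"), col("left_name"))
>  
> This would remove the need for the re-select workaround shown earlier.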


