Posted to issues@spark.apache.org by "Albert Meltzer (JIRA)" <ji...@apache.org> on 2017/10/16 06:33:00 UTC
[jira] [Created] (SPARK-22283) withColumn should replace multiple instances with a single one
Albert Meltzer created SPARK-22283:
--------------------------------------
Summary: withColumn should replace multiple instances with a single one
Key: SPARK-22283
URL: https://issues.apache.org/jira/browse/SPARK-22283
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 2.2.0
Reporter: Albert Meltzer
Currently, {{withColumn}} claims to do the following: _"adding a column or replacing the existing column that has the same name."_
Unfortunately, if multiple existing columns share the same name (a normal occurrence after a join), each of them is replaced and retained with the same new value, and any subsequent unqualified reference to that name fails with an ambiguous-column error.
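For example (a minimal sketch, assuming a {{SparkSession}} named {{spark}}; the table and column names are illustrative):
{noformat}
import org.apache.spark.sql.functions.lit
import spark.implicits._

val left  = Seq((1, "a")).toDF("id", "value")
val right = Seq((1, "b")).toDF("id", "value")

// After the join, the result carries two columns named "value".
val joined = left.join(right, left("id") === right("id"))

// withColumn replaces BOTH "value" columns, each with the same new value...
val updated = joined.withColumn("value", lit("c"))

// ...so any unqualified reference to "value" is now ambiguous:
updated.select("value") // AnalysisException: Reference 'value' is ambiguous
{noformat}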
The current implementation of {{withColumn}} contains this:
{noformat}
def withColumn(colName: String, col: Column): DataFrame = {
  val resolver = sparkSession.sessionState.analyzer.resolver
  val output = queryExecution.analyzed.output
  val shouldReplace = output.exists(f => resolver(f.name, colName))
  if (shouldReplace) {
    val columns = output.map { field =>
      if (resolver(field.name, colName)) {
        col.as(colName)
      } else {
        Column(field)
      }
    }
    select(columns : _*)
  } else {
    select(Column("*"), col.as(colName))
  }
}
{noformat}
Instead, I suggest something like the following, which replaces all matching fields with a single instance of the new column:
{noformat}
def withColumn(colName: String, col: Column): DataFrame = {
  val resolver = sparkSession.sessionState.analyzer.resolver
  val output = queryExecution.analyzed.output
  val existing = output.filterNot(f => resolver(f.name, colName)).map(new Column(_))
  select(existing :+ col.as(colName): _*)
}
{noformat}
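Until such a change lands, the same deduplication can be approximated from user code by going through the analyzed plan's output (a sketch only; the name {{withColumnDedup}} is hypothetical, {{queryExecution}} is a developer-facing field, and the case-insensitive match merely approximates the analyzer's default resolver):
{noformat}
import org.apache.spark.sql.{Column, DataFrame}

// Replace every column matching colName with a single new one.
def withColumnDedup(df: DataFrame, colName: String, col: Column): DataFrame = {
  val existing = df.queryExecution.analyzed.output
    .filterNot(_.name.equalsIgnoreCase(colName))
    .map(new Column(_))
  df.select(existing :+ col.as(colName): _*)
}
{noformat}
Applied to the join example above, this would yield a single "value" column (alongside the still-duplicated "id" columns), so an unqualified select of "value" no longer fails.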