Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/08/21 01:30:00 UTC

[jira] [Commented] (SPARK-28783) Support plus + operator as concatenation for StringType columns in pyspark

    [ https://issues.apache.org/jira/browse/SPARK-28783?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16911856#comment-16911856 ] 

Hyukjin Kwon commented on SPARK-28783:
--------------------------------------

I think it's a breaking change if we do. The current behaviour also seems to make sense. Let's avoid doing this for now.
FYI, there are many inconsistencies compared to pandas, and these are being handled in third-party libraries, for instance, https://github.com/databricks/koalas

> Support plus + operator as concatenation for StringType columns in pyspark
> --------------------------------------------------------------------------
>
>                 Key: SPARK-28783
>                 URL: https://issues.apache.org/jira/browse/SPARK-28783
>             Project: Spark
>          Issue Type: Wish
>          Components: PySpark
>    Affects Versions: 2.4.3
>            Reporter: Louis Yang
>            Priority: Trivial
>
> Right now, if one tries to use the plus operator on two columns in pyspark, like `F.col('a') + F.col('b')`, pyspark always treats it as arithmetic addition, like `2 + 3 = 5`. If the columns are `StringType`, pyspark tries to convert them to numbers and then does the addition. However, in the Python and pandas world, the plus operator for string types means concatenation, not arithmetic addition. So it would be great if pyspark also supported the same logic, so that `F.col('str1') + F.col('str2') + F.col('str3')` yields `F.concat(F.col('str1'), F.col('str2'), F.col('str3'))`.



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org