Posted to issues@spark.apache.org by "Olivier Girardot (JIRA)" <ji...@apache.org> on 2015/06/02 11:27:17 UTC

[jira] [Created] (SPARK-8038) PySpark SQL when function is broken on Column

Olivier Girardot created SPARK-8038:
---------------------------------------

             Summary: PySpark SQL when function is broken on Column
                 Key: SPARK-8038
                 URL: https://issues.apache.org/jira/browse/SPARK-8038
             Project: Spark
          Issue Type: Bug
    Affects Versions: 1.4.0
         Environment: Spark 1.4.0 RC3
            Reporter: Olivier Girardot
            Priority: Blocker





{code}
In [1]: df = sqlCtx.createDataFrame([(1, "1"), (2, "2"), (1, "2"), (1, "2")], ["key", "value"])


In [2]: from pyspark.sql import functions as F

In [8]: df.select(df.key, F.when(df.key > 1, 0).when(df.key == 0, 2).otherwise(1)).show()

+---+---------------------------------+
|key|CASE WHEN (key = 0) THEN 2 ELSE 1|
+---+---------------------------------+
|  1|                                1|
|  2|                                1|
|  1|                                1|
|  1|                                1|
+---+---------------------------------+
{code}

Whereas in Scala I get the expected expression and behaviour:

{code}
scala> val df = sqlContext.createDataFrame(List((1, "1"), (2, "2"), (1, "2"), (1, "2"))).toDF("key", "value")

scala> import org.apache.spark.sql.functions._

scala> df.select(df("key"), when(df("key") > 1, 0).when(df("key") === 2, 2).otherwise(1)).show()

+---+-------------------------------------------------------+
|key|CASE WHEN (key > 1) THEN 0 WHEN (key = 2) THEN 2 ELSE 1|
+---+-------------------------------------------------------+
|  1|                                                      1|
|  2|                                                      0|
|  1|                                                      1|
|  1|                                                      1|
+---+-------------------------------------------------------+
{code}
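
In the meantime, a workaround that seems to give the expected result (a sketch only, sticking to the module-level F.when and otherwise from the public API) is to nest the second test inside otherwise() instead of chaining .when() on the returned Column:

{code}
from pyspark.sql import functions as F

# Workaround sketch: build each branch with the module-level F.when() and
# nest the second test inside otherwise(), so the broken Column.when()
# chaining is never used. Semantically equivalent to the chained version.
expr = F.when(df.key > 1, 0).otherwise(F.when(df.key == 0, 2).otherwise(1))
df.select(df.key, expr).show()
{code}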

This comes from the definition of **when** on the Column class in the "column.py" file; a fix is on its way.
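
For reference, here is a minimal sketch (not the actual patch) of what a chaining when on the Python Column class could look like, assuming the underlying JVM Column exposes when(condition, value) through py4j:

{code}
# Sketch only, not the submitted fix. The idea is to chain on the existing
# JVM column expression via self._jc.when(...) instead of building a brand
# new CASE WHEN expression from scratch, so earlier branches survive.
def when(self, condition, value):
    if not isinstance(condition, Column):
        raise TypeError("condition should be a Column")
    v = value._jc if isinstance(value, Column) else value
    jc = self._jc.when(condition._jc, v)  # keeps previously added WHEN branches
    return Column(jc)
{code}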


