Posted to issues@spark.apache.org by "Renat Bekbolatov (JIRA)" <ji...@apache.org> on 2016/06/26 06:36:33 UTC

[jira] [Issue Comment Deleted] (SPARK-16211) DataFrame filter is possibly buggy with an AND clause when one of the columns involved is of type String

     [ https://issues.apache.org/jira/browse/SPARK-16211?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Renat Bekbolatov updated SPARK-16211:
-------------------------------------
    Comment: was deleted

(was: I haven't tested this on a later version.)

> DataFrame filter is possibly buggy with an AND clause when one of the columns involved is of type String
> --------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16211
>                 URL: https://issues.apache.org/jira/browse/SPARK-16211
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell, SQL
>    Affects Versions: 1.5.0
>         Environment: CDH 5.5.0/YARN
>            Reporter: Renat Bekbolatov
>
> df was the result of several joins, with some upstream tables having had column names renamed; a self-contained sketch of this lineage follows the transcript below.
> scala> df.filter(col("ad_market_id") === 4 && col("event_date") === "2016-05-30").show
> +----------+------------+
> |event_date|ad_market_id|
> +----------+------------+
> +----------+------------+
> scala> df.filter("ad_market_id = 4 and event_date = '2016-05-30'").show
> +----------+------------+
> |event_date|ad_market_id|
> +----------+------------+
> +----------+------------+
> scala> df.filter("ad_market_id = 4").coalesce(20).filter("event_date = '2016-05-30'").show
> +----------+------------+
> |event_date|ad_market_id|
> +----------+------------+
> |2016-05-30|           4|
> +----------+------------+
> scala> sc.version
> res40: String = 1.5.0
> scala> df
> res41: org.apache.spark.sql.DataFrame = [event_date: string, ad_market_id: int]
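
The transcript above shows the same conjunctive filter returning an empty result in
the first two variants, while the third variant, which inserts coalesce() between
two single-column filters, returns the expected row. The coalesce() call places a
repartition step between the two Filter operators, which likely prevents the
optimizer from fusing them back into one AND predicate, so predicate combination is
the natural suspect. Below is a minimal, self-contained sketch of a reproduction
attempt against a Spark 1.5.x spark-shell (with its predefined sqlContext). The
input data, table shapes, and join are hypothetical stand-ins, since the original
upstream tables are not shown in the report, so this may or may not trigger the
empty-result behavior; only the three filter variants are taken verbatim from the
transcript.

    import org.apache.spark.sql.functions.col

    // Hypothetical stand-ins for the upstream tables described in the report.
    val events = sqlContext.createDataFrame(Seq(
      ("2016-05-30", 4),
      ("2016-05-31", 7)
    )).toDF("event_date", "market")

    val markets = sqlContext.createDataFrame(Seq(
      (4, "US"),
      (7, "EU")
    )).toDF("market", "region")

    // Join and rename to mimic the "column names renamed" lineage.
    val df = events.join(markets, "market")
      .withColumnRenamed("market", "ad_market_id")
      .select("event_date", "ad_market_id")

    // Variant 1: Column-based conjunctive filter (empty in the report).
    df.filter(col("ad_market_id") === 4 && col("event_date") === "2016-05-30").show()

    // Variant 2: SQL-string conjunctive filter (also empty in the report).
    df.filter("ad_market_id = 4 and event_date = '2016-05-30'").show()

    // Variant 3: split filters with coalesce() in between (returns the row in the report).
    df.filter("ad_market_id = 4").coalesce(20).filter("event_date = '2016-05-30'").show()

Even if the sketch does not reproduce the bug, comparing df.explain(true) output for
variants 1 and 3 should show whether the two filters are being combined into a single
predicate during optimization.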



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org