Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/10/22 13:14:27 UTC

[jira] [Updated] (SPARK-11257) Spark dataframe negate filter conditions

     [ https://issues.apache.org/jira/browse/SPARK-11257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-11257:
------------------------------
    Target Version/s:   (was: 1.5.1)
       Fix Version/s:     (was: 1.5.0)

[~lokeshdotp] Also, please do not set Target/Fix Version; see https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark

> Spark dataframe negate filter conditions
> ----------------------------------------
>
>                 Key: SPARK-11257
>                 URL: https://issues.apache.org/jira/browse/SPARK-11257
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>         Environment: Fedora 21 core i5
>            Reporter: Lokesh Kumar
>              Labels: bug
>
> I am trying to apply a negated filter condition to a DataFrame, as shown below.
> !(`Ship Mode` LIKE '%Truck%')
> This throws the exception below:
> Exception in thread "main" java.lang.RuntimeException: [1.3] failure: identifier expected
> (!(`Ship Mode` LIKE '%Truck%'))
>   ^
>     at scala.sys.package$.error(package.scala:27)
>     at org.apache.spark.sql.catalyst.SqlParser.parseExpression(SqlParser.scala:47)
>     at org.apache.spark.sql.DataFrame.filter(DataFrame.scala:748)
>     at Main.main(Main.java:73)
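> For reference, a minimal sketch of the kind of call that triggers this (the DataFrame variable df and the surrounding setup are assumptions, not taken from the report):
>     // Spark 1.5 Java API: the String overload of filter() is parsed by
>     // org.apache.spark.sql.catalyst.SqlParser (see the stack trace above),
>     // which rejects the '!' prefix operator.
>     DataFrame result = df.filter("!(`Ship Mode` LIKE '%Truck%')"); // throws RuntimeException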
> The same kind of negated filter condition works fine in MySQL, as shown below:
> mysql> select count(*) from audit_log where !(operation like '%Log%' or operation like '%Proj%');
> +----------+
> | count(*) |
> +----------+
> |      129 |
> +----------+
> 1 row in set (0.05 sec)
> Can anyone please let me know whether support for this syntax is planned for Spark DataFrames in a future release?
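> For what it's worth, '!' as a negation operator is a MySQL extension; standard SQL uses NOT, which is likely why Spark's parser rejects it. A minimal workaround sketch, assuming the Spark 1.5 Java API (df is a placeholder DataFrame, not from the report):
>     import static org.apache.spark.sql.functions.not;
>     // Option 1: use the NOT keyword in the SQL expression string
>     DataFrame byKeyword = df.filter("NOT (`Ship Mode` LIKE '%Truck%')");
>     // Option 2: build the predicate with the Column API and negate it
>     DataFrame byColumn = df.filter(not(df.col("Ship Mode").like("%Truck%")));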



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
