Posted to issues@spark.apache.org by "Lokesh Kumar (JIRA)" <ji...@apache.org> on 2015/10/22 13:19:27 UTC

[jira] [Commented] (SPARK-11257) Spark dataframe negate filter conditions

    [ https://issues.apache.org/jira/browse/SPARK-11257?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14968976#comment-14968976 ] 

Lokesh Kumar commented on SPARK-11257:
--------------------------------------

Sorry, I'm using JIRA for the first time.

> Spark dataframe negate filter conditions
> ----------------------------------------
>
>                 Key: SPARK-11257
>                 URL: https://issues.apache.org/jira/browse/SPARK-11257
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>         Environment: Fedora 21 core i5
>            Reporter: Lokesh Kumar
>              Labels: bug
>
> I am trying to apply a negated filter condition to a DataFrame, as shown below:
> !(`Ship Mode` LIKE '%Truck%')
> This throws the exception below:
> Exception in thread "main" java.lang.RuntimeException: [1.3] failure: identifier expected
> (!(`Ship Mode` LIKE '%Truck%'))
>   ^
>     at scala.sys.package$.error(package.scala:27)
>     at org.apache.spark.sql.catalyst.SqlParser.parseExpression(SqlParser.scala:47)
>     at org.apache.spark.sql.DataFrame.filter(DataFrame.scala:748)
>     at Main.main(Main.java:73)
> Whereas the same kind of negated filter condition works fine in MySQL, as shown below:
> mysql> select count(*) from audit_log where !(operation like '%Log%' or operation like '%Proj%');
> +----------+
> | count(*) |
> +----------+
> |      129 |
> +----------+
> 1 row in set (0.05 sec)
> Can anyone please let me know whether this is planned to be fixed in Spark DataFrames in a future release?
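
For anyone hitting the same parser error, here is a minimal workaround sketch, assuming the Spark 1.5 Java API and the `Ship Mode` column from the report: build the negated predicate with functions.not and the Column API instead of passing a "!(...)" expression string, so the condition never goes through SqlParser.

    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.functions;

    public class NegatedFilterExample {
        // Keeps only rows whose "Ship Mode" does NOT match '%Truck%',
        // expressing the negation through the Column API rather than a parsed string.
        public static DataFrame withoutTrucks(DataFrame df) {
            return df.filter(functions.not(df.col("Ship Mode").like("%Truck%")));
        }
    }

The string form with the NOT keyword, e.g. df.filter("NOT (`Ship Mode` LIKE '%Truck%')"), may also be accepted, since the error above only shows the parser rejecting the '!' syntax.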



