Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/02/20 06:56:00 UTC

[jira] [Commented] (SPARK-26930) Several test cases in ParquetFilterSuite are broken

    [ https://issues.apache.org/jira/browse/SPARK-26930?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16772690#comment-16772690 ] 

Hyukjin Kwon commented on SPARK-26930:
--------------------------------------

Sorry, but can you show some code? It's difficult for me to parse the description :). Also, technically it's not broken if it just doesn't test something. Please fix the JIRA title.

> Several test cases in ParquetFilterSuite are broken
> ---------------------------------------------------
>
>                 Key: SPARK-26930
>                 URL: https://issues.apache.org/jira/browse/SPARK-26930
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Nandor Kollar
>            Priority: Minor
>
> While investigating Parquet predicate pushdown test cases, I noticed that several tests don't verify what they were originally intended to. Most of the verification ends up in one of the overloaded checkFilterPredicate functions, which is supposed to test whether a given filter class was generated, via the call {{maybeFilter.exists(_.getClass === filterClass)}}. There are two problems. First, an assert is missing around this call, so its result is silently discarded. Second, the generated filters are more complicated than a single node: for example, equality is pushed down as an 'and' wrapping a not-null check together with an equality check for the given value. The {{exists}} call doesn't help with these compound filters, since it is invoked on an Option rather than a collection: it only inspects the top-level filter, never its children.
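The failure mode described above can be sketched in a few lines. This is a minimal illustration with hypothetical stand-in case classes (Eq, NotEq, And), not the real org.apache.parquet.filter2.predicate types, showing why {{Option.exists}} on the class of the top-level filter misses an equality check wrapped in an 'and':

```scala
// Hypothetical stand-ins for Parquet's predicate classes, for illustration only.
sealed trait FilterPredicate
case class Eq(column: String, value: Any) extends FilterPredicate
case class NotEq(column: String, value: Any) extends FilterPredicate
case class And(left: FilterPredicate, right: FilterPredicate) extends FilterPredicate

object FilterCheckDemo {
  // A check that also recurses into And nodes, unlike a plain getClass comparison.
  def containsClass(p: FilterPredicate, c: Class[_]): Boolean = p match {
    case And(l, r) => containsClass(l, c) || containsClass(r, c)
    case other     => other.getClass == c
  }

  def main(args: Array[String]): Unit = {
    // Equality pushed down as: not-null check AND equality check.
    val pushed: Option[FilterPredicate] =
      Some(And(NotEq("a", null), Eq("a", 1)))

    // Option.exists applies the predicate to the single wrapped value only,
    // and that value's class is And, not Eq - so this is false.
    println(pushed.exists(_.getClass == classOf[Eq]))     // false

    // A recursive check finds the Eq inside the And.
    println(pushed.exists(containsClass(_, classOf[Eq]))) // true
  }
}
```

Note also that even if the boolean above were correct, nothing fails without wrapping it in an assert.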



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org