Posted to issues@spark.apache.org by "L. C. Hsieh (Jira)" <ji...@apache.org> on 2019/11/04 06:58:00 UTC

[jira] [Commented] (SPARK-29740) Filter On DataFrame Will return Tuple of 2 dataframes

    [ https://issues.apache.org/jira/browse/SPARK-29740?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16966430#comment-16966430 ] 

L. C. Hsieh commented on SPARK-29740:
-------------------------------------

What does "it requires 2 filter ..." mean? To obtain two data frames, one satisfying the condition and one not satisfying it, don't we just need one filter call for each data frame?

e.g.,

val baseDF = ....
val matchedDF = baseDF.filter(...)     // the condition
val notMatchedDF = baseDF.filter(...)  // the negated condition
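
For instance, a minimal self-contained sketch (the "amount" column and the sample rows are made up for illustration; the ticket does not give a concrete dataset):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// hypothetical data, purely for illustration
val baseDF = Seq(("a", 1), ("b", 5), ("c", 10)).toDF("id", "amount")

val cond = col("amount") > 3             // the predicate to split on
val matchedDF = baseDF.filter(cond)      // rows with amount > 3
val notMatchedDF = baseDF.filter(!cond)  // the complement: amount <= 3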

Otherwise, is this all about an API that returns two data frames, given a filter condition? e.g.,

def twoFilter(condition: Column): (DataFrame, DataFrame) = {
  val matched = this.filter(condition)
  val notMatched = this.filter(!condition)  // Column supports unary ! for negation
  (matched, notMatched)
}
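
If the goal is such a helper without changing Spark itself, here is an illustrative sketch as an extension method (the class name and usage are assumptions for illustration, not existing Spark API):

import org.apache.spark.sql.{Column, DataFrame}

// hypothetical helper, not part of the Spark API
implicit class TwoFilterOps(df: DataFrame) {
  def twoFilter(condition: Column): (DataFrame, DataFrame) =
    (df.filter(condition), df.filter(!condition))
}

// usage, reusing the hypothetical baseDF and cond from above:
// val (matched, notMatched) = baseDF.twoFilter(cond)

Note that either way the two filters remain two separate scans of the source unless baseDF is cached, so a helper like this is a convenience, not a performance change.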

> Filter On DataFrame Will return Tuple of 2 dataframes
> -----------------------------------------------------
>
>                 Key: SPARK-29740
>                 URL: https://issues.apache.org/jira/browse/SPARK-29740
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.4.4
>            Reporter: sanket sahu
>            Priority: Minor
>   Original Estimate: 120h
>  Remaining Estimate: 120h
>
> The filter operation over a dataframe/dataset, given a boolean condition,
> would return a tuple of 2 dataframes instead of 1, i.e.
> (DataFrame_Matching_the_condition, DataFrame_NOT_Matching_the_condition).
> As of now, it requires 2 filter operations to get this result.
> But if this is implemented, this use case could be done in a single filter operation.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org