Posted to user@spark.apache.org by kant kodali <ka...@gmail.com> on 2017/06/03 22:00:10 UTC

Is there a way to do conditional group by in spark 2.1.1?

Hi All,

Is there a way to do a conditional group by in Spark 2.1.1? In other words, I
want to do something like this:

if (field1 == "foo") {
  df.groupBy(field1)
} else if (field2 == "bar") {
  df.groupBy(field2)
}

Thanks

Re: Is there a way to do conditional group by in spark 2.1.1?

Posted by vaquar khan <va...@gmail.com>.
Avoid groupBy and use reduceByKey.

Regards,
Vaquar khan


Re: Is there a way to do conditional group by in spark 2.1.1?

Posted by Guy Cohen <gu...@gettaxi.com>.
Try this one:

import org.apache.spark.sql.functions.{col, expr, when}

df.groupBy(
  when(expr("field1 = 'foo'"), col("field1"))
    .when(expr("field2 = 'bar'"), col("field2")))



Re: Is there a way to do conditional group by in spark 2.1.1?

Posted by Bryan Jeffrey <br...@gmail.com>.
You should be able to project a new column that is your group column. Then you can group on the projected column. 

Get Outlook for Android

Re: Is there a way to do conditional group by in spark 2.1.1?

Posted by upendra 1991 <up...@yahoo.com.INVALID>.
Use a function

Sent from Yahoo Mail on Android 