Posted to user@spark.apache.org by Deep Pradhan <pr...@gmail.com> on 2014/11/14 11:09:52 UTC

EmptyRDD

How to create an empty RDD in Spark?

Thank You

Re: EmptyRDD

Posted by Ted Yu <yu...@gmail.com>.
See http://spark.apache.org/docs/0.8.1/api/core/org/apache/spark/rdd/EmptyRDD.html


On Nov 14, 2014, at 2:09 AM, Deep Pradhan <pr...@gmail.com> wrote:

> How to create an empty RDD in Spark?
> 
> Thank You

Re: EmptyRDD

Posted by Gerard Maas <ge...@gmail.com>.
It looks like a Scala issue. It seems the implicit conversion to
ArrayOps does not apply if the type is Array[Nothing].

Try giving a type to the empty RDD:

val emptyRdd: RDD[Any] = sc.emptyRDD
emptyRdd.collect.foreach(println) // prints nothing: the RDD has no elements


-kr, Gerard.



On Fri, Nov 14, 2014 at 11:35 AM, Deep Pradhan <pr...@gmail.com>
wrote:

> Thank You Gerard.
> I was trying val emptyRdd = sc.EmptyRDD.
>
> Yes, it works, but I am not able to do *emptyRdd.collect.foreach(println)*
>
> Thank You
>
> On Fri, Nov 14, 2014 at 3:58 PM, Gerard Maas <ge...@gmail.com>
> wrote:
>
>> If I remember correctly, EmptyRDD is private[spark].
>>
>> You can create an empty RDD using the Spark context:
>>
>> val emptyRdd = sc.emptyRDD
>>
>> -kr, Gerard.
>>
>>
>>
>> On Fri, Nov 14, 2014 at 11:22 AM, Deep Pradhan <pradhandeep1991@gmail.com
>> > wrote:
>>
>>> To get an empty RDD, I did this:
>>>
>>> I have an RDD with one element. I created another RDD using filter so
>>> that the second RDD does not contain anything. I achieved what I wanted,
>>> but I want to know whether there is a more efficient way to achieve this.
>>> This is a very crude way of creating an empty RDD. Is there another way
>>> to do this?
>>>
>>> Thank you
>>>
>>> On Fri, Nov 14, 2014 at 3:39 PM, Deep Pradhan <pradhandeep1991@gmail.com
>>> > wrote:
>>>
>>>> How to create an empty RDD in Spark?
>>>>
>>>> Thank You
>>>>
>>>
>>>
>>
>
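[Editor's note: the fix Gerard describes, giving the empty RDD an explicit element type so that collect/foreach compile, can be sketched roughly as below. The object name EmptyRddDemo and the local[*] master are illustrative, not from the thread; this assumes a local Spark installation.]

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

object EmptyRddDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("EmptyRddDemo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Without an annotation, sc.emptyRDD infers RDD[Nothing], and
    // Array[Nothing] gets no usable ArrayOps conversion, so collect-based
    // operations fail to compile. Giving the RDD an explicit type fixes it:
    val emptyRdd: RDD[Any] = sc.emptyRDD
    emptyRdd.collect().foreach(println) // prints nothing: the RDD is empty

    println(emptyRdd.count()) // 0

    sc.stop()
  }
}
```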

Re: EmptyRDD

Posted by Gerard Maas <ge...@gmail.com>.
If I remember correctly, EmptyRDD is private[spark].

You can create an empty RDD using the Spark context:

val emptyRdd = sc.emptyRDD

-kr, Gerard.



On Fri, Nov 14, 2014 at 11:22 AM, Deep Pradhan <pr...@gmail.com>
wrote:

> To get an empty RDD, I did this:
>
> I have an RDD with one element. I created another RDD using filter so that
> the second RDD does not contain anything. I achieved what I wanted, but I
> want to know whether there is a more efficient way to achieve this. This is
> a very crude way of creating an empty RDD. Is there another way to do this?
>
> Thank you
>
> On Fri, Nov 14, 2014 at 3:39 PM, Deep Pradhan <pr...@gmail.com>
> wrote:
>
>> How to create an empty RDD in Spark?
>>
>> Thank You
>>
>
>

Re: EmptyRDD

Posted by Deep Pradhan <pr...@gmail.com>.
To get an empty RDD, I did this:

I have an RDD with one element. I created another RDD using filter so that
the second RDD does not contain anything. I achieved what I wanted, but I
want to know whether there is a more efficient way to achieve this. This is
a very crude way of creating an empty RDD. Is there another way to do this?

Thank you

On Fri, Nov 14, 2014 at 3:39 PM, Deep Pradhan <pr...@gmail.com>
wrote:

> How to create an empty RDD in Spark?
>
> Thank You
>
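
[Editor's note: for completeness, the filter-based workaround Deep describes, next to the direct sc.emptyRDD approach suggested later in the thread. A rough sketch with illustrative names, assuming a local Spark installation.]

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

object EmptyRddWays {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("EmptyRddWays").setMaster("local[*]"))

    // The crude way described above: start from a one-element RDD and
    // filter every element out.
    val oneElement = sc.parallelize(Seq(1))
    val emptyByFilter = oneElement.filter(_ => false)

    // The direct way: ask the context for an empty RDD of a given type.
    val emptyDirect: RDD[Int] = sc.emptyRDD[Int]

    println(emptyByFilter.count()) // 0
    println(emptyDirect.count())   // 0

    sc.stop()
  }
}
```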