Posted to user@spark.apache.org by Adrian Mocanu <am...@verticalscope.com> on 2015/03/25 20:49:32 UTC

writing DStream RDDs to the same file

Hi
Is there a way to write all RDDs in a DStream to the same file?
I tried this and got an empty file. I think it's because the file is not closed properly, i.e. ESMinibatchFunctions.writer.close() executes before the stream is even started.

Here's my code:

  import java.io.PrintWriter

  myStream.foreachRDD(rdd => {
    rdd.foreach(x => {
      ESMinibatchFunctions.writer.append(rdd.collect()(0).toString() + " the data ")
    })
    // localRdd = localRdd.union(rdd)
    // localArray = localArray ++ rdd.collect()
  })

  ESMinibatchFunctions.writer.close()

  object ESMinibatchFunctions {
    val writer = new PrintWriter("c:/delme/exxx.txt")
  }
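
For context, the likely problem is one of ordering: foreachRDD only registers work that runs once the StreamingContext is started, while straight-line setup code, including the close() call above, runs immediately, before any batch is processed. A minimal sketch of a corrected ordering, assuming a StreamingContext named ssc, might look like this:

  myStream.foreachRDD { rdd =>
    // collect() brings the batch back to the driver, where the writer lives;
    // code inside rdd.foreach runs on executors and cannot use a driver-side writer
    rdd.collect().foreach(x => ESMinibatchFunctions.writer.println(x.toString))
    ESMinibatchFunctions.writer.flush()
  }

  ssc.start()
  ssc.awaitTermination()  // blocks until the stream is stopped
  ESMinibatchFunctions.writer.close()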

Re: writing DStream RDDs to the same file

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Here's something similar which I used to do:

unionDStream.foreachRDD(rdd => {
  val events = rdd.count()
  println("Received Events : " + events)
  if (events > 0) {
    val fw = new FileWriter("events", true)
    fw.write(Calendar.getInstance().getTime + "," + events + "\n")
    fw.close()
  }
})

Sending from cellphone, not sure how the code snippet will look. :)
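
Adapting that open-append-close-per-batch pattern to the original question (writing every record of every batch to one file) might look like the sketch below; unionDStream and the "events" file name are carried over from the snippet above, and everything runs on the driver:

  import java.io.FileWriter

  unionDStream.foreachRDD { rdd =>
    val batch = rdd.collect()  // bring the batch to the driver
    if (batch.nonEmpty) {
      // true = append mode, so every batch lands in the same file
      val fw = new FileWriter("events", true)
      batch.foreach(x => fw.write(x.toString + "\n"))
      fw.close()
    }
  }

Opening and closing the writer inside foreachRDD sidesteps the writer-lifetime problem entirely, at the cost of one open/close per batch.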