Posted to user@spark.apache.org by anshu shukla <an...@gmail.com> on 2015/07/25 19:43:26 UTC
Parallelism of Custom receiver in spark
1 - How can I increase the level of *parallelism of a custom
RECEIVER in Spark Streaming*?
2 - Will ssc.receiverStream(/*anything*/) *delete the data
stored in Spark memory via the store() calls*?
--
Thanks & Regards,
Anshu Shukla
Re: Parallelism of Custom receiver in spark
Posted by Michal Čizmazia <mi...@gmail.com>.
#1 see
https://spark.apache.org/docs/latest/streaming-programming-guide.html#level-of-parallelism-in-data-receiving
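In short, that section of the guide suggests running several receiver instances and unioning their streams. A minimal sketch of that pattern, assuming a hypothetical `MyReceiver` subclass of `Receiver[String]` standing in for the poster's custom receiver:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("parallel-receivers")
val ssc  = new StreamingContext(conf, Seconds(1))

// Run N independent instances of the custom receiver in parallel,
// then union them into a single DStream for downstream processing.
val numReceivers = 3
val streams = (1 to numReceivers).map(_ => ssc.receiverStream(new MyReceiver()))
val unified = ssc.union(streams)

unified.print()
ssc.start()
ssc.awaitTermination()
```

Note that each receiver permanently occupies one executor core, so the application must be given more cores than receivers or no data will be processed.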
#2 By default, all input data and persisted RDDs generated by DStream
transformations are automatically cleared. Spark Streaming decides when to
clear the data based on the transformations that are used. See
https://spark.apache.org/docs/latest/streaming-programming-guide.html#memory-tuning
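If you do need received data to outlive what the transformations require (e.g. for interactive queries against the generated RDDs), the memory-tuning section points to `StreamingContext.remember` as the explicit override; a one-line sketch, assuming `ssc` is your StreamingContext:

```scala
import org.apache.spark.streaming.Minutes

// Keep generated RDDs around for at least 5 minutes instead of letting
// Spark Streaming clear them as soon as the transformations allow.
ssc.remember(Minutes(5))
```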
Hope this helps.
On 25 July 2015 at 13:43, anshu shukla <an...@gmail.com> wrote:
> 1 - How can I increase the level of *parallelism of a custom
> RECEIVER in Spark Streaming*?
>
> 2 - Will ssc.receiverStream(/*anything*/) *delete the data
> stored in Spark memory via the store() calls*?
>
> --
> Thanks & Regards,
> Anshu Shukla
>