Posted to user@spark.apache.org by ravidspark <ra...@gmail.com> on 2018/05/10 00:10:59 UTC

Making a Spark Streaming application single-threaded

Hi All,

Is there any property that makes my Spark Streaming application
single-threaded?

I looked at the property spark.dynamicAllocation.maxExecutors=1, but as far
as I understand it only caps the job at one container (executor), not one
thread. In local mode the number of threads can be set in the master URL,
e.g. local[1]. How can I do the same in cluster mode?
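
For reference, here is a minimal sketch of what I believe the closest
cluster-mode equivalent looks like (assuming static allocation; the app
name and the 10-second batch interval are placeholders):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Cluster mode has no direct analogue of local[N]; the closest thing is
// to bound parallelism by capping executors and task slots per executor.
val conf = new SparkConf()
  .setAppName("single-slot-streaming")              // placeholder app name
  .set("spark.dynamicAllocation.enabled", "false")  // fixed, not elastic
  .set("spark.executor.instances", "1")             // one executor container
  .set("spark.executor.cores", "1")                 // one task slot in it

val ssc = new StreamingContext(conf, Seconds(10))   // placeholder interval

If I understand correctly, this gives one executor with one task slot, so
at most one task runs at a time, which is as close to "single-threaded" as
cluster mode gets.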

I am reading data from Kafka, and in my logs I see every Kafka message
being read 3 times. I want each message to be read exactly once. How can I
achieve this?
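
For context, the stream is created roughly like this (a minimal sketch of
the spark-streaming-kafka-0-10 direct API; the broker address, group id,
and topic name below are placeholders, not my actual values):

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

val ssc = new StreamingContext(
  new SparkConf().setAppName("kafka-once"), Seconds(10))  // placeholder name

// One consumer group: the direct stream maps each Kafka partition to one
// Spark partition per batch, so each record should be fetched once per
// group.id. Seeing a message three times usually points to several
// streams/apps sharing the topic, or offsets replayed after restarts.
val kafkaParams = Map[String, Object](
  "bootstrap.servers"  -> "broker:9092",      // placeholder broker
  "key.deserializer"   -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"           -> "my-group",         // placeholder group id
  "auto.offset.reset"  -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val stream = KafkaUtils.createDirectStream[String, String](
  ssc, PreferConsistent, Subscribe[String, String](Array("my-topic"), kafkaParams))

stream.map(_.value).print()   // placeholder action, just to see the records
ssc.start()
ssc.awaitTermination()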


Thanks in advance, 
Ravi


