Posted to user@spark.apache.org by Ravi Hemnani <ra...@gmail.com> on 2014/03/10 11:26:18 UTC

Using flume to create stream for spark streaming.

Hey,

I am using the following flume flow,

Flume agent 1: RabbitMQ -> source, file -> channel, avro -> sink, sending
data to a slave node of the Spark cluster.
Flume agent 2, on the slave node of the Spark cluster: avro -> source,
file -> channel. For the sink I have tried avro, hdfs, and file_roll, but I
am not able to read a DStream from any of them. With the avro sink, I point
it at the same slave node on another port, and I tell the Spark Streaming
program to listen on that slave node and the port defined for the sink in
the slave node's conf. Spark Streaming gives me no results.
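For reference, here is a sketch of what agent 2's sink section might look like if the avro sink is meant to hand events to a Spark Streaming receiver. The agent name a2, channel name c1, sink name k1, hostname, and port are all placeholders, not values from my setup:

```
# Hypothetical Flume conf fragment for agent 2 (a2, c1, k1 are placeholders).
# The avro sink must point at the host:port where the Spark Streaming
# receiver will be listening -- Spark acts as the Avro server here.
a2.sinks = k1
a2.sinks.k1.type = avro
a2.sinks.k1.channel = c1
a2.sinks.k1.hostname = slave-node-hostname
a2.sinks.k1.port = 41414
```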

I am running the program with java -jar <jar> on the master of the cluster.

What sink type should be used on the slave node?
I have been stuck on this for two weeks now and am confused about how to
approach it.
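On the Spark side, here is a minimal sketch of what I understand the receiving program should look like, using FlumeUtils from the spark-streaming-flume artifact. The hostname and port are placeholders and must match whatever the Flume avro sink is configured to send to; this is an assumption on my part, not my working setup:

```scala
// Sketch only: assumes the spark-streaming-flume artifact is on the
// classpath, and that "slave-node-hostname:41414" matches the Flume
// avro sink's hostname/port. FlumeUtils starts an Avro server inside
// the receiver, so the Flume sink connects TO Spark, not the other way.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object FlumeStreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("FlumeStreamSketch")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Listen on the host:port that the Flume avro sink points at.
    val stream = FlumeUtils.createStream(ssc, "slave-node-hostname", 41414)

    // Each record is a SparkFlumeEvent; the payload is in event.getBody.
    stream.map(e => new String(e.event.getBody.array())).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

If this is right, then the receiver must be scheduled on a worker that can actually bind that hostname and port, which may be part of my problem.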

Any help?




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Using-flume-to-create-stream-for-spark-streaming-tp2457.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.