Posted to user@spark.apache.org by centerqi hu <ce...@gmail.com> on 2014/09/26 04:32:12 UTC

flume spark streaming receiver host random

Hi all
My code is as follows:

/usr/local/webserver/sparkhive/bin/spark-submit \
  --class org.apache.spark.examples.streaming.FlumeEventCount \
  --master yarn \
  --deploy-mode cluster \
  --queue online \
  --num-executors 5 \
  --driver-memory 6g \
  --executor-memory 20g \
  --executor-cores 5 \
  target/scala-2.10/simple-project_2.10-1.0.jar \
  10.1.15.115 60000

However, the receiver does not run on 10.1.15.115; instead Spark places
it on a randomly chosen slave host.

How to solve this problem?

Thanks


-- 
centerqi@gmail.com|齐忠

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: flume spark streaming receiver host random

Posted by Sean Owen <so...@cloudera.com>.
I don't think you control which host the receiver runs on, right? That is
so that Spark can handle the failure of that node and reassign the receiver.
On Sep 27, 2014 2:43 AM, "centerqi hu" <ce...@gmail.com> wrote:

> the receiver is not running on the machine I expect
>
>
>
> 2014-09-26 14:09 GMT+08:00 Sean Owen <so...@cloudera.com>:
> > I think you may be missing a key word here. Are you saying that the
> machine
> > has multiple interfaces and it is not using the one you expect or the
> > receiver is not running on the machine you expect?
>
> --
> centerqi@gmail.com|齐忠
>
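For reference, the workaround commonly suggested in this situation was the pull-based Flume integration added in Spark 1.1: Flume writes to a Spark sink on a known host, and the Spark receiver connects out to that sink, so it no longer matters which executor hosts the receiver. A minimal sketch, assuming Spark 1.1+ with the spark-streaming-flume artifact on the classpath (the app name and batch interval here are illustrative, and the host/port are taken from the original question):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object FlumePollingEventCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("FlumePollingEventCount")
    val ssc = new StreamingContext(conf, Seconds(2))

    // Pull events from the Flume Spark sink listening at the given address.
    // Because the receiver dials out to the sink instead of binding a port
    // on a particular host, its placement on the cluster does not matter.
    val stream = FlumeUtils.createPollingStream(ssc, "10.1.15.115", 60000)

    stream.count().map(cnt => "Received " + cnt + " flume events.").print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

This requires configuring the Flume agent with the org.apache.spark.streaming.flume.sink.SparkSink sink type on the chosen host, rather than a plain Avro sink.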

Re: flume spark streaming receiver host random

Posted by centerqi hu <ce...@gmail.com>.
the receiver is not running on the machine I expect



2014-09-26 14:09 GMT+08:00 Sean Owen <so...@cloudera.com>:
> I think you may be missing a key word here. Are you saying that the machine
> has multiple interfaces and it is not using the one you expect or the
> receiver is not running on the machine you expect?

-- 
centerqi@gmail.com|齐忠


Re: flume spark streaming receiver host random

Posted by Sean Owen <so...@cloudera.com>.
I think you may be missing a key word here. Are you saying that the machine
has multiple interfaces and it is not using the one you expect or the
receiver is not running on the machine you expect?
On Sep 26, 2014 3:33 AM, "centerqi hu" <ce...@gmail.com> wrote:

> Hi all
> My code is as follows:
>
> /usr/local/webserver/sparkhive/bin/spark-submit \
>   --class org.apache.spark.examples.streaming.FlumeEventCount \
>   --master yarn \
>   --deploy-mode cluster \
>   --queue online \
>   --num-executors 5 \
>   --driver-memory 6g \
>   --executor-memory 20g \
>   --executor-cores 5 \
>   target/scala-2.10/simple-project_2.10-1.0.jar \
>   10.1.15.115 60000
>
> However, the receiver does not run on 10.1.15.115; instead Spark places
> it on a randomly chosen slave host.
>
> How to solve this problem?
>
> Thanks
>
>
> --
> centerqi@gmail.com|齐忠
>
>
>
>