Posted to user@spark.apache.org by 杨强 <ya...@ict.ac.cn> on 2013/11/19 02:59:15 UTC

Cannot get the expected output when running the BroadcastTest example program.

Hi, all.
I'm using spark-0.8.0-incubating.

I tried the example BroadcastTest in local mode.
./run-example org.apache.spark.examples.BroadcastTest local 1 2>/dev/null 
This works fine and gives the following result:
Iteration 0
===========
1000000
1000000
1000000
1000000
1000000
1000000
1000000
1000000
1000000
1000000
Iteration 1
===========
1000000
1000000
1000000
1000000
1000000
1000000
1000000
1000000
1000000
1000000

But when I run this program on the cluster (standalone mode) with:
./run-example org.apache.spark.examples.BroadcastTest spark://172.16.1.39:7077 5 2>/dev/null 
The output is as follows:
Iteration 0
===========
Iteration 1
===========

I also tried the command
./run-example org.apache.spark.examples.BroadcastTest spark://172.16.1.39:7077 5
but I did not find any error message.

I hope someone can give me some advice. Thank you.


The content of the file etc/spark-env.sh is as follows:

export SCALA_HOME=/usr/lib/scala-2.9.3
export SPARK_MASTER_IP=172.16.1.39
export SPARK_MASTER_WEBUI_PORT=8090
export SPARK_WORKER_WEBUI_PORT=8091
export SPARK_WORKER_MEMORY=2G
#export SPARK_CLASSPATH=.:/home/spark-0.7.3/core/target/spark-core-assembly-0.7.3.jar:$SPACK_CLASSPATH
export SPARK_CLASSPATH=.:/home/hadoop/spark-0.8.0-incubating/conf:/home/hadoop/spark-0.8.0-incubating/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop1.0.1.jar:/home/hadoop/hadoop-1.0.1/conf




    Sincerely

Yang, Qiang

[Advice works] Re: Cannot get the expected output when running the BroadcastTest example program.

Posted by 杨强 <ya...@ict.ac.cn>.
Thanks, Aaron.
Your advice really works.

Does this mean that the collect() method pulls all of the related data from the slave nodes back to the master node?
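
For reference, a minimal sketch of the difference (assuming an existing SparkContext named sc; the names here are illustrative, not taken from the example program):

  // foreach without collect(): the closure runs on the executors,
  // so these printlns go to each worker's stdout, not to the driver.
  val squares = sc.parallelize(1 to 10, 2).map(i => i * i)
  squares.foreach(i => println(i))

  // collect() pulls the results back to the driver as a local Array,
  // so these printlns appear in the driver's console.
  val local: Array[Int] = squares.collect()
  local.foreach(i => println(i))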




    Sincerely

Yang, Qiang

From: Aaron Davidson
Sent: Tuesday, November 19, 2013, 12:38 PM
To: user; yangqiang
Subject: Re: Cannot get the expected output when running the BroadcastTest example program.
Assuming your cluster is actually working (e.g., other examples like SparkPi work), then the problem is probably that println() doesn't actually write output back to the driver; instead, it may just be outputting locally to each slave. You can test this by replacing lines 43 through 45 with:


  sc.parallelize(1 to 10, slices).map {
    i => barr1.value.size
  }.collect().foreach(i => println(i))


which should gather the exact same data but ensure that the printlns actually occur on the driver.

Re: Cannot get the expected output when running the BroadcastTest example program.

Posted by Aaron Davidson <il...@gmail.com>.
Assuming your cluster is actually working (e.g., other examples like
SparkPi work), then the problem is probably that println() doesn't actually
write output back to the driver; instead, it may just be outputting locally
to each slave. You can test this by replacing lines 43 through 45 with:

  sc.parallelize(1 to 10, slices).map {
    i => barr1.value.size
  }.collect().foreach(i => println(i))

which should gather the exact same data but ensure that the printlns
actually occur on the driver.
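
For anyone following along, here is a sketch of what the modified iteration loop might look like after that change (based on the advice above; sc, arr1, slices, and the rest of the BroadcastTest setup are assumed, and this is not the exact file contents):

  // Sketch only: broadcast arr1, map each of the 10 elements to the
  // size of the broadcast array on the executors, then collect() the
  // results so the printlns run on the driver.
  for (i <- 0 until 2) {
    println("Iteration " + i)
    println("===========")
    val barr1 = sc.broadcast(arr1)
    sc.parallelize(1 to 10, slices).map {
      _ => barr1.value.size
    }.collect().foreach(size => println(size))
  }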
