Posted to user@spark.apache.org by Pierre B <pi...@realimpactanalytics.com> on 2014/03/02 15:48:14 UTC

Spark 0.9.0 - local mode - sc.addJar problem (bug?)

Hi all!

In Spark 0.9.0, local mode, whenever I try to add jar(s), using either
SparkConf.setJars or SparkContext.addJar, in the shell or in a
standalone app, I observe a strange behaviour.

I investigated this because my standalone app works perfectly on my cluster
but is getting stuck in local mode.
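
For reference, the calls I'm talking about look roughly like this; the master,
app name and jar path are placeholders, not my actual setup:

  // Standalone app: declare the jar on the SparkConf
  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .setMaster("local[2]")
    .setAppName("addjar-test")                 // placeholder app name
    .setJars(Seq("/path/to/my-app.jar"))       // placeholder path
  val sc = new SparkContext(conf)

  // Shell (or any time after the context exists)
  sc.addJar("/path/to/my-app.jar")             // placeholder path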

So, as can be seen in the following screenshot, the jar file is supposedly
made available at the given http address:
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n2218/Screen_Shot_2014-03-02_at_15.30.51.png> 


However, when I try to fetch the file over HTTP (in a browser or using wget),
the download always gets stuck after a while (usually around 66,724 bytes,
as can be seen in the next screenshot):
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n2218/Screen_Shot_2014-03-02_at_15.30.39.png> 
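
For reference, the wget attempt in those screenshots is essentially the
following; the port and jar name are placeholders (the stall itself is what I
actually observe):

  $ wget http://10.0.1.7:54321/jars/my-app.jar
  (the transfer stalls after roughly 66 KB instead of completing)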

This happens for all jars I've tried, except those smaller than about 60 KB.

Could this be a bug or just a problem on my machine? (MacOS Mavericks)

Cheers

Pierre



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-0-9-0-local-mode-sc-addJar-problem-bug-tp2218.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Spark 0.9.0 - local mode - sc.addJar problem (bug?)

Posted by Pierre Borckmans <pi...@realimpactanalytics.com>.
Hi Nan,

It must be a local network config problem on my machine.
When I replace the 10.0.1.7 IP with localhost, it works perfectly…
Thanks for trying, though…
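
(As an aside, and I haven't verified this in 0.9.0: Spark derives the address
it advertises for the jar file server from the machine's local IP, and it
honours the SPARK_LOCAL_IP environment variable, so forcing it to the loopback
address before starting the shell might work around this. The jar path below
is a placeholder.)

  $ SPARK_LOCAL_IP=127.0.0.1 ./bin/spark-shell
  scala> sc.addJar("/path/to/my-app.jar")
  (the "Added JAR ... at http://..." INFO line should then show 127.0.0.1
  instead of the LAN IP)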

Pierre

On 02 Mar 2014, at 16:02, Nan Zhu <zh...@gmail.com> wrote:

> Cannot reproduce it… even when I add the spark-assembly jar
> 
> scala> sc.addJar("/Users/nanzhu/code/spark-0.9.0-incubating/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar")
> 14/03/02 09:59:47 INFO SparkContext: Added JAR /Users/nanzhu/code/spark-0.9.0-incubating/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar at http://192.168.2.17:55364/jars/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar with timestamp 1393772387586
> 
> 
> --2014-03-02 09:59:54--  http://192.168.2.17:55364/jars/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar
> Connecting to 192.168.2.17:55364... connected.
> HTTP request sent, awaiting response... 200 OK
> Length: 87878749 (84M) [application/java-archive]
> Saving to: ‘spark-assembly-0.9.0-incubating-hadoop1.0.4.jar.1’
> 
> 100%[========================================================================>] 87,878,749  66.7MB/s   in 1.3s   
> 
> 2014-03-02 09:59:56 (66.7 MB/s) - ‘spark-assembly-0.9.0-incubating-hadoop1.0.4.jar.1’ saved [87878749/87878749]
> 
> 
> Best,
> 
> -- 
> Nan Zhu
> 


Re: Spark 0.9.0 - local mode - sc.addJar problem (bug?)

Posted by Nan Zhu <zh...@gmail.com>.
Cannot reproduce it… even when I add the spark-assembly jar

scala> sc.addJar("/Users/nanzhu/code/spark-0.9.0-incubating/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar")  
14/03/02 09:59:47 INFO SparkContext: Added JAR /Users/nanzhu/code/spark-0.9.0-incubating/assembly/target/scala-2.10/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar at http://192.168.2.17:55364/jars/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar with timestamp 1393772387586



--2014-03-02 09:59:54--  http://192.168.2.17:55364/jars/spark-assembly-0.9.0-incubating-hadoop1.0.4.jar  
Connecting to 192.168.2.17:55364... connected.
HTTP request sent, awaiting response... 200 OK
Length: 87878749 (84M) [application/java-archive]
Saving to: ‘spark-assembly-0.9.0-incubating-hadoop1.0.4.jar.1’

100%[========================================================================>] 87,878,749  66.7MB/s   in 1.3s     

2014-03-02 09:59:56 (66.7 MB/s) - ‘spark-assembly-0.9.0-incubating-hadoop1.0.4.jar.1’ saved [87878749/87878749]


Best,

--  
Nan Zhu




Re: Spark 0.9.0 - local mode - sc.addJar problem (bug?)

Posted by Pierre B <pi...@realimpactanalytics.com>.
I'm still puzzled as to why wget with my LAN IP is not working properly,
whereas it works fine if I use 127.0.0.1 or localhost...

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n2221/Screen_Shot_2014-03-02_at_16.07.14.png> 

?
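
To be clearer, this is the kind of side-by-side comparison I mean; the port
and jar name are placeholders, only the address differs:

  $ wget -O /dev/null http://127.0.0.1:54321/jars/my-app.jar
  (completes fine)
  $ wget -O /dev/null http://10.0.1.7:54321/jars/my-app.jar
  (stalls after ~66 KB, just like in the screenshot above)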

 



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-0-9-0-local-mode-sc-addJar-problem-bug-tp2218p2221.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.