Posted to user@spark.apache.org by Ford Farline <fo...@gmail.com> on 2015/08/04 23:36:01 UTC

Re: Problem submitting a .py script against a standalone cluster.

The code is very simple, just a couple of lines. When I launch it, it
runs locally but not on the cluster.

from datetime import datetime
import time

from pyspark import SparkContext

# Note: the first positional argument is the master URL, hardcoded to "local"
sc = SparkContext("local", "Tech Companies Feedback")

beginning_time = datetime.now()

time.sleep(60)

print datetime.now() - beginning_time

sc.stop()
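
Note that a master URL hardcoded in the SparkContext constructor takes
precedence over whatever --master is passed to spark-submit, so the
"local" above would explain why the job only ever runs locally. A
minimal sketch of the same script with the master left for spark-submit
to decide (only the constructor call changes):

from datetime import datetime
import time

from pyspark import SparkContext

# No master here: spark-submit --master spark://... now decides where it runs
sc = SparkContext(appName="Tech Companies Feedback")

beginning_time = datetime.now()

# Sleep so the application stays visible in the UIs for a minute
time.sleep(60)

# Python 2 print, matching the Spark 1.3-era script above
print datetime.now() - beginning_time

sc.stop()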

Thanks for your interest,

Gonzalo



On Fri, Jul 31, 2015 at 4:24 AM, Marcelo Vanzin <va...@cloudera.com> wrote:

> Can you share the part of the code in your script where you create the
> SparkContext instance?
>
> On Thu, Jul 30, 2015 at 7:19 PM, fordfarline <fo...@gmail.com>
> wrote:
>
>> Hi All,
>>
>> I'm having an issue when launching a Python app against a standalone
>> cluster: it runs locally instead, as if it never reaches the cluster.
>> It's the first time I've tried the cluster; locally it works fine.
>>
>> This is what I did:
>>
>> -> /home/user/Spark/spark-1.3.0-bin-hadoop2.4/sbin/start-all.sh
>>    # Master and worker are up at localhost:8080/4040
>> -> /home/user/Spark/spark-1.3.0-bin-hadoop2.4/bin/spark-submit
>>    --master spark://localhost:7077 Script.py
>>    * The script runs ok, but locally :( I can check it at
>>    localhost:4040, but I don't see any job in the cluster UI (a quick
>>    check of the effective master is sketched just below).
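>>
>> A quick way to confirm which master the application actually connected
>> to is to print it from the driver (a minimal sketch; "sc" is the
>> SparkContext created in Script.py):
>>
>>     # Prints the effective master URL, e.g. "local" or
>>     # "spark://localhost:7077"
>>     print sc.master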
>>
>> The only warning is:
>> WARN Utils: Your hostname, localhost resolves to a loopback address:
>> 127.0.0.1; using 192.168.1.132 instead (on interface eth0)
>>
>> I set SPARK_LOCAL_IP=127.0.0.1 to solve this; at least the warning
>> disappears, but the script keeps executing locally, not on the
>> cluster.
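>>
>> For reference, these addresses can also be set in conf/spark-env.sh; a
>> minimal sketch, assuming the routable address 192.168.1.132 reported
>> in the warning should be used rather than the loopback (SPARK_MASTER_IP
>> is the Spark 1.x name for the master's bind address):
>>
>>     # conf/spark-env.sh
>>     # Bind Spark to the routable interface so the driver, master, and
>>     # workers can reach each other (192.168.1.132 is the address the
>>     # warning above suggested; adjust for your network)
>>     export SPARK_LOCAL_IP=192.168.1.132
>>     export SPARK_MASTER_IP=192.168.1.132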
>>
>> I think it may have something to do with my virtual machine:
>> -> Host server: Linux Mint
>> -> The virtual machine (Workstation 10) where Spark runs is Linux
>> Mint as well.
>>
>> Any ideas about what I'm doing wrong?
>>
>> Thanks in advance for any suggestions; this is driving me mad!!
>>
>
>
> --
> Marcelo
>