Posted to dev@spark.apache.org by "Ulanov, Alexander" <al...@hp.com> on 2015/04/03 21:21:56 UTC

Running LocalClusterSparkContext

Hi,

I am trying to execute unit tests with LocalClusterSparkContext on Windows 7. I am getting a bunch of errors in the log saying: "Cannot find any assembly build directories." Below is the part of the log where it breaks. Could you suggest what's happening? In addition, the application has created a folder "app-20150403121631-0000" and keeps creating empty folders in it, named with numbers, until I stop the application.


15/04/03 12:16:31.239 sparkWorker2-akka.actor.default-dispatcher-3 INFO Worker: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
15/04/03 12:16:31.277 sparkDriver-akka.actor.default-dispatcher-2 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
15/04/03 12:16:31.549 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Registering worker Mynetwork.net:52006 with 1 cores, 512.0 MB RAM
15/04/03 12:16:31.556 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Registering worker Mynetwork.net:52020 with 1 cores, 512.0 MB RAM
15/04/03 12:16:31.561 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Registering app test-cluster
15/04/03 12:16:31.568 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Registered app test-cluster with ID app-20150403121631-0000
15/04/03 12:16:31.573 sparkWorker1-akka.actor.default-dispatcher-3 INFO Worker: Successfully registered with master spark://Mynetwork.net:51990
15/04/03 12:16:31.588 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Launching executor app-20150403121631-0000/0 on worker worker-20150403121631-Mynetwork.net-52020
15/04/03 12:16:31.589 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Launching executor app-20150403121631-0000/1 on worker worker-20150403121631-Mynetwork.net-52006
15/04/03 12:16:31.590 sparkWorker2-akka.actor.default-dispatcher-3 INFO Worker: Successfully registered with master spark://Mynetwork.net:51990
15/04/03 12:16:31.595 sparkDriver-akka.actor.default-dispatcher-3 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150403121631-0000
15/04/03 12:16:31.608 sparkWorker1-akka.actor.default-dispatcher-4 INFO Worker: Asked to launch executor app-20150403121631-0000/1 for test-cluster
15/04/03 12:16:31.624 sparkWorker2-akka.actor.default-dispatcher-5 INFO Worker: Asked to launch executor app-20150403121631-0000/0 for test-cluster
15/04/03 12:16:31.639 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor added: app-20150403121631-0000/0 on worker-20150403121631-Mynetwork.net-52020 (Mynetwork.net:52020) with 1 cores
15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150403121631-0000/0 on hostPort Mynetwork.net:52020 with 1 cores, 512.0 MB RAM
15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor added: app-20150403121631-0000/1 on worker-20150403121631-Mynetwork.net-52006 (Mynetwork.net:52006) with 1 cores
15/04/03 12:16:31.684 sparkDriver-akka.actor.default-dispatcher-3 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150403121631-0000/1 on hostPort Mynetwork.net:52006 with 1 cores, 512.0 MB RAM
15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 is now LOADING
15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 is now LOADING
15/04/03 12:16:31.688 ExecutorRunner for app-20150403121631-0000/1 ERROR ExecutorRunner: Error running executor
java.lang.IllegalStateException: Cannot find any assembly build directories.
                at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:228)
                at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:283)
                at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:150)
                at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:111)
                at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
                at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:48)
                at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:61)
                at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:49)
                at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:132)
                at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:68)
15/04/03 12:16:31.712 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 is now RUNNING
15/04/03 12:16:31.725 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 is now RUNNING
15/04/03 12:16:31.726 sparkWorker1-akka.actor.default-dispatcher-4 INFO Worker: Executor app-20150403121631-0000/1 finished with state FAILED message java.lang.IllegalStateException: Cannot find any assembly build directories.
15/04/03 12:16:31.689 ExecutorRunner for app-20150403121631-0000/0 ERROR ExecutorRunner: Error running executor
java.lang.IllegalStateException: Cannot find any assembly build directories.

Best regards, Alexander
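
A hedged sketch of the workaround that surfaces later in this thread (SPARK-6673): pin the Scala version explicitly so the launcher does not probe for assembly build directories. The version number and the bash-style syntax are assumptions for illustration; on Windows cmd, use `set` instead of `export`.

```shell
# Pin the Scala version the launcher should assume, so
# AbstractCommandBuilder.getScalaVersion does not have to scan for
# assembly/target/scala-* directories. "2.10" is an illustrative value.
export SPARK_SCALA_VERSION=2.10   # Windows cmd: set SPARK_SCALA_VERSION=2.10
echo "launcher will use Scala $SPARK_SCALA_VERSION"
```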

RE: Running LocalClusterSparkContext

Posted by "Ulanov, Alexander" <al...@hp.com>.
Thank you so much! That let me proceed, and Spark started executing my code. It crashes on sc.textFile, though:

15/04/03 15:15:21.572 sparkDriver-akka.actor.default-dispatcher-4 INFO SparkDeploySchedulerBackend: Registered executor: Actor[akka.tcp://sparkExecutor@Mynetwork.net:57652/user/Executor#1137816732] with ID 1
15/04/03 15:15:21.601 sparkDriver-akka.actor.default-dispatcher-4 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, Mynetwork.net, PROCESS_LOCAL, 1301 bytes)
15/04/03 15:15:22.285 sparkDriver-akka.actor.default-dispatcher-2 INFO BlockManagerMasterActor: Registering block manager Mynetwork.net:57687 with 265.1 MB RAM, BlockManagerId(1, Mynetwork.net, 57687)
15/04/03 15:15:22.436 sparkDriver-akka.actor.default-dispatcher-2 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, Mynetwork.net, PROCESS_LOCAL, 1301 bytes)
15/04/03 15:15:22.455 task-result-getter-0 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, Mynetwork.net): java.io.EOFException
	at java.io.ObjectInputStream$BlockDataInputStream.readFully(ObjectInputStream.java:2747)
	at java.io.ObjectInputStream.readFully(ObjectInputStream.java:1033)

Could you suggest what might be wrong?

(It seems the new version of Spark was not tested on Windows; previous versions worked more or less fine for me.)

-----Original Message-----
From: Marcelo Vanzin [mailto:vanzin@cloudera.com] 
Sent: Friday, April 03, 2015 1:04 PM
To: Ulanov, Alexander
Cc: dev@spark.apache.org
Subject: Re: Running LocalClusterSparkContext

SPARK-2356

On Fri, Apr 3, 2015 at 1:01 PM, Ulanov, Alexander <al...@hp.com> wrote:
> Thanks! It worked. I've updated the PR and now start the test with the
> working directory set to SPARK_HOME
>
> Now I get:
> ERROR Shell: Failed to locate the winutils binary in the hadoop binary 
> path
> java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
>
> Any ideas?
>
> -----Original Message-----
> From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
> Sent: Friday, April 03, 2015 12:52 PM
> To: Ulanov, Alexander
> Cc: dev@spark.apache.org
> Subject: Re: Running LocalClusterSparkContext
>
> That looks like another bug on top of 6673; can you point that out in the PR to make sure it's covered?
>
> In the meantime, could you try running the command from $SPARK_HOME instead of from inside the mllib directory?
>
> On Fri, Apr 3, 2015 at 12:48 PM, Ulanov, Alexander <al...@hp.com> wrote:
>> Hi Marcelo,
>>
>> Thank you for the quick response!
>> It seems that I'm hitting issue 6673. If I set SPARK_SCALA_VERSION=2.10 as suggested in https://issues.apache.org/jira/browse/SPARK-6673, I get instead:
>>
>> 15/04/03 12:46:24.510 ExecutorRunner for app-20150403124624-0000/0 
>> ERROR ExecutorRunner: Error running executor
>> java.lang.IllegalStateException: No assemblies found in 'C:\ulanov\dev\spark\mllib\.\assembly\target\scala-2.10'.
>>
>> -----Original Message-----
>> From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
>> Sent: Friday, April 03, 2015 12:31 PM
>> To: Ulanov, Alexander
>> Cc: dev@spark.apache.org
>> Subject: Re: Running LocalClusterSparkContext
>>
>> When was the last time you pulled? That should have been fixed as part of SPARK-6473.
>>
>> Notice latest master suffers from SPARK-6673 on Windows.
>>
>> On Fri, Apr 3, 2015 at 12:21 PM, Ulanov, Alexander <al...@hp.com> wrote:
>>> Hi,
>>>
>>> I am trying to execute unit tests with LocalClusterSparkContext on Windows 7. I am getting a bunch of errors in the log saying: "Cannot find any assembly build directories." Below is the part of the log where it breaks. Could you suggest what's happening? In addition, the application has created a folder "app-20150403121631-0000" and keeps creating empty folders in it, named with numbers, until I stop the application.
>>>
>>>
>>> 15/04/03 12:16:31.239 sparkWorker2-akka.actor.default-dispatcher-3 INFO Worker: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
>>> 15/04/03 12:16:31.277 sparkDriver-akka.actor.default-dispatcher-2 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
>>> 15/04/03 12:16:31.549 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Registering worker Mynetwork.net:52006 with 1 cores, 512.0 
>>> MB RAM
>>> 15/04/03 12:16:31.556 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Registering worker Mynetwork.net:52020 with 1 cores, 512.0 
>>> MB RAM
>>> 15/04/03 12:16:31.561 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Registering app test-cluster
>>> 15/04/03 12:16:31.568 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Registered app test-cluster with ID app-20150403121631-0000
>>> 15/04/03 12:16:31.573 sparkWorker1-akka.actor.default-dispatcher-3
>>> INFO Worker: Successfully registered with master
>>> spark://Mynetwork.net:51990
>>> 15/04/03 12:16:31.588 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Launching executor app-20150403121631-0000/0 on worker
>>> worker-20150403121631-Mynetwork.net-52020
>>> 15/04/03 12:16:31.589 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Launching executor app-20150403121631-0000/1 on worker
>>> worker-20150403121631-Mynetwork.net-52006
>>> 15/04/03 12:16:31.590 sparkWorker2-akka.actor.default-dispatcher-3
>>> INFO Worker: Successfully registered with master
>>> spark://Mynetwork.net:51990
>>> 15/04/03 12:16:31.595 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> SparkDeploySchedulerBackend: Connected to Spark cluster with app ID
>>> app-20150403121631-0000
>>> 15/04/03 12:16:31.608 sparkWorker1-akka.actor.default-dispatcher-4
>>> INFO Worker: Asked to launch executor app-20150403121631-0000/1 for 
>>> test-cluster
>>> 15/04/03 12:16:31.624 sparkWorker2-akka.actor.default-dispatcher-5
>>> INFO Worker: Asked to launch executor app-20150403121631-0000/0 for 
>>> test-cluster
>>> 15/04/03 12:16:31.639 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor added: app-20150403121631-0000/0 on
>>> worker-20150403121631-Mynetwork.net-52020 (Mynetwork.net:52020) with
>>> 1 cores
>>> 15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> SparkDeploySchedulerBackend: Granted executor ID
>>> app-20150403121631-0000/0 on hostPort Mynetwork.net:52020 with 1 
>>> cores, 512.0 MB RAM
>>> 15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor added: app-20150403121631-0000/1 on
>>> worker-20150403121631-Mynetwork.net-52006 (Mynetwork.net:52006) with
>>> 1 cores
>>> 15/04/03 12:16:31.684 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> SparkDeploySchedulerBackend: Granted executor ID
>>> app-20150403121631-0000/1 on hostPort Mynetwork.net:52006 with 1 
>>> cores, 512.0 MB RAM
>>> 15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 
>>> is now LOADING
>>> 15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 
>>> is now LOADING
>>> 15/04/03 12:16:31.688 ExecutorRunner for app-20150403121631-0000/1 
>>> ERROR ExecutorRunner: Error running executor
>>> java.lang.IllegalStateException: Cannot find any assembly build directories.
>>>                 at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:228)
>>>                 at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:283)
>>>                 at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:150)
>>>                 at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:111)
>>>                 at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
>>>                 at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:48)
>>>                 at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:61)
>>>                 at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:49)
>>>                 at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:132)
>>>                 at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:68)
>>> 15/04/03 12:16:31.712 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 
>>> is now RUNNING
>>> 15/04/03 12:16:31.725 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 
>>> is now RUNNING
>>> 15/04/03 12:16:31.726 sparkWorker1-akka.actor.default-dispatcher-4 INFO Worker: Executor app-20150403121631-0000/1 finished with state FAILED message java.lang.IllegalStateException: Cannot find any assembly build directories.
>>> 15/04/03 12:16:31.689 ExecutorRunner for app-20150403121631-0000/0 
>>> ERROR ExecutorRunner: Error running executor
>>> java.lang.IllegalStateException: Cannot find any assembly build directories.
>>>
>>> Best regards, Alexander
>>
>>
>>
>> --
>> Marcelo
>
>
>
> --
> Marcelo



--
Marcelo

Re: Running LocalClusterSparkContext

Posted by Marcelo Vanzin <va...@cloudera.com>.
SPARK-2356

On Fri, Apr 3, 2015 at 1:01 PM, Ulanov, Alexander
<al...@hp.com> wrote:
> Thanks! It worked. I've updated the PR and now start the test with the working directory set to SPARK_HOME
>
> Now I get:
> ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
> java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
>
> Any ideas?
>
> -----Original Message-----
> From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
> Sent: Friday, April 03, 2015 12:52 PM
> To: Ulanov, Alexander
> Cc: dev@spark.apache.org
> Subject: Re: Running LocalClusterSparkContext
>
> That looks like another bug on top of 6673; can you point that out in the PR to make sure it's covered?
>
> In the meantime, could you try running the command from $SPARK_HOME instead of from inside the mllib directory?
>
> On Fri, Apr 3, 2015 at 12:48 PM, Ulanov, Alexander <al...@hp.com> wrote:
>> Hi Marcelo,
>>
>> Thank you for the quick response!
>> It seems that I'm hitting issue 6673. If I set SPARK_SCALA_VERSION=2.10 as suggested in https://issues.apache.org/jira/browse/SPARK-6673, I get instead:
>>
>> 15/04/03 12:46:24.510 ExecutorRunner for app-20150403124624-0000/0
>> ERROR ExecutorRunner: Error running executor
>> java.lang.IllegalStateException: No assemblies found in 'C:\ulanov\dev\spark\mllib\.\assembly\target\scala-2.10'.
>>
>> -----Original Message-----
>> From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
>> Sent: Friday, April 03, 2015 12:31 PM
>> To: Ulanov, Alexander
>> Cc: dev@spark.apache.org
>> Subject: Re: Running LocalClusterSparkContext
>>
>> When was the last time you pulled? That should have been fixed as part of SPARK-6473.
>>
>> Notice latest master suffers from SPARK-6673 on Windows.
>>
>> On Fri, Apr 3, 2015 at 12:21 PM, Ulanov, Alexander <al...@hp.com> wrote:
>>> Hi,
>>>
>>> I am trying to execute unit tests with LocalClusterSparkContext on Windows 7. I am getting a bunch of errors in the log saying: "Cannot find any assembly build directories." Below is the part of the log where it breaks. Could you suggest what's happening? In addition, the application has created a folder "app-20150403121631-0000" and keeps creating empty folders in it, named with numbers, until I stop the application.
>>>
>>>
>>> 15/04/03 12:16:31.239 sparkWorker2-akka.actor.default-dispatcher-3 INFO Worker: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
>>> 15/04/03 12:16:31.277 sparkDriver-akka.actor.default-dispatcher-2 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
>>> 15/04/03 12:16:31.549 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Registering worker Mynetwork.net:52006 with 1 cores, 512.0 MB
>>> RAM
>>> 15/04/03 12:16:31.556 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Registering worker Mynetwork.net:52020 with 1 cores, 512.0 MB
>>> RAM
>>> 15/04/03 12:16:31.561 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Registering app test-cluster
>>> 15/04/03 12:16:31.568 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Registered app test-cluster with ID app-20150403121631-0000
>>> 15/04/03 12:16:31.573 sparkWorker1-akka.actor.default-dispatcher-3
>>> INFO Worker: Successfully registered with master
>>> spark://Mynetwork.net:51990
>>> 15/04/03 12:16:31.588 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Launching executor app-20150403121631-0000/0 on worker
>>> worker-20150403121631-Mynetwork.net-52020
>>> 15/04/03 12:16:31.589 sparkMaster-akka.actor.default-dispatcher-7
>>> INFO
>>> Master: Launching executor app-20150403121631-0000/1 on worker
>>> worker-20150403121631-Mynetwork.net-52006
>>> 15/04/03 12:16:31.590 sparkWorker2-akka.actor.default-dispatcher-3
>>> INFO Worker: Successfully registered with master
>>> spark://Mynetwork.net:51990
>>> 15/04/03 12:16:31.595 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> SparkDeploySchedulerBackend: Connected to Spark cluster with app ID
>>> app-20150403121631-0000
>>> 15/04/03 12:16:31.608 sparkWorker1-akka.actor.default-dispatcher-4
>>> INFO Worker: Asked to launch executor app-20150403121631-0000/1 for
>>> test-cluster
>>> 15/04/03 12:16:31.624 sparkWorker2-akka.actor.default-dispatcher-5
>>> INFO Worker: Asked to launch executor app-20150403121631-0000/0 for
>>> test-cluster
>>> 15/04/03 12:16:31.639 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor added: app-20150403121631-0000/0 on
>>> worker-20150403121631-Mynetwork.net-52020 (Mynetwork.net:52020) with
>>> 1 cores
>>> 15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> SparkDeploySchedulerBackend: Granted executor ID
>>> app-20150403121631-0000/0 on hostPort Mynetwork.net:52020 with 1
>>> cores, 512.0 MB RAM
>>> 15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor added: app-20150403121631-0000/1 on
>>> worker-20150403121631-Mynetwork.net-52006 (Mynetwork.net:52006) with
>>> 1 cores
>>> 15/04/03 12:16:31.684 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> SparkDeploySchedulerBackend: Granted executor ID
>>> app-20150403121631-0000/1 on hostPort Mynetwork.net:52006 with 1
>>> cores, 512.0 MB RAM
>>> 15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 is
>>> now LOADING
>>> 15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 is
>>> now LOADING
>>> 15/04/03 12:16:31.688 ExecutorRunner for app-20150403121631-0000/1
>>> ERROR ExecutorRunner: Error running executor
>>> java.lang.IllegalStateException: Cannot find any assembly build directories.
>>>                 at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:228)
>>>                 at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:283)
>>>                 at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:150)
>>>                 at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:111)
>>>                 at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
>>>                 at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:48)
>>>                 at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:61)
>>>                 at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:49)
>>>                 at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:132)
>>>                 at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:68)
>>> 15/04/03 12:16:31.712 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 is
>>> now RUNNING
>>> 15/04/03 12:16:31.725 sparkDriver-akka.actor.default-dispatcher-3
>>> INFO
>>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 is
>>> now RUNNING
>>> 15/04/03 12:16:31.726 sparkWorker1-akka.actor.default-dispatcher-4 INFO Worker: Executor app-20150403121631-0000/1 finished with state FAILED message java.lang.IllegalStateException: Cannot find any assembly build directories.
>>> 15/04/03 12:16:31.689 ExecutorRunner for app-20150403121631-0000/0
>>> ERROR ExecutorRunner: Error running executor
>>> java.lang.IllegalStateException: Cannot find any assembly build directories.
>>>
>>> Best regards, Alexander
>>
>>
>>
>> --
>> Marcelo
>
>
>
> --
> Marcelo



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


RE: Running LocalClusterSparkContext

Posted by "Ulanov, Alexander" <al...@hp.com>.
Thanks! It worked. I've updated the PR and now start the test with the working directory set to SPARK_HOME

Now I get:
ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.

Any ideas?
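
A hedged pre-flight check for this error, assuming the usual SPARK-2356 cause: Hadoop's shell layer on Windows requires winutils.exe under %HADOOP_HOME%\bin, and the `null\bin\winutils.exe` path means HADOOP_HOME is unset. The paths and messages below are illustrative assumptions, not from the thread.

```shell
# Verify that HADOOP_HOME is set and actually contains bin/winutils.exe
# before launching the tests; otherwise Hadoop's Shell class throws the
# "Could not locate executable null\bin\winutils.exe" IOException.
if [ -z "${HADOOP_HOME:-}" ] || [ ! -f "${HADOOP_HOME:-}/bin/winutils.exe" ]; then
  echo "winutils.exe missing: download it and point HADOOP_HOME at its parent dir"
else
  echo "winutils.exe found under $HADOOP_HOME/bin"
fi
```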

-----Original Message-----
From: Marcelo Vanzin [mailto:vanzin@cloudera.com] 
Sent: Friday, April 03, 2015 12:52 PM
To: Ulanov, Alexander
Cc: dev@spark.apache.org
Subject: Re: Running LocalClusterSparkContext

That looks like another bug on top of 6673; can you point that out in the PR to make sure it's covered?

In the meantime, could you try running the command from $SPARK_HOME instead of from inside the mllib directory?

On Fri, Apr 3, 2015 at 12:48 PM, Ulanov, Alexander <al...@hp.com> wrote:
> Hi Marcelo,
>
> Thank you for the quick response!
> It seems that I'm hitting issue 6673. If I set SPARK_SCALA_VERSION=2.10 as suggested in https://issues.apache.org/jira/browse/SPARK-6673, I get instead:
>
> 15/04/03 12:46:24.510 ExecutorRunner for app-20150403124624-0000/0 
> ERROR ExecutorRunner: Error running executor
> java.lang.IllegalStateException: No assemblies found in 'C:\ulanov\dev\spark\mllib\.\assembly\target\scala-2.10'.
>
> -----Original Message-----
> From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
> Sent: Friday, April 03, 2015 12:31 PM
> To: Ulanov, Alexander
> Cc: dev@spark.apache.org
> Subject: Re: Running LocalClusterSparkContext
>
> When was the last time you pulled? That should have been fixed as part of SPARK-6473.
>
> Notice latest master suffers from SPARK-6673 on Windows.
>
> On Fri, Apr 3, 2015 at 12:21 PM, Ulanov, Alexander <al...@hp.com> wrote:
>> Hi,
>>
>> I am trying to execute unit tests with LocalClusterSparkContext on Windows 7. I am getting a bunch of errors in the log saying: "Cannot find any assembly build directories." Below is the part of the log where it breaks. Could you suggest what's happening? In addition, the application has created a folder "app-20150403121631-0000" and keeps creating empty folders in it, named with numbers, until I stop the application.
>>
>>
>> 15/04/03 12:16:31.239 sparkWorker2-akka.actor.default-dispatcher-3 INFO Worker: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
>> 15/04/03 12:16:31.277 sparkDriver-akka.actor.default-dispatcher-2 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
>> 15/04/03 12:16:31.549 sparkMaster-akka.actor.default-dispatcher-7 
>> INFO
>> Master: Registering worker Mynetwork.net:52006 with 1 cores, 512.0 MB 
>> RAM
>> 15/04/03 12:16:31.556 sparkMaster-akka.actor.default-dispatcher-7 
>> INFO
>> Master: Registering worker Mynetwork.net:52020 with 1 cores, 512.0 MB 
>> RAM
>> 15/04/03 12:16:31.561 sparkMaster-akka.actor.default-dispatcher-7 
>> INFO
>> Master: Registering app test-cluster
>> 15/04/03 12:16:31.568 sparkMaster-akka.actor.default-dispatcher-7 
>> INFO
>> Master: Registered app test-cluster with ID app-20150403121631-0000
>> 15/04/03 12:16:31.573 sparkWorker1-akka.actor.default-dispatcher-3
>> INFO Worker: Successfully registered with master
>> spark://Mynetwork.net:51990
>> 15/04/03 12:16:31.588 sparkMaster-akka.actor.default-dispatcher-7 
>> INFO
>> Master: Launching executor app-20150403121631-0000/0 on worker
>> worker-20150403121631-Mynetwork.net-52020
>> 15/04/03 12:16:31.589 sparkMaster-akka.actor.default-dispatcher-7 
>> INFO
>> Master: Launching executor app-20150403121631-0000/1 on worker
>> worker-20150403121631-Mynetwork.net-52006
>> 15/04/03 12:16:31.590 sparkWorker2-akka.actor.default-dispatcher-3
>> INFO Worker: Successfully registered with master
>> spark://Mynetwork.net:51990
>> 15/04/03 12:16:31.595 sparkDriver-akka.actor.default-dispatcher-3 
>> INFO
>> SparkDeploySchedulerBackend: Connected to Spark cluster with app ID
>> app-20150403121631-0000
>> 15/04/03 12:16:31.608 sparkWorker1-akka.actor.default-dispatcher-4
>> INFO Worker: Asked to launch executor app-20150403121631-0000/1 for 
>> test-cluster
>> 15/04/03 12:16:31.624 sparkWorker2-akka.actor.default-dispatcher-5
>> INFO Worker: Asked to launch executor app-20150403121631-0000/0 for 
>> test-cluster
>> 15/04/03 12:16:31.639 sparkDriver-akka.actor.default-dispatcher-3 
>> INFO
>> AppClient$ClientActor: Executor added: app-20150403121631-0000/0 on
>> worker-20150403121631-Mynetwork.net-52020 (Mynetwork.net:52020) with 
>> 1 cores
>> 15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3 
>> INFO
>> SparkDeploySchedulerBackend: Granted executor ID
>> app-20150403121631-0000/0 on hostPort Mynetwork.net:52020 with 1 
>> cores, 512.0 MB RAM
>> 15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3 
>> INFO
>> AppClient$ClientActor: Executor added: app-20150403121631-0000/1 on
>> worker-20150403121631-Mynetwork.net-52006 (Mynetwork.net:52006) with 
>> 1 cores
>> 15/04/03 12:16:31.684 sparkDriver-akka.actor.default-dispatcher-3 
>> INFO
>> SparkDeploySchedulerBackend: Granted executor ID
>> app-20150403121631-0000/1 on hostPort Mynetwork.net:52006 with 1 
>> cores, 512.0 MB RAM
>> 15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3 
>> INFO
>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 is 
>> now LOADING
>> 15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3 
>> INFO
>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 is 
>> now LOADING
>> 15/04/03 12:16:31.688 ExecutorRunner for app-20150403121631-0000/1 
>> ERROR ExecutorRunner: Error running executor
>> java.lang.IllegalStateException: Cannot find any assembly build directories.
>>                 at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:228)
>>                 at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:283)
>>                 at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:150)
>>                 at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:111)
>>                 at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
>>                 at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:48)
>>                 at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:61)
>>                 at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:49)
>>                 at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:132)
>>                 at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:68)
>> 15/04/03 12:16:31.712 sparkDriver-akka.actor.default-dispatcher-3 
>> INFO
>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 is 
>> now RUNNING
>> 15/04/03 12:16:31.725 sparkDriver-akka.actor.default-dispatcher-3 
>> INFO
>> AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 is 
>> now RUNNING
>> 15/04/03 12:16:31.726 sparkWorker1-akka.actor.default-dispatcher-4 INFO Worker: Executor app-20150403121631-0000/1 finished with state FAILED message java.lang.IllegalStateException: Cannot find any assembly build directories.
>> 15/04/03 12:16:31.689 ExecutorRunner for app-20150403121631-0000/0 
>> ERROR ExecutorRunner: Error running executor
>> java.lang.IllegalStateException: Cannot find any assembly build directories.
>>
>> Best regards, Alexander
>
>
>
> --
> Marcelo



--
Marcelo

Re: Running LocalClusterSparkContext

Posted by Marcelo Vanzin <va...@cloudera.com>.
That looks like another bug on top of 6673; can you point that out in
the PR to make sure it's covered?

In the meantime, could you try running the command from $SPARK_HOME
instead of from inside the mllib directory?
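
A minimal sketch of this suggestion, assuming a bash-style shell and that SPARK_HOME points at the root of the Spark checkout (both assumptions for illustration): launching from the checkout root, rather than from mllib/, lets relative lookups such as assembly/target/scala-* resolve.

```shell
# Change to the Spark checkout root before launching the test command, so
# the launcher's relative path probes (assembly/target/scala-*) succeed.
SPARK_HOME="${SPARK_HOME:-$PWD}"   # e.g. C:\ulanov\dev\spark, not ...\spark\mllib
cd "$SPARK_HOME"
pwd   # should print the checkout root; run the test suite from here
```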

On Fri, Apr 3, 2015 at 12:48 PM, Ulanov, Alexander
<al...@hp.com> wrote:
> Hi Marcelo,
>
> Thank you for the quick response!
> It seems that I'm hitting issue 6673. If I set SPARK_SCALA_VERSION=2.10 as suggested in https://issues.apache.org/jira/browse/SPARK-6673, I get instead:
>
> 15/04/03 12:46:24.510 ExecutorRunner for app-20150403124624-0000/0 ERROR ExecutorRunner: Error running executor
> java.lang.IllegalStateException: No assemblies found in 'C:\ulanov\dev\spark\mllib\.\assembly\target\scala-2.10'.
>
> -----Original Message-----
> From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
> Sent: Friday, April 03, 2015 12:31 PM
> To: Ulanov, Alexander
> Cc: dev@spark.apache.org
> Subject: Re: Running LocalClusterSparkContext
>
> When was the last time you pulled? That should have been fixed as part of SPARK-6473.
>
> Note that the latest master suffers from SPARK-6673 on Windows.
>
> On Fri, Apr 3, 2015 at 12:21 PM, Ulanov, Alexander <al...@hp.com> wrote:
>> Hi,
>>
>> I am trying to execute unit tests with LocalClusterSparkContext on Windows 7. I am getting a bunch of errors in the log saying "Cannot find any assembly build directories." Below is the part of the log where it breaks. Could you suggest what's happening? In addition, the application has created a folder "app-20150403121631-0000" and keeps creating empty folders there, named with numbers, until I stop the application.
>>
>>
>> [log trimmed: both executors fail with the same "Cannot find any assembly build directories" IllegalStateException shown earlier in the thread]
>>
>> Best regards, Alexander
>
>
>
> --
> Marcelo



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


RE: Running LocalClusterSparkContext

Posted by "Ulanov, Alexander" <al...@hp.com>.
Hi Marcelo,

Thank you for quick response!
It seems that I am hitting SPARK-6673. If I set SPARK_SCALA_VERSION=2.10 as suggested in https://issues.apache.org/jira/browse/SPARK-6673, I get instead:

15/04/03 12:46:24.510 ExecutorRunner for app-20150403124624-0000/0 ERROR ExecutorRunner: Error running executor
java.lang.IllegalStateException: No assemblies found in 'C:\ulanov\dev\spark\mllib\.\assembly\target\scala-2.10'.
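
Note where that path points: it is the working directory (the mllib module) plus the relative segment .\assembly\target\scala-2.10, which suggests the launcher resolves the assembly directory against the current directory. The decomposition below is my reading of the error message, not the launcher source:

```shell
# Decomposing the failing path from the log above (both parts taken from the error).
CWD='C:\ulanov\dev\spark\mllib'        # directory the tests were launched from
REL='.\assembly\target\scala-2.10'     # relative segment the launcher appends
printf '%s\n' "$CWD\\$REL"
# prints C:\ulanov\dev\spark\mllib\.\assembly\target\scala-2.10
```

If that reading is right, launching from the repository root would make the same relative segment land on spark\assembly\target\scala-2.10 instead.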

-----Original Message-----
From: Marcelo Vanzin [mailto:vanzin@cloudera.com] 
Sent: Friday, April 03, 2015 12:31 PM
To: Ulanov, Alexander
Cc: dev@spark.apache.org
Subject: Re: Running LocalClusterSparkContext

When was the last time you pulled? That should have been fixed as part of SPARK-6473.

Note that the latest master suffers from SPARK-6673 on Windows.

On Fri, Apr 3, 2015 at 12:21 PM, Ulanov, Alexander <al...@hp.com> wrote:
> Hi,
>
> I am trying to execute unit tests with LocalClusterSparkContext on Windows 7. I am getting a bunch of errors in the log saying "Cannot find any assembly build directories." Below is the part of the log where it breaks. Could you suggest what's happening? In addition, the application has created a folder "app-20150403121631-0000" and keeps creating empty folders there, named with numbers, until I stop the application.
>
>
> [log trimmed: both executors fail with the same "Cannot find any assembly build directories" IllegalStateException shown earlier in the thread]
>
> Best regards, Alexander



--
Marcelo

Re: Running LocalClusterSparkContext

Posted by Marcelo Vanzin <va...@cloudera.com>.
When was the last time you pulled? That should have been fixed as part
of SPARK-6473.

Note that the latest master suffers from SPARK-6673 on Windows.

On Fri, Apr 3, 2015 at 12:21 PM, Ulanov, Alexander
<al...@hp.com> wrote:
> Hi,
>
> I am trying to execute unit tests with LocalClusterSparkContext on Windows 7. I am getting a bunch of errors in the log saying "Cannot find any assembly build directories." Below is the part of the log where it breaks. Could you suggest what's happening? In addition, the application has created a folder "app-20150403121631-0000" and keeps creating empty folders there, named with numbers, until I stop the application.
>
>
> 15/04/03 12:16:31.239 sparkWorker2-akka.actor.default-dispatcher-3 INFO Worker: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
> 15/04/03 12:16:31.277 sparkDriver-akka.actor.default-dispatcher-2 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@Mynetwork.net:51990/user/Master...
> 15/04/03 12:16:31.549 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Registering worker Mynetwork.net:52006 with 1 cores, 512.0 MB RAM
> 15/04/03 12:16:31.556 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Registering worker Mynetwork.net:52020 with 1 cores, 512.0 MB RAM
> 15/04/03 12:16:31.561 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Registering app test-cluster
> 15/04/03 12:16:31.568 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Registered app test-cluster with ID app-20150403121631-0000
> 15/04/03 12:16:31.573 sparkWorker1-akka.actor.default-dispatcher-3 INFO Worker: Successfully registered with master spark://Mynetwork.net:51990
> 15/04/03 12:16:31.588 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Launching executor app-20150403121631-0000/0 on worker worker-20150403121631-Mynetwork.net-52020
> 15/04/03 12:16:31.589 sparkMaster-akka.actor.default-dispatcher-7 INFO Master: Launching executor app-20150403121631-0000/1 on worker worker-20150403121631-Mynetwork.net-52006
> 15/04/03 12:16:31.590 sparkWorker2-akka.actor.default-dispatcher-3 INFO Worker: Successfully registered with master spark://Mynetwork.net:51990
> 15/04/03 12:16:31.595 sparkDriver-akka.actor.default-dispatcher-3 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150403121631-0000
> 15/04/03 12:16:31.608 sparkWorker1-akka.actor.default-dispatcher-4 INFO Worker: Asked to launch executor app-20150403121631-0000/1 for test-cluster
> 15/04/03 12:16:31.624 sparkWorker2-akka.actor.default-dispatcher-5 INFO Worker: Asked to launch executor app-20150403121631-0000/0 for test-cluster
> 15/04/03 12:16:31.639 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor added: app-20150403121631-0000/0 on worker-20150403121631-Mynetwork.net-52020 (Mynetwork.net:52020) with 1 cores
> 15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150403121631-0000/0 on hostPort Mynetwork.net:52020 with 1 cores, 512.0 MB RAM
> 15/04/03 12:16:31.683 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor added: app-20150403121631-0000/1 on worker-20150403121631-Mynetwork.net-52006 (Mynetwork.net:52006) with 1 cores
> 15/04/03 12:16:31.684 sparkDriver-akka.actor.default-dispatcher-3 INFO SparkDeploySchedulerBackend: Granted executor ID app-20150403121631-0000/1 on hostPort Mynetwork.net:52006 with 1 cores, 512.0 MB RAM
> 15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 is now LOADING
> 15/04/03 12:16:31.688 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 is now LOADING
> 15/04/03 12:16:31.688 ExecutorRunner for app-20150403121631-0000/1 ERROR ExecutorRunner: Error running executor
> java.lang.IllegalStateException: Cannot find any assembly build directories.
>                 at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:228)
>                 at org.apache.spark.launcher.AbstractCommandBuilder.getScalaVersion(AbstractCommandBuilder.java:283)
>                 at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:150)
>                 at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:111)
>                 at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:39)
>                 at org.apache.spark.launcher.WorkerCommandBuilder.buildCommand(WorkerCommandBuilder.scala:48)
>                 at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:61)
>                 at org.apache.spark.deploy.worker.CommandUtils$.buildProcessBuilder(CommandUtils.scala:49)
>                 at org.apache.spark.deploy.worker.ExecutorRunner.org$apache$spark$deploy$worker$ExecutorRunner$$fetchAndRunExecutor(ExecutorRunner.scala:132)
>                 at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:68)
> 15/04/03 12:16:31.712 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor updated: app-20150403121631-0000/0 is now RUNNING
> 15/04/03 12:16:31.725 sparkDriver-akka.actor.default-dispatcher-3 INFO AppClient$ClientActor: Executor updated: app-20150403121631-0000/1 is now RUNNING
> 15/04/03 12:16:31.726 sparkWorker1-akka.actor.default-dispatcher-4 INFO Worker: Executor app-20150403121631-0000/1 finished with state FAILED message java.lang.IllegalStateException: Cannot find any assembly build directories.
> 15/04/03 12:16:31.689 ExecutorRunner for app-20150403121631-0000/0 ERROR ExecutorRunner: Error running executor
> java.lang.IllegalStateException: Cannot find any assembly build directories.
>
> Best regards, Alexander



-- 
Marcelo
