Posted to dev@spark.apache.org by Mr rty ff <ya...@yahoo.com.INVALID> on 2016/07/06 19:49:38 UTC

Stopping Spark executors

Hi,

I'd like to recreate this bug:
https://issues.apache.org/jira/browse/SPARK-13979
It talks about stopping Spark executors. It's not clear exactly how I
stop the executors.

Thanks
 

Re: Stopping Spark executors

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

Read the Spark Standalone doc at
http://spark.apache.org/docs/latest/spark-standalone.html, since that
seems to be the cluster manager the OP uses.
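
As a rough illustration (the master URL is a placeholder; use whatever
start-master.sh prints or the master web UI shows), a single-machine
standalone setup could look like this:

  # start a master and one worker on this machine
  ./sbin/start-master.sh
  ./sbin/start-slave.sh spark://<master-host>:7077

  # attach the shell to that master instead of running in local mode
  ./bin/spark-shell --master spark://<master-host>:7077

With that setup each executor runs in its own
org.apache.spark.executor.CoarseGrainedExecutorBackend JVM, separate
from the shell's SparkSubmit process, so it can be killed on its own.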

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Thu, Jul 7, 2016 at 11:26 PM, Mr rty ff <ya...@yahoo.com> wrote:
> Hi
> I am sorry, but it's still not clear. Do you mean
> ./bin/spark-shell --master local?
> And what do I do after that? Killing the
> org.apache.spark.deploy.SparkSubmit --master local --class
> org.apache.spark.repl.Main --name Spark shell spark-shell
> process will kill the shell, so I couldn't send it the commands.
> Thanks
>
>
> On Friday, July 8, 2016 12:05 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>
>
> Hi,
>
> Then use --master with spark standalone, yarn, or mesos.
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Thu, Jul 7, 2016 at 10:35 PM, Mr rty ff <ya...@yahoo.com> wrote:
>> I don't think it's the proper way to recreate the bug, because I should
>> continue to send commands to the shell.
>> The JIRA talks about killing the CoarseGrainedExecutorBackend.
>>
>>
>> On Thursday, July 7, 2016 11:32 PM, Jacek Laskowski <ja...@japila.pl>
>> wrote:
>>
>>
>> Hi,
>>
>> It appears you're running local mode (local[*] assumed) so killing
>> spark-shell *will* kill the one and only executor -- the driver :)
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Thu, Jul 7, 2016 at 10:27 PM, Mr rty ff <ya...@yahoo.com> wrote:
>>> This is what I get when I run the command:
>>> 946 sun.tools.jps.Jps -lm
>>> 7443 org.apache.spark.deploy.SparkSubmit --class
>>> org.apache.spark.repl.Main
>>> --name Spark shell spark-shell
>>> I don't think that should kill the SparkSubmit process.
>>>
>>>
>>>
>>> On Thursday, July 7, 2016 9:58 PM, Jacek Laskowski <ja...@japila.pl>
>>> wrote:
>>>
>>>
>>> Hi,
>>>
>>> Use jps -lm and see the processes on the machine(s) to kill.
>>>
>>> Pozdrawiam,
>>> Jacek Laskowski
>>> ----
>>> https://medium.com/@jaceklaskowski/
>>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>>> Follow me at https://twitter.com/jaceklaskowski
>>>
>>>
>>> On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <ya...@yahoo.com.invalid>
>>> wrote:
>>>> Hi
>>>> I'd like to recreate this bug:
>>>> https://issues.apache.org/jira/browse/SPARK-13979
>>>> It talks about stopping Spark executors. It's not clear exactly how I
>>>> stop the executors.
>>>> Thanks
>>>
>>>
>
>>
>>>
>>>
>>>
>>>
>>
>>
>>
>>
>
>
>
>



Re: Stopping Spark executors

Posted by Mr rty ff <ya...@yahoo.com.INVALID>.
Hi,

I am sorry, but it's still not clear. Do you mean
./bin/spark-shell --master local?
And what do I do after that? Killing the
org.apache.spark.deploy.SparkSubmit --master local --class
org.apache.spark.repl.Main --name Spark shell spark-shell
process will kill the shell, so I couldn't send it the commands.

Thanks

    On Friday, July 8, 2016 12:05 AM, Jacek Laskowski <ja...@japila.pl> wrote:
 

 Hi,

Then use --master with spark standalone, yarn, or mesos.

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Thu, Jul 7, 2016 at 10:35 PM, Mr rty ff <ya...@yahoo.com> wrote:
> I don't think it's the proper way to recreate the bug, because I should
> continue to send commands to the shell.
> The JIRA talks about killing the CoarseGrainedExecutorBackend.
>
>
> On Thursday, July 7, 2016 11:32 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>
>
> Hi,
>
> It appears you're running local mode (local[*] assumed) so killing
> spark-shell *will* kill the one and only executor -- the driver :)
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Thu, Jul 7, 2016 at 10:27 PM, Mr rty ff <ya...@yahoo.com> wrote:
>> This is what I get when I run the command:
>> 946 sun.tools.jps.Jps -lm
>> 7443 org.apache.spark.deploy.SparkSubmit --class
>> org.apache.spark.repl.Main
>> --name Spark shell spark-shell
>> I don't think that should kill the SparkSubmit process.
>>
>>
>>
>> On Thursday, July 7, 2016 9:58 PM, Jacek Laskowski <ja...@japila.pl>
>> wrote:
>>
>>
>> Hi,
>>
>> Use jps -lm and see the processes on the machine(s) to kill.
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <ya...@yahoo.com.invalid>
>> wrote:
>>> Hi
>>> I'd like to recreate this bug:
>>> https://issues.apache.org/jira/browse/SPARK-13979
>>> It talks about stopping Spark executors. It's not clear exactly how I
>>> stop the executors.
>>> Thanks
>>
>>
>
>>
>>
>>
>>
>
>
>
>




  

Re: Stopping Spark executors

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

Then use --master with spark standalone, yarn, or mesos.
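
For example (the master URLs below are placeholders; adjust them to your
environment):

  ./bin/spark-shell --master spark://<master-host>:7077    # standalone
  ./bin/spark-shell --master yarn                          # YARN, client mode
  ./bin/spark-shell --master mesos://<mesos-host>:5050     # Mesos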

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Thu, Jul 7, 2016 at 10:35 PM, Mr rty ff <ya...@yahoo.com> wrote:
> I don't think it's the proper way to recreate the bug, because I should
> continue to send commands to the shell.
> The JIRA talks about killing the CoarseGrainedExecutorBackend.
>
>
> On Thursday, July 7, 2016 11:32 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>
>
> Hi,
>
> It appears you're running local mode (local[*] assumed) so killing
> spark-shell *will* kill the one and only executor -- the driver :)
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Thu, Jul 7, 2016 at 10:27 PM, Mr rty ff <ya...@yahoo.com> wrote:
>> This is what I get when I run the command:
>> 946 sun.tools.jps.Jps -lm
>> 7443 org.apache.spark.deploy.SparkSubmit --class
>> org.apache.spark.repl.Main
>> --name Spark shell spark-shell
>> I don't think that should kill the SparkSubmit process.
>>
>>
>>
>> On Thursday, July 7, 2016 9:58 PM, Jacek Laskowski <ja...@japila.pl>
>> wrote:
>>
>>
>> Hi,
>>
>> Use jps -lm and see the processes on the machine(s) to kill.
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <ya...@yahoo.com.invalid>
>> wrote:
>>> Hi
>>> I'd like to recreate this bug:
>>> https://issues.apache.org/jira/browse/SPARK-13979
>>> It talks about stopping Spark executors. It's not clear exactly how I
>>> stop the executors.
>>> Thanks
>>
>>
>
>>
>>
>>
>>
>
>
>
>



Re: Stopping Spark executors

Posted by Mr rty ff <ya...@yahoo.com.INVALID>.
I don't think it's the proper way to recreate the bug, because I should
continue to send commands to the shell.
The JIRA talks about killing the CoarseGrainedExecutorBackend.

    On Thursday, July 7, 2016 11:32 PM, Jacek Laskowski <ja...@japila.pl> wrote:
 

 Hi,

It appears you're running local mode (local[*] assumed) so killing
spark-shell *will* kill the one and only executor -- the driver :)

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Thu, Jul 7, 2016 at 10:27 PM, Mr rty ff <ya...@yahoo.com> wrote:
> This is what I get when I run the command:
> 946 sun.tools.jps.Jps -lm
> 7443 org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main
> --name Spark shell spark-shell
> I don't think that should kill the SparkSubmit process.
>
>
>
> On Thursday, July 7, 2016 9:58 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>
>
> Hi,
>
> Use jps -lm and see the processes on the machine(s) to kill.
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <ya...@yahoo.com.invalid> wrote:
>> Hi
>> I'd like to recreate this bug:
>> https://issues.apache.org/jira/browse/SPARK-13979
>> It talks about stopping Spark executors. It's not clear exactly how I
>> stop the executors.
>> Thanks
>
>
>
>
>
>




  

Re: Stopping Spark executors

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

It appears you're running local mode (local[*] assumed) so killing
spark-shell *will* kill the one and only executor -- the driver :)
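
Against a separate cluster manager the executor is its own process, so
jps -lm on a worker machine would also show something like (PIDs purely
illustrative):

  4321 org.apache.spark.deploy.worker.Worker ...
  4789 org.apache.spark.executor.CoarseGrainedExecutorBackend ...

and killing 4789 would stop just that executor while the shell on the
driver side keeps running.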

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Thu, Jul 7, 2016 at 10:27 PM, Mr rty ff <ya...@yahoo.com> wrote:
> This is what I get when I run the command:
> 946 sun.tools.jps.Jps -lm
> 7443 org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main
> --name Spark shell spark-shell
> I don't think that should kill the SparkSubmit process.
>
>
>
> On Thursday, July 7, 2016 9:58 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>
>
> Hi,
>
> Use jps -lm and see the processes on the machine(s) to kill.
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <ya...@yahoo.com.invalid> wrote:
>> Hi
>> I'd like to recreate this bug:
>> https://issues.apache.org/jira/browse/SPARK-13979
>> It talks about stopping Spark executors. It's not clear exactly how I
>> stop the executors.
>> Thanks
>
>
>
>
>
>



Re: Stopping Spark executors

Posted by Mr rty ff <ya...@yahoo.com.INVALID>.
This is what I get when I run the command:

946 sun.tools.jps.Jps -lm
7443 org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name Spark shell spark-shell

I don't think that should kill the SparkSubmit process.
 

    On Thursday, July 7, 2016 9:58 PM, Jacek Laskowski <ja...@japila.pl> wrote:
 

 Hi,

Use jps -lm and see the processes on the machine(s) to kill.

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <ya...@yahoo.com.invalid> wrote:
> Hi
> I'd like to recreate this bug:
> https://issues.apache.org/jira/browse/SPARK-13979
> It talks about stopping Spark executors. It's not clear exactly how I
> stop the executors.
> Thanks




  

Re: Stopping Spark executors

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

Use jps -lm and see the processes on the machine(s) to kill.
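
For instance, on a standalone worker the listing might (hypothetically)
include entries like:

  1234 org.apache.spark.deploy.master.Master ...
  2345 org.apache.spark.deploy.worker.Worker ...
  3456 org.apache.spark.executor.CoarseGrainedExecutorBackend ...

The CoarseGrainedExecutorBackend entries are the executors; a plain
kill <pid> stops one of them.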

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <ya...@yahoo.com.invalid> wrote:
> Hi
> I'd like to recreate this bug:
> https://issues.apache.org/jira/browse/SPARK-13979
> It talks about stopping Spark executors. It's not clear exactly how I
> stop the executors.
> Thanks
