Posted to user@spark.apache.org by Nirav Patel <np...@xactlycorp.com> on 2015/03/17 20:40:13 UTC

Spark 1.0.2 failover doesn't port running application context to new master

We have a Spark 1.0.2 cluster with 3 nodes in an HA setup using ZooKeeper. We
have a long-running, self-contained Spark service that serves on-demand
requests.
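For context, this is the kind of spark-env.sh configuration we use for the
ZooKeeper-based HA setup (a minimal sketch; the hostnames and the znode path
are placeholders, not our actual values):

```shell
# spark-env.sh on each master node (hostnames below are examples only)
# Enables ZooKeeper-based standby recovery for the standalone master.
export SPARK_DAEMON_JAVA_OPTS="\
  -Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181 \
  -Dspark.deploy.zookeeper.dir=/spark"
```

The driver is then pointed at all masters, e.g.
spark://master1:7077,master2:7077, so it can re-register with whichever
master becomes active after a failover.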
I did a failover test by killing the Spark master to see whether our
application gets ported over to the new master. Killing the master doesn't
actually kill the executors that the application created, so our application
can still serve requests. The problem is that I can no longer see our
application running in the new master's UI. Is this just a matter of not
having a history server? The driver UI still works. Can someone confirm this?
I am also attaching a screenshot of the new master's console showing the
"Running Applications" section.

Nirav
