Posted to user@livy.apache.org by Stefan Miklosovic <mi...@gmail.com> on 2017/10/29 10:19:41 UTC

After a programmatically submitted job completes successfully, the application is still running in the Spark UI and marked as idle in the Livy UI

The title says it all: I upload a JAR and run a job via
client.run(Job<T> job).get(), and I do get a result back - everything
computes fine. However, the application is never marked as "completed"
in the Spark UI; it keeps hanging there indefinitely and I have to
kill it myself.

What should I do if I want a successfully finished application to be
marked as completed, so that it is no longer running or idle?
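
For context, here is roughly what I am doing, as a minimal sketch
assuming the org.apache.livy client API - the Livy URL, jar path, and
job class below are placeholders, not my real code:

import java.io.File;
import java.net.URI;
import java.util.Arrays;

import org.apache.livy.Job;
import org.apache.livy.JobContext;
import org.apache.livy.LivyClient;
import org.apache.livy.LivyClientBuilder;

public class SubmitExample {

  // Trivial placeholder job; the real one does the actual computation.
  public static class CountJob implements Job<Long> {
    @Override
    public Long call(JobContext ctx) throws Exception {
      return ctx.sc().parallelize(Arrays.asList(1, 2, 3)).count();
    }
  }

  public static void main(String[] args) throws Exception {
    LivyClient client = new LivyClientBuilder()
        .setURI(new URI("http://livy-host:8998"))   // placeholder Livy URL
        .build();

    // Upload the jar with the job classes, then run the job and wait for the result.
    client.uploadJar(new File("/path/to/jobs.jar")).get();   // placeholder path
    Long result = client.run(new CountJob()).get();
    System.out.println("result = " + result);
    // At this point the result is correct, but the Spark application keeps running.
  }
}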

Thanks!

-- 
Stefan Miklosovic

Re: After a programmatically submitted job completes successfully, the application is still running in the Spark UI and marked as idle in the Livy UI

Posted by Marcelo Vanzin <va...@cloudera.com>.
You have to call "client.stop(true)" if you want to shut down the
Spark application.
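
For example, a minimal sketch reusing the client and job from your
original snippet, calling stop(true) once the result is back:

// Run the job, then shut down the remote Spark context so the
// application shows up as completed in the Spark UI instead of idling.
try {
    Long result = client.run(job).get();
    System.out.println("result = " + result);
} finally {
    client.stop(true);   // true => also stop the remote Spark application
}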

-- 
Marcelo