Posted to user@spark.apache.org by Harsh Choudhary <sh...@gmail.com> on 2017/11/21 07:29:18 UTC

Long running Spark Job Status on Remote Submission

Hi

I am submitting a Spark job to a YARN cluster from a remote machine that is
not part of the cluster itself. When a job runs for a long time, the
spark-submit process never exits; it keeps waiting for the job's status,
even though the job finishes successfully on the cluster.

How do I get the status of such long-running jobs, so that I can run
further tasks on my remote machine after the job completes? Livy is one
option, but I would like to do it without Livy if possible.
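One approach (a sketch, not a tested setup): submit in cluster mode with
`--conf spark.yarn.submit.waitAppCompletion=false`, which makes spark-submit
return as soon as the application is accepted instead of blocking until it
finishes, and then poll the YARN ResourceManager REST API for the final
status yourself. The hostname `resourcemanager:8088` and the polling interval
below are assumptions; the application ID would be captured from
spark-submit's output.

```python
# Sketch: poll the YARN ResourceManager REST API for an application's status.
# Assumes the RM web UI is reachable at RM_URL (hypothetical address) and the
# application ID was recorded when spark-submit returned.
import json
import time
import urllib.request

RM_URL = "http://resourcemanager:8088"  # assumed ResourceManager address

# YARN reports one of these states once an application can no longer change.
TERMINAL_STATES = {"FINISHED", "FAILED", "KILLED"}

def is_terminal(app_info):
    """Return True once the application has reached a terminal YARN state."""
    return app_info["app"]["state"] in TERMINAL_STATES

def fetch_app_info(app_id):
    """Fetch application info from the RM REST API (ws/v1/cluster/apps/<id>)."""
    with urllib.request.urlopen(f"{RM_URL}/ws/v1/cluster/apps/{app_id}") as resp:
        return json.load(resp)

def wait_for_completion(app_id, poll_seconds=30):
    """Poll until the application finishes, then return its final status."""
    while True:
        info = fetch_app_info(app_id)
        if is_terminal(info):
            return info["app"]["finalStatus"]
        time.sleep(poll_seconds)
```

The remote machine then only needs HTTP access to the ResourceManager (or,
equivalently, a `yarn application -status <app_id>` call over SSH) rather
than a long-lived spark-submit process.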

*Thanks!*

Harsh Choudhary