Posted to user@spark.apache.org by Yeachan Park <ye...@gmail.com> on 2023/01/19 13:59:25 UTC

How to check the liveness of a SparkSession

Hi all,

We have a long-running PySpark session in client mode that
occasionally dies.

We'd like to check whether the session is still alive. One solution we came
up with was checking whether the UI is still up, but we were wondering if
there's maybe an easier way than that.
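For reference, a minimal sketch of the UI-based check mentioned above, using only the Python standard library. It assumes the driver's UI is reachable on the default port 4040 and probes the REST endpoint Spark exposes under /api/v1; the function name and defaults are ours, not an official API:

```python
import urllib.request
import urllib.error


def spark_ui_alive(host="localhost", port=4040, timeout=5.0):
    """Return True if the Spark UI answers an HTTP request.

    Probes the /api/v1/applications endpoint served by the driver UI.
    A connection refused / timeout is treated as "session is dead".
    """
    url = f"http://{host}:{port}/api/v1/applications"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

This only tells you the driver process is serving HTTP, not that executors are healthy, so it is a liveness heuristic rather than a full health check.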

Maybe something like spark.getActiveSession() could serve the same purpose. I
noticed that it throws a connection refused error once the Spark session dies.

Are there any official/suggested ways to check this? I couldn't find much
in the docs/previous mailing lists.

Kind regards,
Yeachan