Posted to user@livy.apache.org by Ryan Quey <qu...@gmail.com> on 2021/06/15 05:33:12 UTC

Running Livy with external spark cluster

I am starting to use Kubernetes and want to run Livy within Kubernetes, but
keep my existing Spark cluster outside of it. I'm setting
`livy.spark.master` to point at my external Spark cluster, but I'm having
trouble with the SPARK_HOME env var. All the documentation I've seen seems
to assume Livy is co-located with Spark, and if I don't set SPARK_HOME, it
always fails with the same error:

`Exception in thread "main" java.lang.IllegalArgumentException: requirement
failed: SPARK_HOME path does not exist`
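For reference, this is roughly what my setup looks like (the master URL and
paths below are placeholders, not my real values):

`
# conf/livy.conf
livy.spark.master = spark://spark-master.example.com:7077
livy.spark.deploy-mode = client

# conf/livy-env.sh
# This is the part that fails: there is no Spark install inside the Livy pod
export SPARK_HOME=/opt/spark
`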

Is there a way to run Livy without having Spark installed on the same node?

-- 
Ryan Quey
Full Stack Data Engineer
LinkedIn <https://www.linkedin.com/in/ryan-quey/> | Github
<https://github.com/RyanQuey> | RyanQuey.com <https://www.ryanquey.com>