Posted to user@spark.apache.org by Vladimir Tretyakov <vl...@sematext.com> on 2016/10/04 18:44:01 UTC

Re: Spark metrics when running with YARN?

Hi,

When I start Spark v1.6 (cdh5.8.0) in YARN cluster mode, the REST API is not
available on port 4040 (http://localhost:4040/api/v1/applications does not respond).

I started the Spark application like this:

spark-submit \
    --master yarn-cluster \
    --class org.apache.spark.examples.SparkPi \
    /usr/lib/spark/examples/lib/spark-examples-1.6.0-cdh5.8.0-hadoop2.6.0-cdh5.8.0.jar \
    10000


Nothing on port 4040:
telnet 4040
Trying 0.0.15.200...
telnet: Unable to connect to remote host: Invalid argument

Any idea why that is?

I've checked the code, and it looks like in YARN cluster mode the port is set to 0:

if (isClusterMode) {
  // Set the web ui port to be ephemeral for yarn so we don't conflict with
  // other spark processes running on the same box
  System.setProperty("spark.ui.port", "0")
...
}
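
The only workaround I see so far is to ask YARN where the UI actually ended up,
via the application's tracking URL (the application id below is just a placeholder):

# Ask YARN for the application report; the Tracking-URL line points at the
# YARN web proxy, which forwards to the driver's ephemeral Spark UI port.
yarn application -status application_1475551234567_0001 | grep -i tracking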


Does that mean one cannot count on port 4040 always being available?

Is there any other universal way to get information about RUNNING Spark
applications?
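
For example, would something like this against the YARN ResourceManager REST API
be reliable? (Host names, ports and the application id below are just assumptions
for a default, unsecured setup.)

# 1) List RUNNING applications, assuming the ResourceManager web UI is on its
#    default port 8088 on this host.
curl -s "http://localhost:8088/ws/v1/cluster/apps?states=RUNNING&applicationTypes=SPARK"

# 2) The "trackingUrl" field in the JSON above goes through the YARN web proxy
#    to the driver's ephemeral UI port, so the Spark REST API should be reachable
#    at <trackingUrl>/api/v1/applications, e.g.:
curl -s "http://localhost:8088/proxy/application_1475551234567_0001/api/v1/applications"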

Best regards, Vladimir.