Posted to user@spark.apache.org by Divya Narayan <na...@gmail.com> on 2017/10/31 12:51:40 UTC

Spark job's application tracking URL not accessible from docker container

We have streaming jobs and batch jobs running inside Docker containers,
with the Spark driver launched within the container.

Now when we open the Resource Manager UI at http://<RM Node>:8080 and try to
access the application tracking URL of any running job, the page fails
with this error:

HTTP ERROR 500

Problem accessing /proxy/redirect/application_1509358290085_0011/jobs/.
Reason:

    Connection to http://<containerIP>:<spark ui port> refused


It works only when the container is launched on the node that is currently
the YARN RM master. On any other node it doesn't work.
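
The symptom (the proxy only works when the container runs on the RM node) suggests the YARN web proxy is dialing the Spark UI at the container-internal IP, which is routable only from the container's own host. A common workaround is to make the driver's UI reachable at a host address. A minimal sketch, assuming a bridged container and using hypothetical image/job names (`my-spark-image`, `my_job.py`) and a placeholder host IP that you must substitute for your environment:

```shell
# Option 1: host networking -- the driver binds directly to the host's
# interfaces, so the tracking URL the driver registers with YARN is routable.
docker run --network host my-spark-image \
  spark-submit --master yarn --deploy-mode client my_job.py

# Option 2: keep bridge networking, but advertise the host's address and
# publish the Spark UI port, so the YARN proxy connects to the host
# instead of the container-internal IP.
docker run -p 4040:4040 my-spark-image \
  spark-submit --master yarn --deploy-mode client \
    --conf spark.driver.bindAddress=0.0.0.0 \
    --conf spark.driver.host=<host IP> \
    --conf spark.ui.port=4040 \
    my_job.py
```

`spark.driver.bindAddress` (available since Spark 2.1) lets the driver listen on all container interfaces while `spark.driver.host` controls the address it advertises to the cluster; pinning `spark.ui.port` makes the `-p` port mapping deterministic.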

Re: Spark job's application tracking URL not accessible from docker container

Posted by Harsh <ta...@gmail.com>.
Hi 

I am facing the same issue when launching the application inside a Docker
container.


Kind Regards
Harsh




---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org