Posted to issues@spark.apache.org by "Omar Padron (JIRA)" <ji...@apache.org> on 2015/07/23 19:14:04 UTC

[jira] [Commented] (SPARK-9279) Spark Master Refuses to Bind WebUI to a Privileged Port

    [ https://issues.apache.org/jira/browse/SPARK-9279?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14639150#comment-14639150 ] 

Omar Padron commented on SPARK-9279:
------------------------------------

bq. This has nothing to do with Spark. Any Linux-like OS requires root privileges for any process to bind to a port under 1024.

Am I missing something?  

bq. When trying to start a spark master server as root...

bq. ... Users choosing to run spark as root should be allowed to choose their own ports.

I've mentioned twice that I am running spark as root, including in the very first sentence of the bug description.

In case I was not clear: I am reporting an issue with the port selection logic that is *explicitly coded* into spark.  This is not a matter of lacking access to privileged ports, but of spark explicitly refusing to bind to them.  This has everything to do with spark.

See:
https://github.com/apache/spark/blob/c032b0bf92130dc4facb003f0deaeb1228aefded/core/src/main/scala/org/apache/spark/util/Utils.scala#L2019
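
For reference, here is a minimal sketch of the guard in question, reconstructed from the "requirement failed" message and the stack trace below.  The object and method names are illustrative only, not the verbatim Utils.startServiceOnPort source from 1.4.1:
{code}
// Sketch of the port-range check implied by the error message;
// names here (PortGuardSketch, checkStartPort) are assumptions.
object PortGuardSketch {
  def checkStartPort(startPort: Int): Unit = {
    require(startPort == 0 || (startPort >= 1024 && startPort <= 65535),
      "startPort should be between 1024 and 65535 (inclusive), or 0 for a random free port.")
  }

  def main(args: Array[String]): Unit = {
    checkStartPort(8080) // passes
    checkStartPort(80)   // throws IllegalArgumentException, even when the process runs as root
  }
}
{code}
The check rejects any port below 1024 before ever attempting the bind, so the OS-level privilege of the calling user never enters into it.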

> Spark Master Refuses to Bind WebUI to a Privileged Port
> -------------------------------------------------------
>
>                 Key: SPARK-9279
>                 URL: https://issues.apache.org/jira/browse/SPARK-9279
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.1
>         Environment: Ubuntu Trusty running in a docker container
>            Reporter: Omar Padron
>            Priority: Minor
>
> When trying to start a spark master server as root...
> {code}
> export SPARK_MASTER_PORT=7077
> export SPARK_MASTER_WEBUI_PORT=80
> spark-class org.apache.spark.deploy.master.Master \
>     --host "$( hostname )" \
>     --port "$SPARK_MASTER_PORT" \
>     --webui-port "$SPARK_MASTER_WEBUI_PORT"
> {code}
> The process terminates with an IllegalArgumentException: "requirement failed: startPort should be between 1024 and 65535 (inclusive), or 0 for a random free port."
> But when SPARK_MASTER_WEBUI_PORT=8080 (or anything in the 1024-65535 range), the process runs fine.
> I do not understand why the usable ports have been arbitrarily restricted to the non-privileged range.  Users choosing to run spark as root should be allowed to choose their own ports.
> Full output from a sample run below:
> {code}
> 2015-07-23 14:36:50,892 INFO  [main] master.Master (SignalLogger.scala:register(47)) - Registered signal handlers for [TERM, HUP, INT]
> 2015-07-23 14:36:51,399 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 2015-07-23 14:36:51,586 INFO  [main] spark.SecurityManager (Logging.scala:logInfo(59)) - Changing view acls to: root
> 2015-07-23 14:36:51,587 INFO  [main] spark.SecurityManager (Logging.scala:logInfo(59)) - Changing modify acls to: root
> 2015-07-23 14:36:51,588 INFO  [main] spark.SecurityManager (Logging.scala:logInfo(59)) - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
> 2015-07-23 14:36:52,295 INFO  [sparkMaster-akka.actor.default-dispatcher-2] slf4j.Slf4jLogger (Slf4jLogger.scala:applyOrElse(80)) - Slf4jLogger started
> 2015-07-23 14:36:52,349 INFO  [sparkMaster-akka.actor.default-dispatcher-2] Remoting (Slf4jLogger.scala:apply$mcV$sp(74)) - Starting remoting
> 2015-07-23 14:36:52,489 INFO  [sparkMaster-akka.actor.default-dispatcher-2] Remoting (Slf4jLogger.scala:apply$mcV$sp(74)) - Remoting started; listening on addresses :[akka.tcp://sparkMaster@sparkmaster:7077]
> 2015-07-23 14:36:52,497 INFO  [main] util.Utils (Logging.scala:logInfo(59)) - Successfully started service 'sparkMaster' on port 7077.
> 2015-07-23 14:36:52,717 INFO  [sparkMaster-akka.actor.default-dispatcher-4] server.Server (Server.java:doStart(272)) - jetty-8.y.z-SNAPSHOT
> 2015-07-23 14:36:52,759 INFO  [sparkMaster-akka.actor.default-dispatcher-4] server.AbstractConnector (AbstractConnector.java:doStart(338)) - Started SelectChannelConnector@sparkmaster:6066
> 2015-07-23 14:36:52,759 INFO  [sparkMaster-akka.actor.default-dispatcher-4] util.Utils (Logging.scala:logInfo(59)) - Successfully started service on port 6066.
> 2015-07-23 14:36:52,760 INFO  [sparkMaster-akka.actor.default-dispatcher-4] rest.StandaloneRestServer (Logging.scala:logInfo(59)) - Started REST server for submitting applications on port 6066
> 2015-07-23 14:36:52,765 INFO  [sparkMaster-akka.actor.default-dispatcher-4] master.Master (Logging.scala:logInfo(59)) - Starting Spark master at spark://sparkmaster:7077
> 2015-07-23 14:36:52,766 INFO  [sparkMaster-akka.actor.default-dispatcher-4] master.Master (Logging.scala:logInfo(59)) - Running Spark version 1.4.1
> 2015-07-23 14:36:52,772 ERROR [sparkMaster-akka.actor.default-dispatcher-4] ui.MasterWebUI (Logging.scala:logError(96)) - Failed to bind MasterWebUI
> java.lang.IllegalArgumentException: requirement failed: startPort should be between 1024 and 65535 (inclusive), or 0 for a random free port.
>         at scala.Predef$.require(Predef.scala:233)
>         at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1977)
>         at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:238)
>         at org.apache.spark.ui.WebUI.bind(WebUI.scala:117)
>         at org.apache.spark.deploy.master.Master.preStart(Master.scala:144)
>         at akka.actor.Actor$class.aroundPreStart(Actor.scala:470)
>         at org.apache.spark.deploy.master.Master.aroundPreStart(Master.scala:52)
>         at akka.actor.ActorCell.create(ActorCell.scala:580)
>         at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
>         at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>         at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>         at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> 2015-07-23 14:36:52,778 INFO  [Thread-1] util.Utils (Logging.scala:logInfo(59)) - Shutdown hook called
> {code}


