Posted to user@spark.apache.org by Dhimant <dh...@gmail.com> on 2014/09/04 11:58:28 UTC

Multiple spark shell sessions

Hi,
I am receiving the following error while connecting to the Spark server via the
shell if one shell is already open.
How can I open multiple sessions?

Does anyone know of a workflow engine/job server for Spark, like Apache Oozie?

/
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.0.2
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java
1.7.0_60)
Type in expressions to have them evaluated.
Type :help for more information.
14/09/04 15:07:46 INFO spark.SecurityManager: Changing view acls to: root
14/09/04 15:07:46 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(root)
14/09/04 15:07:46 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/09/04 15:07:47 INFO Remoting: Starting remoting
14/09/04 15:07:47 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://spark@sparkmaster.guavus.com:42236]
14/09/04 15:07:47 INFO Remoting: Remoting now listens on addresses:
[akka.tcp://spark@sparkmaster.guavus.com:42236]
14/09/04 15:07:47 INFO spark.SparkEnv: Registering MapOutputTracker
14/09/04 15:07:47 INFO spark.SparkEnv: Registering BlockManagerMaster
14/09/04 15:07:47 INFO storage.DiskBlockManager: Created local directory at
/tmp/spark-local-20140904150747-4dcd
14/09/04 15:07:47 INFO storage.MemoryStore: MemoryStore started with
capacity 294.9 MB.
14/09/04 15:07:47 INFO network.ConnectionManager: Bound socket to port 54453
with id = ConnectionManagerId(sparkmaster.guavus.com,54453)
14/09/04 15:07:47 INFO storage.BlockManagerMaster: Trying to register
BlockManager
14/09/04 15:07:47 INFO storage.BlockManagerInfo: Registering block manager
sparkmaster.guavus.com:54453 with 294.9 MB RAM
14/09/04 15:07:47 INFO storage.BlockManagerMaster: Registered BlockManager
14/09/04 15:07:47 INFO spark.HttpServer: Starting HTTP Server
14/09/04 15:07:47 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/09/04 15:07:47 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:48977
14/09/04 15:07:47 INFO broadcast.HttpBroadcast: Broadcast server started at
http://192.168.1.21:48977
14/09/04 15:07:47 INFO spark.HttpFileServer: HTTP File server directory is
/tmp/spark-0e45759a-2c58-439a-8e96-95b0bc1d6136
14/09/04 15:07:47 INFO spark.HttpServer: Starting HTTP Server
14/09/04 15:07:47 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/09/04 15:07:47 INFO server.AbstractConnector: Started
SocketConnector@0.0.0.0:39962
14/09/04 15:07:48 INFO server.Server: jetty-8.y.z-SNAPSHOT
14/09/04 15:07:48 WARN component.AbstractLifeCycle: FAILED
SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already
in use
java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:444)
        at sun.nio.ch.Net.bind(Net.java:436)
        at
sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at
org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
        at
org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
        at
org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
        at
org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.eclipse.jetty.server.Server.doStart(Server.java:293)
        at
org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at
org.apache.spark.ui.JettyUtils$$anonfun$1.apply$mcV$sp(JettyUtils.scala:192)
        at
org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
        at
org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.spark.ui.JettyUtils$.connect$1(JettyUtils.scala:191)
        at
org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:205)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:99)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:223)
        at
org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:957)
        at $line3.$read$$iwC$$iwC.<init>(<console>:8)
        at $line3.$read$$iwC.<init>(<console>:14)
        at $line3.$read.<init>(<console>:16)
        at $line3.$read$.<init>(<console>:20)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.<init>(<console>:7)
        at $line3.$eval$.<clinit>(<console>)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
        at
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
        at
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
        at
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
        at
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
        at
org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
        at
org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
        at
org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
        at
org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
        at
org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
        at
org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
        at
org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
        at
org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at
org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
14/09/04 15:07:48 WARN component.AbstractLifeCycle: FAILED
org.eclipse.jetty.server.Server@20a88f61: java.net.BindException: Address
already in use
java.net.BindException: Address already in use
        (stack trace identical to the SelectChannelConnector failure above)
/
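
The WARN lines in the log above boil down to an OS-level bind conflict: the
second shell's web UI tries to listen on port 4040 while the first shell still
holds it. A minimal sketch of that failure with plain Python sockets (ports are
OS-assigned here; this is not Spark's actual code):

```python
import errno
import socket

# First "shell" binds a listening socket; port 0 lets the OS pick a free port.
first = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
first.bind(("127.0.0.1", 0))
first.listen(1)
port = first.getsockname()[1]

# A second bind to the same port fails just like Spark's Jetty server does.
second = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    second.bind(("127.0.0.1", port))
    raised = None
except OSError as e:
    raised = e.errno  # "Address already in use"

assert raised == errno.EADDRINUSE
first.close()
second.close()
```

As the replies below note, Spark treats this as non-fatal and moves the UI to
another port.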



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Multiple-spark-shell-sessions-tp13441.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Multiple spark shell sessions

Posted by Andrew Ash <an...@andrewash.com>.
Hi Dhimant,

We also cleaned up these needless warnings on port failover in Spark 1.1 --
see https://issues.apache.org/jira/browse/SPARK-1902

Andrew


On Thu, Sep 4, 2014 at 7:38 AM, Dhimant <dh...@gmail.com> wrote:

> [quoted reply snipped]

Re: Multiple spark shell sessions

Posted by Dhimant <dh...@gmail.com>.
Thanks Yana,
I am able to execute applications and commands via another session; I also
received another port for the UI.

Thanks,
Dhimant



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Multiple-spark-shell-sessions-tp13441p13459.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.



Re: Multiple spark shell sessions

Posted by Yana Kadiyska <ya...@gmail.com>.
These are just warnings from the web server. Normally your application will
have a UI page on port 4040. In your case, a little after the warning it
should bind just fine to another port (mine picked 4041). I'm running 0.9.1.
Do you actually see the application failing? The main thing when running more
than one app against the master is to make sure you have enough resources for
each (i.e., neither grabs too many cores or too much memory). You can see
whether your app successfully connected by looking at the master UI page.
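
The failover described here — try the configured port and, on "Address already
in use", retry the next one — can be sketched with plain sockets. This is a
simplified, hypothetical stand-in for what Spark's JettyUtils does, not the
actual Jetty code:

```python
import errno
import socket

def bind_with_failover(start_port: int, max_retries: int = 16) -> socket.socket:
    """Try start_port, start_port + 1, ... until a bind succeeds."""
    for offset in range(max_retries):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind(("127.0.0.1", start_port + offset))
            s.listen(1)
            return s  # bound successfully on this port
        except OSError as e:
            s.close()
            if e.errno != errno.EADDRINUSE:
                raise  # only retry on "Address already in use"
    raise OSError("no free port found in range")

# First "shell" grabs an OS-assigned free port...
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
probe.listen(1)
base = probe.getsockname()[1]

# ...and the second one fails over past it, as the UI does (4040 -> 4041).
ui = bind_with_failover(base)
ui_port = ui.getsockname()[1]
assert base < ui_port < base + 16
probe.close()
ui.close()
```

In practice one can also pin the second shell's UI port explicitly via the
`spark.ui.port` configuration property instead of relying on failover (the
exact way to pass it varies by Spark version).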


On Thu, Sep 4, 2014 at 5:58 AM, Dhimant <dh...@gmail.com> wrote:

> [original message snipped; see the full post above]