Posted to reviews@spark.apache.org by scwf <gi...@git.apache.org> on 2014/04/03 11:26:35 UTC

[GitHub] spark pull request: method getAllPools in SC throws NPE

GitHub user scwf opened a pull request:

    https://github.com/apache/spark/pull/312

    method getAllPools in SC throws NPE

    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/wangfei/spark0403/spark-master/examples/target/scala-2.10/spark-examples_2.10-assembly-1.0.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/wangfei/spark0403/spark-master/assembly/target/scala-2.10/spark-assembly_2.10-1.0.0-SNAPSHOT-hadoop2.3.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    14/04/03 16:43:19 INFO SecurityManager: SecurityManager, is authentication enabled: false are ui acls enabled: false users with view permissions: Set(root)
    14/04/03 16:43:19 INFO Slf4jLogger: Slf4jLogger started
    14/04/03 16:43:19 INFO Remoting: Starting remoting
    14/04/03 16:43:20 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@VM-13:55231]
    14/04/03 16:43:20 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@VM-13:55231]
    14/04/03 16:43:20 INFO SparkEnv: Registering BlockManagerMaster
    14/04/03 16:43:20 INFO DiskBlockManager: Created local directory at /tmp/spark-local-20140403164320-b637
    14/04/03 16:43:20 INFO MemoryStore: MemoryStore started with capacity 1141.5 MB.
    14/04/03 16:43:20 INFO ConnectionManager: Bound socket to port 33813 with id = ConnectionManagerId(VM-13,33813)
    14/04/03 16:43:20 INFO BlockManagerMaster: Trying to register BlockManager
    14/04/03 16:43:20 INFO BlockManagerInfo: Registering block manager VM-13:33813 with 1141.5 MB RAM
    14/04/03 16:43:20 INFO BlockManagerMaster: Registered BlockManager
    14/04/03 16:43:20 INFO HttpServer: Starting HTTP Server
    14/04/03 16:43:21 INFO HttpBroadcast: Broadcast server started at http://9.91.11.32:58746
    14/04/03 16:43:21 INFO SparkEnv: Registering MapOutputTracker
    14/04/03 16:43:21 INFO HttpFileServer: HTTP File server directory is /tmp/spark-34b290f1-ca2f-489e-93c5-1c3e860874b0
    14/04/03 16:43:21 INFO HttpServer: Starting HTTP Server
    14/04/03 16:43:22 INFO SparkUI: Started Spark Web UI at http://VM-13:4040
    14/04/03 16:43:22 WARN ServletHandler: /stages/
    java.lang.NullPointerException
            at org.apache.spark.SparkContext.getAllPools(SparkContext.scala:712)
            at org.apache.spark.ui.jobs.IndexPage.render(IndexPage.scala:50)
            at org.apache.spark.ui.jobs.JobProgressUI$$anonfun$getHandlers$3.apply(JobProgressUI.scala:58)
            at org.apache.spark.ui.jobs.JobProgressUI$$anonfun$getHandlers$3.apply(JobProgressUI.scala:58)
            at org.apache.spark.ui.JettyUtils$$anon$1.doGet(JettyUtils.scala:68)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
            at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
            at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
            at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501)
            at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
            at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428)
            at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
            at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
            at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
            at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
            at org.eclipse.jetty.server.Server.handle(Server.java:370)
            at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
            at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:971)
            at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1033)
            at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:644)
            at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
            at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
            at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667)
            at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
            at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
            at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
            at java.lang.Thread.run(Thread.java:662)
    
    This is because the rootPool method is not implemented in TaskSchedulerImpl.
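    The failure mode can be reproduced in isolation. The sketch below uses hypothetical stand-in names (DemoPool, allPoolsUnsafe, allPoolsSafe), not Spark's actual classes: a caller that dereferences a null root pool throws the same NPE as the stack trace above, while a null-guarded accessor degrades gracefully.

    ```scala
    import scala.collection.mutable.ArrayBuffer

    // Hypothetical stand-ins for illustration only; not Spark's actual classes.
    class DemoPool(val poolName: String) {
      val schedulableQueue = new ArrayBuffer[DemoPool]()
    }

    object PoolDemo {
      // Mirrors the failing call: dereferencing a null rootPool throws an NPE.
      def allPoolsUnsafe(rootPool: DemoPool): Seq[DemoPool] =
        rootPool.schedulableQueue.toSeq

      // A null-guarded variant that returns an empty result instead.
      def allPoolsSafe(rootPool: DemoPool): Seq[DemoPool] =
        Option(rootPool).map(_.schedulableQueue.toSeq).getOrElse(Seq.empty)

      def main(args: Array[String]): Unit = {
        val uninitialized: DemoPool = null
        try {
          allPoolsUnsafe(uninitialized)
        } catch {
          case _: NullPointerException =>
            println("NPE, matching the stack trace above")
        }
        println(allPoolsSafe(uninitialized).size) // prints 0
      }
    }
    ```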

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/scwf/spark patch-1

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/312.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #312
    
----
commit 832741124a69a9f3993921f415cb3c93c52de001
Author: wangfei <wa...@huawei.com>
Date:   2014-04-03T09:24:09Z

    method getAllPools in SC throws NPE 
    

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] spark pull request: method getAllPools in SC throws NPE

Posted by scwf <gi...@git.apache.org>.
Github user scwf commented on the pull request:

    https://github.com/apache/spark/pull/312#issuecomment-39431524
  
    TaskSchedulerImpl should override the rootPool method
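    The shape of that suggestion can be sketched as follows, with hypothetical names (TaskSchedulerLike, SchedulerImpl, SchedPool) rather than the real TaskScheduler/TaskSchedulerImpl sources: the trait declares an abstract rootPool, and the implementation provides a concrete override so callers never observe null.

    ```scala
    // Hypothetical shape of the suggested fix; the real TaskScheduler /
    // TaskSchedulerImpl differ in detail.
    class SchedPool(val name: String)

    trait TaskSchedulerLike {
      def rootPool: SchedPool // abstract in the trait
    }

    class SchedulerImpl extends TaskSchedulerLike {
      // Concrete override, so UI callers never observe a null pool.
      override val rootPool: SchedPool = new SchedPool("root")
    }

    object OverrideDemo {
      def main(args: Array[String]): Unit = {
        val sched: TaskSchedulerLike = new SchedulerImpl
        println(sched.rootPool.name) // prints "root"
      }
    }
    ```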



[GitHub] spark pull request: method getAllPools in SC throws NPE

Posted by scwf <gi...@git.apache.org>.
Github user scwf commented on the pull request:

    https://github.com/apache/spark/pull/312#issuecomment-39434966
  
    I was wrong; I think this is an ordering issue: it is a matter of the order in which the UI and the taskScheduler are started in SparkContext.
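    The suspected initialization-order hazard can be sketched abstractly (hypothetical names DemoContext, DemoScheduler, renderStages; not the actual SparkContext wiring): if the UI can serve a request before the scheduler field is assigned, that request dereferences null.

    ```scala
    // Hypothetical sketch of an initialization-order hazard; not the
    // actual SparkContext wiring.
    class DemoScheduler { val rootPoolName = "root" }

    class DemoContext {
      var scheduler: DemoScheduler = _ // null until startScheduler() runs

      // Stands in for the /stages/ render path: NPEs while scheduler is null.
      def renderStages(): String =
        s"pools under ${scheduler.rootPoolName}"

      def startScheduler(): Unit = { scheduler = new DemoScheduler }
    }

    object OrderDemo {
      def main(args: Array[String]): Unit = {
        val ctx = new DemoContext
        // A request arriving here, before startScheduler(), would NPE.
        ctx.startScheduler()
        println(ctx.renderStages()) // prints "pools under root"
      }
    }
    ```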



[GitHub] spark pull request: method getAllPools in SC throws NPE

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/312#issuecomment-39529689
  
    I don't think this is an ordering issue. `getAllPools` is only called if you render the page, and by the time `render` is called the task scheduler should already be loaded and initialized.
    
    How did you produce this NPE? (Also, in the future could you add an accompanying JIRA [here](https://issues.apache.org/jira/browse/SPARK)?)



[GitHub] spark pull request: method getAllPools in SC throws NPE

Posted by scwf <gi...@git.apache.org>.
Github user scwf closed the pull request at:

    https://github.com/apache/spark/pull/312



[GitHub] spark pull request: method getAllPools in SC throws NPE

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/312#issuecomment-39431227
  
    Can one of the admins verify this patch?



[GitHub] spark pull request: method getAllPools in SC throws NPE

Posted by scwf <gi...@git.apache.org>.
Github user scwf commented on the pull request:

    https://github.com/apache/spark/pull/312#issuecomment-39522217
  
    Patch updated.

