Posted to reviews@spark.apache.org by tanyatik <gi...@git.apache.org> on 2014/08/20 20:21:58 UTC

[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

GitHub user tanyatik opened a pull request:

    https://github.com/apache/spark/pull/2062

    [SPARK-3150] Fix NullPointerException in Spark recovery: initialize default values in DriverInfo.init()

    The issue happens when Spark runs in standalone mode on a cluster.
    When the master and the driver fail simultaneously on one node, the master tries to recover its state and restart the Spark driver.
    While restarting the driver, the master fails with a NullPointerException (stack trace below).
    After failing, it restarts, tries to recover its state, and restarts the Spark driver again, over and over in an infinite cycle.
    Specifically, Spark reads the DriverInfo state back from ZooKeeper, but after deserialization DriverInfo.worker is null.
    
    https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20created%3E%3D-1w%20ORDER%20BY%20created%20DESC
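
    As a self-contained sketch of the technique the patch applies (a hypothetical Record class, not the actual DriverInfo source): transient defaults are set in a private init(), which is called both at construction time and from a custom readObject hook after defaultReadObject(), so a deserialized instance sees None instead of null.

        import java.io.ObjectInputStream

        // Hypothetical class illustrating the re-initialization pattern.
        class Record extends Serializable {
          @transient var worker: Option[String] = None  // mirrors the shape of DriverInfo.worker, simplified to Option[String]

          init()  // establish defaults at construction time

          // Java serialization calls this hook (instead of the constructor) when reading.
          private def readObject(in: ObjectInputStream): Unit = {
            in.defaultReadObject()
            init()  // re-establish transient defaults, so worker is None rather than null
          }

          private def init(): Unit = {
            worker = None
          }
        }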

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/tanyatik/spark spark-3150

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/2062.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2062
    
----
commit 9936043f26937887a23aa01d340c34ac71a51673
Author: Tatiana Borisova <ta...@yandex.ru>
Date:   2014-08-20T18:17:45Z

    Add initializing default values in DriverInfo.init()

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes it to be, or if the feature is enabled but not working,
please contact infrastructure at infrastructure@apache.org or file a JIRA
ticket with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2062#discussion_r16687440
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/master/DriverInfo.scala ---
    @@ -33,4 +33,17 @@ private[spark] class DriverInfo(
       @transient var exception: Option[Exception] = None
       /* Most recent worker assigned to this driver */
       @transient var worker: Option[WorkerInfo] = None
    +
    +  init()
    +
    +  private def readObject(in: java.io.ObjectInputStream): Unit = {
    +    in.defaultReadObject()
    +    init()
    +  }
    +
    +  private def init(): Unit = {
    +    state = DriverState.SUBMITTED
    +    worker = None
    +    exception = None
    +  }
    --- End diff --
    
    Ah I see, thanks for the context.


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/2062#issuecomment-52829752
  
    Jenkins, this is ok to test.


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by tanyatik <gi...@git.apache.org>.
Github user tanyatik commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2062#discussion_r16541679
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/master/DriverInfo.scala ---
    @@ -33,4 +33,17 @@ private[spark] class DriverInfo(
       @transient var exception: Option[Exception] = None
       /* Most recent worker assigned to this driver */
       @transient var worker: Option[WorkerInfo] = None
    +
    +  init()
    +
    +  private def readObject(in: java.io.ObjectInputStream): Unit = {
    +    in.defaultReadObject()
    +    init()
    +  }
    +
    +  private def init(): Unit = {
    +    state = DriverState.SUBMITTED
    +    worker = None
    +    exception = None
    +  }
    --- End diff --
    
    A similar approach is used in WorkerInfo.


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by tanyatik <gi...@git.apache.org>.
Github user tanyatik commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2062#discussion_r16532536
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/master/DriverInfo.scala ---
    @@ -33,4 +33,17 @@ private[spark] class DriverInfo(
       @transient var exception: Option[Exception] = None
       /* Most recent worker assigned to this driver */
       @transient var worker: Option[WorkerInfo] = None
    +
    +  init()
    +
    +  private def readObject(in: java.io.ObjectInputStream): Unit = {
    +    in.defaultReadObject()
    +    init()
    +  }
    +
    +  private def init(): Unit = {
    +    state = DriverState.SUBMITTED
    +    worker = None
    +    exception = None
    +  }
    --- End diff --
    
    When the object is deserialized, transient fields come back as null, not None; that's why the NullPointerException happens.
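
    A minimal sketch reproducing this behavior with a hypothetical Holder class (not Spark code): after a plain Java serialization round trip, the @transient Option field is null rather than None.

        import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

        class Holder extends Serializable {
          @transient var worker: Option[String] = None  // initializer runs only at construction time
        }

        object TransientNullDemo extends App {
          // Serialize an instance, then read it back.
          val buffer = new ByteArrayOutputStream()
          val out = new ObjectOutputStream(buffer)
          out.writeObject(new Holder)
          out.close()

          val in = new ObjectInputStream(new ByteArrayInputStream(buffer.toByteArray))
          val restored = in.readObject().asInstanceOf[Holder]

          // Transient fields are skipped by serialization and the field initializer
          // does not run on the read side, so the field comes back as null.
          println(restored.worker eq null)   // true
          // restored.worker.isDefined       // would throw NullPointerException
        }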


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2062#issuecomment-52837741
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18983/consoleFull) for PR 2062 at commit [`9936043`](https://github.com/apache/spark/commit/9936043f26937887a23aa01d340c34ac71a51673).
     * This patch **passes** unit tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2062#issuecomment-52830617
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18983/consoleFull) for PR 2062 at commit [`9936043`](https://github.com/apache/spark/commit/9936043f26937887a23aa01d340c34ac71a51673).
     * This patch merges cleanly.


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/2062


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by tanyatik <gi...@git.apache.org>.
Github user tanyatik commented on the pull request:

    https://github.com/apache/spark/pull/2062#issuecomment-53388890
  
    Yes, I did; this patch fixes the NPE and Spark restarts successfully.


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2062#issuecomment-52819930
  
    Can one of the admins verify this patch?


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2062#discussion_r16507237
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/master/DriverInfo.scala ---
    @@ -33,4 +33,17 @@ private[spark] class DriverInfo(
       @transient var exception: Option[Exception] = None
       /* Most recent worker assigned to this driver */
       @transient var worker: Option[WorkerInfo] = None
    +
    +  init()
    +
    +  private def readObject(in: java.io.ObjectInputStream): Unit = {
    +    in.defaultReadObject()
    +    init()
    +  }
    +
    +  private def init(): Unit = {
    +    state = DriverState.SUBMITTED
    +    worker = None
    +    exception = None
    +  }
    --- End diff --
    
    I'm not sure I understand the intention here. These fields are all transient and won't actually be serialized, so why do we need to set them every time we `readObject`?


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/2062#issuecomment-53762494
  
    This looks good to me, too.  I'm merging this into `master`, `branch-1.1`, and `branch-1.0`.  Thanks!


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/2062#issuecomment-53351589
  
    @tanyatik Have you verified that this fixes the NPE you ran into? If so, this LGTM.


[GitHub] spark pull request: [SPARK-3150] Fix NullPointerException in Sp...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/2062#issuecomment-52848628
  
    FWIW, we don't have any Jenkins tests for ZooKeeper-based multi-master fault tolerance. There's a Docker-based set of integration tests in `FaultToleranceTest.scala`, so maybe we could add a test case there.
    
    We should create a JIRA for proper automated testing of this, including a sub-task to add a regression test for this issue.

