Posted to reviews@spark.apache.org by heary-cao <gi...@git.apache.org> on 2018/02/06 09:13:52 UTC

[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

GitHub user heary-cao opened a pull request:

    https://github.com/apache/spark/pull/20516

    [SPARK-23343][CORE][TEST] Increase the exception test for the bind port

    ## What changes were proposed in this pull request?
    
    This PR adds new test cases:
    1. a boundary-value test for port 65535
    2. a test that binding to a privileged port fails
    3. a test of rebinding a port when `spark.port.maxRetries` is set to 1
    4. a test of `Utils.userPort` wrapping around when generating the next port

    In addition, in the existing test case, if `spark.testing` is not set to true, the default value of `spark.port.maxRetries` is 16 rather than 100, so the `(expectedPort + 100)` bound is a small mistake.
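For reference, the retry-count selection described above can be sketched standalone; here a plain `Map` stands in for `SparkConf` (a simplification for illustration, not Spark's actual class):

```scala
// Sketch of the retry-count selection: under `spark.testing` the default
// number of bind retries is 100, otherwise 16; an explicit
// `spark.port.maxRetries` setting always wins. A plain Map stands in
// for SparkConf here.
def portMaxRetries(conf: Map[String, String]): Int = {
  val maxRetries = conf.get("spark.port.maxRetries").map(_.toInt)
  if (conf.contains("spark.testing")) maxRetries.getOrElse(100)
  else maxRetries.getOrElse(16)
}

val underTest = portMaxRetries(Map("spark.testing" -> "true"))
val default = portMaxRetries(Map.empty)
val explicit = portMaxRetries(Map("spark.port.maxRetries" -> "1"))
println(s"under test: $underTest, default: $default, explicit: $explicit")
```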
    
    ## How was this patch tested?
    
    Added new test cases.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/heary-cao/spark NettyBlockTransferServiceSuite

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20516.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20516
    
----
commit b276c02956dfc4d475b3e3b42657e10ceab2669e
Author: caoxuewen <ca...@...>
Date:   2018-02-06T08:51:57Z

    Increase the exception test for the bind port

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166507524
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,68 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to max specific ports is true") {
    +    service0 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +  }
    +
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    val excMsg = intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }.getMessage
    +
    +    assert(excMsg.contains("startPort should be between 1024 and 65535 (inclusive), " +
    +      "or 0 for a random free port."))
    +  }
    +
    +  test("can't to bind same port") {
    +    val port = 17634 + Random.nextInt(10000)
    +    logInfo("random port for test: " + port)
    +    service0 = createService(port)
    +    verifyServicePort(expectedPort = port, actualPort = service0.port)
    +
    +    service1 = createService(service0.port)
    +    // `service0.port` is occupied, so `service1.port` should not be `service0.port`
    +    verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
    +
    +    // bind already in use `service0.port`
    +    val excMsg = intercept[BindException] {
    +      val conf = new SparkConf()
    +        .set("spark.app.id", s"test-${getClass.getName}")
    +        .set("spark.testing", "true")
    +        .set("spark.port.maxRetries", "1")
    +
    +      val securityManager = new SecurityManager(conf)
    +      val blockDataManager = mock(classOf[BlockDataManager])
    +      val service = new NettyBlockTransferService(conf, securityManager, "localhost", "localhost",
    +        service0.port, 1)
    +      service.init(blockDataManager)
    +    }.getMessage
    +
    +    assert(excMsg.contains("Address already in use: bind"))
    --- End diff --
    
    It may deviate a bit from the test's purpose. What I meant is: with `spark.port.maxRetries` effectively turned off, binding a port that is already in use should fail. Thanks.
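The failure mode being tested (binding a port that is already in use, with no retries) can be reproduced with a plain `java.net.ServerSocket`, independent of Spark or Netty; a minimal sketch:

```scala
import java.net.{BindException, InetAddress, ServerSocket}

// Bind an ephemeral port, then attempt a second bind on the same port:
// with no retry logic the second bind fails with BindException, which is
// the behavior the maxRetries-disabled test exercises.
val localhost = InetAddress.getByName("localhost")
val first = new ServerSocket(0, 1, localhost) // 0 = pick a free port
val port = first.getLocalPort
val secondBindFailed =
  try {
    new ServerSocket(port, 1, localhost).close()
    false
  } catch {
    case _: BindException => true
  }
first.close()
println(s"second bind on port $port failed: $secondBindFailed")
```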


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166767674
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,53 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }
    +  }
    +
    +  test("turn off spark.port.maxRetries, bind repeat port is fail") {
    +    val port = 17634 + Random.nextInt(10000)
    +    logInfo("random port for test: " + port)
    +    service0 = createService(port)
    +
    +    // `service0.port` is occupied, bind repeat port throw BindException.
    +    intercept[BindException] {
    --- End diff --
    
    I still don't think this is something to test. You're basically testing Netty's API


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166348587
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,68 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to max specific ports is true") {
    +    service0 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +  }
    +
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    val excMsg = intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }.getMessage
    +
    +    assert(excMsg.contains("startPort should be between 1024 and 65535 (inclusive), " +
    --- End diff --
    
    It might be a little brittle to assert so much about the exact text of the message.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r167175502
  
    --- Diff: core/src/test/scala/org/apache/spark/SparkFunSuite.scala ---
    @@ -59,6 +59,7 @@ abstract class SparkFunSuite
       protected val enableAutoThreadAudit = true
     
       protected override def beforeAll(): Unit = {
    +    System.setProperty("spark.testing", "true")
    --- End diff --
    
    @cloud-fan, thanks, I have updated it. Could you help me test it? Thanks.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao closed the pull request at:

    https://github.com/apache/spark/pull/20516


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166767860
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,53 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }
    +  }
    +
    +  test("turn off spark.port.maxRetries, bind repeat port is fail") {
    +    val port = 17634 + Random.nextInt(10000)
    +    logInfo("random port for test: " + port)
    +    service0 = createService(port)
    +
    +    // `service0.port` is occupied, bind repeat port throw BindException.
    +    intercept[BindException] {
    +      val conf = new SparkConf()
    +        .set("spark.app.id", s"test-${getClass.getName}")
    +        .set("spark.testing", "true")
    +        .set("spark.port.maxRetries", "0")
    +
    +      val securityManager = new SecurityManager(conf)
    +      val blockDataManager = mock(classOf[BlockDataManager])
    +      val service = new NettyBlockTransferService(conf, securityManager, "localhost", "localhost",
    +        service0.port, 1)
    +      service.init(blockDataManager)
    +    }
    +  }
    +
       private def verifyServicePort(expectedPort: Int, actualPort: Int): Unit = {
         actualPort should be >= expectedPort
         // avoid testing equality in case of simultaneous tests
    +    // `spark.testing` is true,
         // the default value for `spark.port.maxRetries` is 100 under test
         actualPort should be <= (expectedPort + 100)
       }
     
       private def createService(port: Int): NettyBlockTransferService = {
         val conf = new SparkConf()
           .set("spark.app.id", s"test-${getClass.getName}")
    +      .set("spark.testing", "true")
    --- End diff --
    
    This seems OK, as does cleaning up closing of services, but I'm really not sure about the rest.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by jiangxb1987 <gi...@git.apache.org>.
Github user jiangxb1987 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166849307
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,53 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }
    +  }
    +
    +  test("turn off spark.port.maxRetries, bind repeat port is fail") {
    +    val port = 17634 + Random.nextInt(10000)
    +    logInfo("random port for test: " + port)
    +    service0 = createService(port)
    +
    +    // `service0.port` is occupied, bind repeat port throw BindException.
    +    intercept[BindException] {
    +      val conf = new SparkConf()
    +        .set("spark.app.id", s"test-${getClass.getName}")
    +        .set("spark.testing", "true")
    +        .set("spark.port.maxRetries", "0")
    +
    +      val securityManager = new SecurityManager(conf)
    +      val blockDataManager = mock(classOf[BlockDataManager])
    +      val service = new NettyBlockTransferService(conf, securityManager, "localhost", "localhost",
    +        service0.port, 1)
    +      service.init(blockDataManager)
    +    }
    +  }
    +
       private def verifyServicePort(expectedPort: Int, actualPort: Int): Unit = {
         actualPort should be >= expectedPort
         // avoid testing equality in case of simultaneous tests
    +    // `spark.testing` is true,
         // the default value for `spark.port.maxRetries` is 100 under test
         actualPort should be <= (expectedPort + 100)
       }
     
       private def createService(port: Int): NettyBlockTransferService = {
         val conf = new SparkConf()
           .set("spark.app.id", s"test-${getClass.getName}")
    +      .set("spark.testing", "true")
    --- End diff --
    
    +1


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    are you sure `./project/SparkBuild.scala:795: javaOptions in Test += "-Dspark.testing=1"` only affects the non-test code path? If so, we have a lot of places to fix.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166494159
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,68 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to max specific ports is true") {
    +    service0 = createService(port = 65535)
    --- End diff --
    
    Okay, I'll update it.


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    @jiangxb1987 thank you for the review.
    First, the `100` in the revised `(expectedPort + 100)` bound only holds after `spark.testing` is set.
    Second, this adds some boundary tests, such as port 65535, and exercises `Utils.userPort` wrapping around when the base port is 65535.
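The `Utils.userPort` wrap-around mentioned here keeps candidate ports inside the non-privileged range [1024, 65535]; a minimal sketch of such a formula (the exact expression is an assumption, inferred from the wrap-around behavior discussed in this thread, where the port after 65535 is 1024):

```scala
// Candidate user ports stay in [1024, 65535]; the port after 65535
// wraps back to 1024. The exact formula is an assumption based on the
// behavior described in this thread, not necessarily Spark's code.
def userPort(base: Int, offset: Int): Int =
  (base + offset - 1024) % (65536 - 1024) + 1024

println(userPort(65535, 1)) // wraps past the top of the range
println(userPort(17634, 3)) // normal case: just advances the port
```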



---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    @cloud-fan thank you for the suggestion.
    `./project/SparkBuild.scala:795: javaOptions in Test += "-Dspark.testing=1"` seems to take effect only when tests are run through the SBT build;
    it has no effect on SparkFunSuite unit tests run from an IDE. I updated this PR with a fix for it.
    Could you help me review it?


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    Can one of the admins verify this patch?


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166814031
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,53 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }
    +  }
    +
    +  test("turn off spark.port.maxRetries, bind repeat port is fail") {
    +    val port = 17634 + Random.nextInt(10000)
    +    logInfo("random port for test: " + port)
    +    service0 = createService(port)
    +
    +    // `service0.port` is occupied, bind repeat port throw BindException.
    +    intercept[BindException] {
    +      val conf = new SparkConf()
    +        .set("spark.app.id", s"test-${getClass.getName}")
    +        .set("spark.testing", "true")
    +        .set("spark.port.maxRetries", "0")
    +
    +      val securityManager = new SecurityManager(conf)
    +      val blockDataManager = mock(classOf[BlockDataManager])
    +      val service = new NettyBlockTransferService(conf, securityManager, "localhost", "localhost",
    +        service0.port, 1)
    +      service.init(blockDataManager)
    +    }
    +  }
    +
       private def verifyServicePort(expectedPort: Int, actualPort: Int): Unit = {
         actualPort should be >= expectedPort
         // avoid testing equality in case of simultaneous tests
    +    // `spark.testing` is true,
         // the default value for `spark.port.maxRetries` is 100 under test
         actualPort should be <= (expectedPort + 100)
       }
     
       private def createService(port: Int): NettyBlockTransferService = {
         val conf = new SparkConf()
           .set("spark.app.id", s"test-${getClass.getName}")
    +      .set("spark.testing", "true")
    --- End diff --
    
    @srowen,
    if `spark.testing` is not set to true, the default value of `spark.port.maxRetries` is 16 rather than 100,
    so in the `verifyServicePort` function, `actualPort should be <= (expectedPort + 100)` does not hold, because:

        def portMaxRetries(conf: SparkConf): Int = {
          val maxRetries = conf.getOption("spark.port.maxRetries").map(_.toInt)
          if (conf.contains("spark.testing")) {
            // Set a higher number of retries for tests...
            maxRetries.getOrElse(100)
          } else {
            maxRetries.getOrElse(16)
          }
        }

    Thanks.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r167133433
  
    --- Diff: core/src/test/scala/org/apache/spark/SparkFunSuite.scala ---
    @@ -59,6 +59,7 @@ abstract class SparkFunSuite
       protected val enableAutoThreadAudit = true
     
       protected override def beforeAll(): Unit = {
    +    System.setProperty("spark.testing", "true")
    --- End diff --
    
    why we need this?


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by jiangxb1987 <gi...@git.apache.org>.
Github user jiangxb1987 commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    Does this PR add any value or fix any bugs?


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    I tried the test case with Maven (from the command line) and it passes.
    So, should we add `System.setProperty("spark.testing", "true")` in SparkFunSuite to solve the IDE test-tool problem, similar to HiveSparkSubmitSuite, RPackageUtilsSuite, and SparkSubmitSuite? Thanks.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r167159549
  
    --- Diff: core/src/test/scala/org/apache/spark/SparkFunSuite.scala ---
    @@ -59,6 +59,7 @@ abstract class SparkFunSuite
       protected val enableAutoThreadAudit = true
     
       protected override def beforeAll(): Unit = {
    +    System.setProperty("spark.testing", "true")
    --- End diff --
    
    if we are already doing this, let's make it more explicit that we should remove `./project/SparkBuild.scala:795: javaOptions in Test += "-Dspark.testing=1"` and set `spark.testing` in `SparkFunSuite.beforeAll`.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r167156814
  
    --- Diff: core/src/test/scala/org/apache/spark/SparkFunSuite.scala ---
    @@ -59,6 +59,7 @@ abstract class SparkFunSuite
       protected val enableAutoThreadAudit = true
     
       protected override def beforeAll(): Unit = {
    +    System.setProperty("spark.testing", "true")
    --- End diff --
    
    Sorry let me make the question more clear.
    
    Why we need this if `./project/SparkBuild.scala:795: javaOptions in Test += "-Dspark.testing=1"` works?


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166350486
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,68 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to max specific ports is true") {
    +    service0 = createService(port = 65535)
    --- End diff --
    
    This is redundant with the test below.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166349531
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,68 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to max specific ports is true") {
    +    service0 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +  }
    +
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    val excMsg = intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }.getMessage
    +
    +    assert(excMsg.contains("startPort should be between 1024 and 65535 (inclusive), " +
    +      "or 0 for a random free port."))
    +  }
    +
    +  test("can't to bind same port") {
    +    val port = 17634 + Random.nextInt(10000)
    +    logInfo("random port for test: " + port)
    +    service0 = createService(port)
    +    verifyServicePort(expectedPort = port, actualPort = service0.port)
    --- End diff --
    
    I don't believe this part tests anything new?


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r167158512
  
    --- Diff: core/src/test/scala/org/apache/spark/SparkFunSuite.scala ---
    @@ -59,6 +59,7 @@ abstract class SparkFunSuite
       protected val enableAutoThreadAudit = true
     
       protected override def beforeAll(): Unit = {
    +    System.setProperty("spark.testing", "true")
    --- End diff --
    
    My debugging tool is IDEA, and the IDE does not seem to pick up the property set in the build.
    This is similar to HiveSparkSubmitSuite, RPackageUtilsSuite, and SparkSubmitSuite, which also manually add `System.setProperty("spark.testing", "true")`.
    Of course, when I run the test case with Maven (from the command line), it passes. Thanks.


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    Sure.
    Operation environment: the IDEA test runner.
    Test case: test("can bind to a specific port")
    Test code:

        val maxRetries = portMaxRetries(conf)
        println("maxRetries: " + maxRetries)

    Run result:

        maxRetries: 16

    If and only if `System.setProperty("spark.testing", "true")` is added in SparkFunSuite, the run result is:

        maxRetries: 100

    Thanks.
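The difference observed above comes down to a JVM system property; a quick standalone check of the mechanism (using a demo key rather than `spark.testing` itself, to avoid side effects):

```scala
// The testing flag is consulted as a JVM system property, so setting it
// process-wide (as a beforeAll hook would) makes it visible to any later
// lookup in the same JVM. A hypothetical demo key is used here.
val key = "spark.testing.demo"
val beforeSet = sys.props.contains(key)
System.setProperty(key, "true")
val afterSet = sys.props.contains(key)
println(s"visible before: $beforeSet, after: $afterSet")
```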



---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r167137766
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -2222,7 +2222,7 @@ private[spark] object Utils extends Logging {
        */
       def portMaxRetries(conf: SparkConf): Int = {
         val maxRetries = conf.getOption("spark.port.maxRetries").map(_.toInt)
    -    if (conf.contains("spark.testing")) {
    +    if (isTesting || conf.contains("spark.testing")) {
    --- End diff --
    
    Sorry, my understanding may have been one-sided. It is not only called from tests; but whenever we need the default value of `spark.port.maxRetries` to be 100, we still need to set `spark.testing`, or set the testing flag in the Spark unit tests. That is why I added it here. Thanks.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166767449
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,53 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    --- End diff --
    
    Don't all these tests need to close() the service they create?


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r167133408
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -2222,7 +2222,7 @@ private[spark] object Utils extends Logging {
        */
       def portMaxRetries(conf: SparkConf): Int = {
         val maxRetries = conf.getOption("spark.port.maxRetries").map(_.toInt)
    -    if (conf.contains("spark.testing")) {
    +    if (isTesting || conf.contains("spark.testing")) {
    --- End diff --
    
    shall we just call `isTesting` here?


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    Can you try with SBT (using the command line)? Usually we don't trust test results from an IDE.


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    @srowen please review it again. Thanks.


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    @srowen,
    If you don't set `spark.testing` to true, the default value for `spark.port.maxRetries` is not 100 but 16.
    So in the verifyServicePort function,
    actualPort should be <= (expectedPort + 100) is not true,
    because:
    def portMaxRetries(conf: SparkConf): Int = {
        val maxRetries = conf.getOption("spark.port.maxRetries").map(_.toInt)
        if (conf.contains("spark.testing")) {
          // Set a higher number of retries for tests...
          maxRetries.getOrElse(100)
        } else {
          maxRetries.getOrElse(16)
        }
      }
    Thanks.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166350354
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,68 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to max specific ports is true") {
    +    service0 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +  }
    +
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    val excMsg = intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }.getMessage
    +
    +    assert(excMsg.contains("startPort should be between 1024 and 65535 (inclusive), " +
    +      "or 0 for a random free port."))
    +  }
    +
    +  test("can't to bind same port") {
    +    val port = 17634 + Random.nextInt(10000)
    +    logInfo("random port for test: " + port)
    +    service0 = createService(port)
    +    verifyServicePort(expectedPort = port, actualPort = service0.port)
    +
    +    service1 = createService(service0.port)
    +    // `service0.port` is occupied, so `service1.port` should not be `service0.port`
    +    verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
    +
    +    // bind already in use `service0.port`
    +    val excMsg = intercept[BindException] {
    +      val conf = new SparkConf()
    +        .set("spark.app.id", s"test-${getClass.getName}")
    +        .set("spark.testing", "true")
    +        .set("spark.port.maxRetries", "1")
    +
    +      val securityManager = new SecurityManager(conf)
    +      val blockDataManager = mock(classOf[BlockDataManager])
    +      val service = new NettyBlockTransferService(conf, securityManager, "localhost", "localhost",
    +        service0.port, 1)
    +      service.init(blockDataManager)
    +    }.getMessage
    +
    +    assert(excMsg.contains("Address already in use: bind"))
    --- End diff --
    
    Hm, is this verifying that the port actually bound? that's much easier to test directly.
    Is this verifying `BindException` is thrown? not sure that's a behavior that we need to guarantee, even if it's almost certainly what the implementation will do.


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    Can one of the admins verify this patch?


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    @srowen, I think this is functionality Spark provides for port binding.
    First, a Spark user only needs to specify the start port and the retry range (via the `spark.port.maxRetries` setting); the actual port binding is generated automatically by Spark.
    Second, when a Spark user must bind the specified port exactly (by setting `spark.port.maxRetries = 0`), then once that port is already bound on the system, Spark will throw a bind exception.
    Thanks.
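
    The retry behavior described above depends on how the next candidate port is chosen. A hedged sketch of that wrap-around (this mirrors what `Utils.userPort` in Spark does, wrapping from 65535 back to the first non-privileged port, 1024):

    ```scala
    // Next candidate user port: wraps from 65535 back to 1024 so that
    // privileged ports (below 1024) are never tried on retry.
    def userPort(base: Int, offset: Int): Int =
      (base + offset - 1024) % (65536 - 1024) + 1024

    // This is why, in the tests above, a second service asked to bind 65535
    // ends up on 1024: the first retry wraps around the user-port range.
    println(userPort(65535, 1)) // 1024
    println(userPort(17634, 1)) // 17635
    ```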


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    @heary-cao I'm not sure how that addresses the question here, which is just about cleaning up opened resources in the test code. It's not directly related to your change, but affects code you are changing. The tests don't fail, it seems, just seems odd that the tests don't clean up the services they open.
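
    For reference, the cleanup being asked for can be sketched like this (a hypothetical helper for illustration; the real suite would close `service0`/`service1` in an `afterEach` block):

    ```scala
    import java.io.Closeable

    // Track every service a test opens, and close them all after each test,
    // swallowing close-time errors so cleanup never fails the test itself.
    var openedServices: List[Closeable] = Nil

    def register[T <: Closeable](service: T): T = {
      openedServices = service :: openedServices
      service
    }

    def closeAll(): Unit = {
      openedServices.foreach { s =>
        try s.close() catch { case _: Exception => () } // best-effort cleanup
      }
      openedServices = Nil
    }
    ```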


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166766796
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,53 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }
    +  }
    +
    +  test("turn off spark.port.maxRetries, bind repeat port is fail") {
    +    val port = 17634 + Random.nextInt(10000)
    +    logInfo("random port for test: " + port)
    +    service0 = createService(port)
    +
    +    // `service0.port` is occupied, bind repeat port throw BindException.
    +    intercept[BindException] {
    +      val conf = new SparkConf()
    +        .set("spark.app.id", s"test-${getClass.getName}")
    +        .set("spark.testing", "true")
    +        .set("spark.port.maxRetries", "0")
    +
    +      val securityManager = new SecurityManager(conf)
    +      val blockDataManager = mock(classOf[BlockDataManager])
    +      val service = new NettyBlockTransferService(conf, securityManager, "localhost", "localhost",
    +        service0.port, 1)
    +      service.init(blockDataManager)
    +    }
    +  }
    +
       private def verifyServicePort(expectedPort: Int, actualPort: Int): Unit = {
         actualPort should be >= expectedPort
         // avoid testing equality in case of simultaneous tests
    +    // `spark.testing` is true,
    --- End diff --
    
    Is this missing the word "if"?


---



[GitHub] spark issue #20516: [SPARK-23343][CORE][TEST] Increase the exception test fo...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/20516
  
    Now this no longer changes `spark.testing`, which was the main point here. I think this should just be closed; the purpose is changing and unclear here.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by heary-cao <gi...@git.apache.org>.
Github user heary-cao commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r167134346
  
    --- Diff: core/src/test/scala/org/apache/spark/SparkFunSuite.scala ---
    @@ -59,6 +59,7 @@ abstract class SparkFunSuite
       protected val enableAutoThreadAudit = true
     
       protected override def beforeAll(): Unit = {
    +    System.setProperty("spark.testing", "true")
    --- End diff --
    
    If this parameter is not set, `sys.props.contains("spark.testing")` is false.


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166767609
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,53 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }
    +  }
    +
    +  test("turn off spark.port.maxRetries, bind repeat port is fail") {
    +    val port = 17634 + Random.nextInt(10000)
    --- End diff --
    
    This seems to duplicate setup from other tests. Is it not possible to test several conditions in one test passage?


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r166848789
  
    --- Diff: core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferServiceSuite.scala ---
    @@ -77,16 +79,53 @@ class NettyBlockTransferServiceSuite
         verifyServicePort(expectedPort = service0.port + 1, actualPort = service1.port)
       }
     
    +  test("can bind to two max specific ports") {
    +    service0 = createService(port = 65535)
    +    service1 = createService(port = 65535)
    +    verifyServicePort(expectedPort = 65535, actualPort = service0.port)
    +    // see `Utils.userPort` the user port to try when trying to bind a service,
    +    // the max privileged port is 1024.
    +    verifyServicePort(expectedPort = 1024, actualPort = service1.port)
    +  }
    +
    +  test("can't bind to a privileged port") {
    +    intercept[IllegalArgumentException] {
    +      service0 = createService(port = 23)
    +    }
    +  }
    +
    +  test("turn off spark.port.maxRetries, bind repeat port is fail") {
    +    val port = 17634 + Random.nextInt(10000)
    +    logInfo("random port for test: " + port)
    +    service0 = createService(port)
    +
    +    // `service0.port` is occupied, bind repeat port throw BindException.
    +    intercept[BindException] {
    +      val conf = new SparkConf()
    +        .set("spark.app.id", s"test-${getClass.getName}")
    +        .set("spark.testing", "true")
    +        .set("spark.port.maxRetries", "0")
    +
    +      val securityManager = new SecurityManager(conf)
    +      val blockDataManager = mock(classOf[BlockDataManager])
    +      val service = new NettyBlockTransferService(conf, securityManager, "localhost", "localhost",
    +        service0.port, 1)
    +      service.init(blockDataManager)
    +    }
    +  }
    +
       private def verifyServicePort(expectedPort: Int, actualPort: Int): Unit = {
         actualPort should be >= expectedPort
         // avoid testing equality in case of simultaneous tests
    +    // `spark.testing` is true,
         // the default value for `spark.port.maxRetries` is 100 under test
         actualPort should be <= (expectedPort + 100)
       }
     
       private def createService(port: Int): NettyBlockTransferService = {
         val conf = new SparkConf()
           .set("spark.app.id", s"test-${getClass.getName}")
    +      .set("spark.testing", "true")
    --- End diff --
    
    I think the test framework will set `spark.testing`, in 
    `./project/SparkBuild.scala:795:    javaOptions in Test += "-Dspark.testing=1"`


---



[GitHub] spark pull request #20516: [SPARK-23343][CORE][TEST] Increase the exception ...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20516#discussion_r167403542
  
    --- Diff: project/SparkBuild.scala ---
    @@ -792,7 +792,6 @@ object TestSettings {
           "JAVA_HOME" -> sys.env.get("JAVA_HOME").getOrElse(sys.props("java.home"))),
         javaOptions in Test += s"-Djava.io.tmpdir=$testTempDir",
         javaOptions in Test += "-Dspark.test.home=" + sparkHome,
    -    javaOptions in Test += "-Dspark.testing=1",
    --- End diff --
    
    This is still set in the Maven build though. While you could also remove it there, maybe simplest to leave this as-is. 


---
