Posted to commits@spark.apache.org by do...@apache.org on 2022/06/20 08:41:17 UTC
[spark] branch master updated: [SPARK-39464][CORE][TESTS][FOLLOWUP] Use Utils.localHostNameForURI instead of Utils.localCanonicalHostName in tests
This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 540e695e70c [SPARK-39464][CORE][TESTS][FOLLOWUP] Use Utils.localHostNameForURI instead of Utils.localCanonicalHostName in tests
540e695e70c is described below
commit 540e695e70c8d53d70f1b74234877ac5733fae4b
Author: yangjie01 <ya...@baidu.com>
AuthorDate: Mon Jun 20 01:41:04 2022 -0700
[SPARK-39464][CORE][TESTS][FOLLOWUP] Use Utils.localHostNameForURI instead of Utils.localCanonicalHostName in tests
### What changes were proposed in this pull request?
This PR aims to use `Utils.localHostNameForURI` instead of `Utils.localCanonicalHostName` in the following suites, which were changed in https://github.com/apache/spark/pull/36866
- `MasterSuite`
- `MasterWebUISuite`
- `RocksDBBackendHistoryServerSuite`
### Why are the changes needed?
These test cases fail when run with `SPARK_LOCAL_IP=::1` and `-Djava.net.preferIPv6Addresses=true`
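The underlying issue: an unbracketed IPv6 literal cannot appear in a URL authority, because its colons are indistinguishable from the host:port separator. A minimal sketch of the bracketing that `Utils.localHostNameForURI` applies to IPv6 addresses (the `hostForURI` helper here is hypothetical, for illustration only, not the real Spark method):

```scala
object HostForUriDemo {
  // Sketch of the idea behind Utils.localHostNameForURI: an IPv6 literal
  // must be wrapped in brackets before it can appear in a URL authority,
  // otherwise its colons are ambiguous with the host:port separator.
  // hostForURI is a hypothetical helper, not the real Spark method.
  def hostForURI(host: String): String =
    if (host.contains(":") && !host.startsWith("[")) s"[$host]" else host

  def main(args: Array[String]): Unit = {
    // Hostnames and IPv4 addresses pass through unchanged.
    println(hostForURI("localhost"))   // localhost
    println(hostForURI("127.0.0.1"))   // 127.0.0.1
    // An IPv6 loopback literal gets bracketed, yielding a parseable URL.
    println(s"http://${hostForURI("::1")}:8080")  // http://[::1]:8080
  }
}
```

`Utils.localCanonicalHostName`, by contrast, can yield a raw unbracketed address, so URLs built from it under an IPv6-only setup point at a host:port the test server is not actually serving, which matches the "Connection refused" failures below.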
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
- Pass GA
- Manual test:
1. `export SPARK_LOCAL_IP=::1`
```
echo $SPARK_LOCAL_IP
::1
```
2. add `-Djava.net.preferIPv6Addresses=true` to the test JVM `argLine` in `pom.xml`, for example:
```
diff --git a/pom.xml b/pom.xml
index 1ce3b43faf..3356622985 100644
--- a/pom.xml
+++ b/pom.xml
@@ -2943,7 +2943,7 @@
<include>**/*Suite.java</include>
</includes>
<reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
- <argLine>-ea -Xmx4g -Xss4m -XX:MaxMetaspaceSize=2g -XX:ReservedCodeCacheSize=${CodeCacheSize} ${extraJavaTestArgs} -Dio.netty.tryReflectionSetAccessible=true</argLine>
+ <argLine>-ea -Xmx4g -Xss4m -XX:MaxMetaspaceSize=2g -XX:ReservedCodeCacheSize=${CodeCacheSize} ${extraJavaTestArgs} -Dio.netty.tryReflectionSetAccessible=true -Djava.net.preferIPv6Addresses=true</argLine>
<environmentVariables>
<!--
Setting SPARK_DIST_CLASSPATH is a simple way to make sure any child processes
```
3. run the Maven tests for `RocksDBBackendHistoryServerSuite`, `MasterSuite` and `MasterWebUISuite`:
```
mvn clean install -DskipTests -pl core -am
mvn clean test -pl core -Dtest=none -DwildcardSuites=org.apache.spark.deploy.history.RocksDBBackendHistoryServerSuite
mvn clean test -pl core -Dtest=none -DwildcardSuites=org.apache.spark.deploy.master.MasterSuite
mvn clean test -pl core -Dtest=none -DwildcardSuites=org.apache.spark.deploy.master.ui.MasterWebUISuite
```
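The three suites can also be exercised in a single invocation, since the scalatest-maven-plugin's `wildcardSuites` parameter accepts a comma-separated list. A sketch of the same steps, assuming the `pom.xml` change above is in place:

```shell
# Sketch: run all three affected suites in one Maven invocation.
# wildcardSuites accepts a comma-separated list of suite names.
export SPARK_LOCAL_IP=::1
mvn clean install -DskipTests -pl core -am
mvn clean test -pl core -Dtest=none \
  -DwildcardSuites=org.apache.spark.deploy.history.RocksDBBackendHistoryServerSuite,org.apache.spark.deploy.master.MasterSuite,org.apache.spark.deploy.master.ui.MasterWebUISuite
```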
**Before**
RocksDBBackendHistoryServerSuite:
```
- Redirect to the root page when accessed to /history/ *** FAILED ***
java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:613)
at java.net.Socket.connect(Socket.java:561)
at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:463)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:558)
...
Run completed in 31 seconds, 745 milliseconds.
Total number of tests run: 73
Suites: completed 2, aborted 0
Tests: succeeded 3, failed 70, canceled 0, ignored 0, pending 0
*** 70 TESTS FAILED ***
```
MasterSuite:
```
- master/worker web ui available behind front-end reverseProxy *** FAILED ***
The code passed to eventually never returned normally. Attempted 487 times over 50.079685917 seconds. Last failure message: Connection refused (Connection refused). (MasterSuite.scala:405)
Run completed in 3 minutes, 48 seconds.
Total number of tests run: 32
Suites: completed 2, aborted 0
Tests: succeeded 29, failed 3, canceled 0, ignored 0, pending 0
*** 3 TESTS FAILED ***
```
MasterWebUISuite:
```
- Kill multiple hosts *** FAILED ***
java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:613)
at java.net.Socket.connect(Socket.java:561)
at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:463)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:558)
...
Run completed in 7 seconds, 83 milliseconds.
Total number of tests run: 4
Suites: completed 2, aborted 0
Tests: succeeded 0, failed 4, canceled 0, ignored 0, pending 0
*** 4 TESTS FAILED ***
```
**After**
RocksDBBackendHistoryServerSuite:
```
Run completed in 38 seconds, 205 milliseconds.
Total number of tests run: 73
Suites: completed 2, aborted 0
Tests: succeeded 73, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
```
MasterSuite:
```
Run completed in 1 minute, 10 seconds.
Total number of tests run: 32
Suites: completed 2, aborted 0
Tests: succeeded 32, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
```
MasterWebUISuite:
```
Run completed in 6 seconds, 330 milliseconds.
Total number of tests run: 4
Suites: completed 2, aborted 0
Tests: succeeded 4, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
```
Closes #36876 from LuciferYang/SPARK-39464-FOLLOWUP.
Authored-by: yangjie01 <ya...@baidu.com>
Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
.../org/apache/spark/deploy/history/HistoryServerSuite.scala | 2 +-
.../test/scala/org/apache/spark/deploy/master/MasterSuite.scala | 6 +++---
.../org/apache/spark/deploy/master/ui/MasterWebUISuite.scala | 8 ++++----
3 files changed, 8 insertions(+), 8 deletions(-)
diff --git a/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala b/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala
index 1aa846b3ac4..6322661f4af 100644
--- a/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala
+++ b/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerSuite.scala
@@ -74,7 +74,7 @@ abstract class HistoryServerSuite extends SparkFunSuite with BeforeAndAfter with
private var provider: FsHistoryProvider = null
private var server: HistoryServer = null
- private val localhost: String = Utils.localCanonicalHostName()
+ private val localhost: String = Utils.localHostNameForURI()
private var port: Int = -1
protected def diskBackend: HybridStoreDiskBackend.Value
diff --git a/core/src/test/scala/org/apache/spark/deploy/master/MasterSuite.scala b/core/src/test/scala/org/apache/spark/deploy/master/MasterSuite.scala
index 1fac3522aed..b66b39c3c07 100644
--- a/core/src/test/scala/org/apache/spark/deploy/master/MasterSuite.scala
+++ b/core/src/test/scala/org/apache/spark/deploy/master/MasterSuite.scala
@@ -325,7 +325,7 @@ class MasterSuite extends SparkFunSuite
val conf = new SparkConf()
val localCluster = LocalSparkCluster(2, 2, 512, conf)
localCluster.start()
- val masterUrl = s"http://${Utils.localCanonicalHostName()}:${localCluster.masterWebUIPort}"
+ val masterUrl = s"http://${Utils.localHostNameForURI()}:${localCluster.masterWebUIPort}"
try {
eventually(timeout(50.seconds), interval(100.milliseconds)) {
val json = Utils
@@ -362,7 +362,7 @@ class MasterSuite extends SparkFunSuite
conf.set(UI_REVERSE_PROXY, true)
val localCluster = LocalSparkCluster(2, 2, 512, conf)
localCluster.start()
- val masterUrl = s"http://${Utils.localCanonicalHostName()}:${localCluster.masterWebUIPort}"
+ val masterUrl = s"http://${Utils.localHostNameForURI()}:${localCluster.masterWebUIPort}"
try {
eventually(timeout(50.seconds), interval(100.milliseconds)) {
val json = Utils
@@ -400,7 +400,7 @@ class MasterSuite extends SparkFunSuite
conf.set(UI_REVERSE_PROXY_URL, reverseProxyUrl)
val localCluster = LocalSparkCluster(2, 2, 512, conf)
localCluster.start()
- val masterUrl = s"http://${Utils.localCanonicalHostName()}:${localCluster.masterWebUIPort}"
+ val masterUrl = s"http://${Utils.localHostNameForURI()}:${localCluster.masterWebUIPort}"
try {
eventually(timeout(50.seconds), interval(100.milliseconds)) {
val json = Utils
diff --git a/core/src/test/scala/org/apache/spark/deploy/master/ui/MasterWebUISuite.scala b/core/src/test/scala/org/apache/spark/deploy/master/ui/MasterWebUISuite.scala
index 253517fdcf9..b28651ea79c 100644
--- a/core/src/test/scala/org/apache/spark/deploy/master/ui/MasterWebUISuite.scala
+++ b/core/src/test/scala/org/apache/spark/deploy/master/ui/MasterWebUISuite.scala
@@ -69,7 +69,7 @@ class MasterWebUISuite extends SparkFunSuite with BeforeAndAfterAll {
when(master.idToApp).thenReturn(HashMap[String, ApplicationInfo]((activeApp.id, activeApp)))
- val url = s"http://${Utils.localCanonicalHostName()}:${masterWebUI.boundPort}/app/kill/"
+ val url = s"http://${Utils.localHostNameForURI()}:${masterWebUI.boundPort}/app/kill/"
val body = convPostDataToString(Map(("id", activeApp.id), ("terminate", "true")))
val conn = sendHttpRequest(url, "POST", body)
conn.getResponseCode
@@ -80,7 +80,7 @@ class MasterWebUISuite extends SparkFunSuite with BeforeAndAfterAll {
test("kill driver") {
val activeDriverId = "driver-0"
- val url = s"http://${Utils.localCanonicalHostName()}:${masterWebUI.boundPort}/driver/kill/"
+ val url = s"http://${Utils.localHostNameForURI()}:${masterWebUI.boundPort}/driver/kill/"
val body = convPostDataToString(Map(("id", activeDriverId), ("terminate", "true")))
val conn = sendHttpRequest(url, "POST", body)
conn.getResponseCode
@@ -90,7 +90,7 @@ class MasterWebUISuite extends SparkFunSuite with BeforeAndAfterAll {
}
private def testKillWorkers(hostnames: Seq[String]): Unit = {
- val url = s"http://${Utils.localCanonicalHostName()}:${masterWebUI.boundPort}/workers/kill/"
+ val url = s"http://${Utils.localHostNameForURI()}:${masterWebUI.boundPort}/workers/kill/"
val body = convPostDataToString(hostnames.map(("host", _)))
val conn = sendHttpRequest(url, "POST", body)
// The master is mocked here, so cannot assert on the response code
@@ -100,7 +100,7 @@ class MasterWebUISuite extends SparkFunSuite with BeforeAndAfterAll {
}
test("Kill one host") {
- testKillWorkers(Seq("${Utils.localCanonicalHostName()}"))
+ testKillWorkers(Seq(s"${Utils.localHostNameForURI()}"))
}
test("Kill multiple hosts") {