Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2022/06/18 05:10:00 UTC

[jira] [Comment Edited] (SPARK-39500) Ivy doesn't work correctly on IPv6-only environment

    [ https://issues.apache.org/jira/browse/SPARK-39500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17555835#comment-17555835 ] 

Dongjoon Hyun edited comment on SPARK-39500 at 6/18/22 5:09 AM:
----------------------------------------------------------------

Hi, [~xkrogen]. This specific case works as expected once we add `--driver-java-options="-Djava.net.preferIPv6Addresses=true"`. Here is the result.

{code}
dongjoon@m1ipv6 spark-3.3.0-bin-hadoop3 % bin/spark-shell --driver-java-options="-Djava.net.preferIPv6Addresses=true"
22/06/17 22:07:23 WARN Utils: Your hostname, m1ipv6.local resolves to a loopback address: 0:0:0:0:0:0:0:1; using 2600:***:0:0:b instead (on interface en0)
22/06/17 22:07:23 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/06/17 22:07:26 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://***.attlocal.net:4040
Spark context available as 'sc' (master = local[*], app id = local-1655528846969).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0
      /_/

Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 17.0.3)
Type in expressions to have them evaluated.
Type :help for more information.

scala> :paste -raw
// Entering paste mode (ctrl-D to finish)

package org.apache.spark.deploy

object Download {
  SparkSubmitUtils.resolveMavenCoordinates(
    "org.apache.logging.log4j:log4j-api:2.17.2",
    SparkSubmitUtils.buildIvySettings(None, Some("/tmp/ivy")),
    transitive = true)
}

// Exiting paste mode, now interpreting.


scala> org.apache.spark.deploy.Download
:: loading settings :: url = jar:file:/Users/dongjoon/APACHE/spark-release/spark-3.3.0-bin-hadoop3/jars/ivy-2.5.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
Ivy Default Cache set to: /tmp/ivy/cache
The jars for the packages stored in: /tmp/ivy/jars
org.apache.logging.log4j#log4j-api added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-ce1b2653-5336-4402-af10-5fd83354f9b0;1.0
	confs: [default]
	found org.apache.logging.log4j#log4j-api;2.17.2 in central
downloading https://maven-central.storage-download.googleapis.com/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.jar ...
	[SUCCESSFUL ] org.apache.logging.log4j#log4j-api;2.17.2!log4j-api.jar (423ms)
:: resolution report :: resolve 1216ms :: artifacts dl 424ms
	:: modules in use:
	org.apache.logging.log4j#log4j-api;2.17.2 from central in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   1   |   1   |   0   ||   1   |   1   |
	---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-ce1b2653-5336-4402-af10-5fd83354f9b0
	confs: [default]
	1 artifacts copied, 0 already retrieved (295kB/4ms)
res0: org.apache.spark.deploy.Download.type = org.apache.spark.deploy.Download$@36ad4b58

{code}
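
For completeness, the same JVM flag can also be supplied through Spark configuration rather than the raw driver option. A minimal sketch, assuming the standard `spark.driver.extraJavaOptions` property (this variant is not exercised in the log above):

{code}
# Sketch: equivalent to --driver-java-options, passed as a Spark conf (untested here)
bin/spark-shell --conf "spark.driver.extraJavaOptions=-Djava.net.preferIPv6Addresses=true"

# Or set once in conf/spark-defaults.conf
spark.driver.extraJavaOptions  -Djava.net.preferIPv6Addresses=true
{code}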



> Ivy doesn't work correctly on IPv6-only environment
> ---------------------------------------------------
>
>                 Key: SPARK-39500
>                 URL: https://issues.apache.org/jira/browse/SPARK-39500
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 3.4.0
>            Reporter: Dongjoon Hyun
>            Priority: Major
>
> Ivy doesn't work correctly on IPv6.
> {code}
>   SparkSubmitUtils.resolveMavenCoordinates(
>     "org.apache.logging.log4j:log4j-api:2.17.2",
>     SparkSubmitUtils.buildIvySettings(None, Some("/tmp/ivy")),
>     transitive = true)
> {code}
> {code}
> % bin/spark-shell
> 22/06/16 22:22:12 WARN Utils: Your hostname, m1ipv6.local resolves to a loopback address: 127.0.0.1; using 2600:1700:232e:3de0:0:0:0:b instead (on interface en0)
> 22/06/16 22:22:12 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
> ===== https://ipv6.repo1.maven.org/maven2/
> =====https://maven-central.storage-download.googleapis.com/maven2/
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
> 22/06/16 22:22:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> Spark context Web UI available at http://unknown1498776019fa.attlocal.net:4040
> Spark context available as 'sc' (master = local[*], app id = local-1655443334687).
> Spark session available as 'spark'.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 3.4.0-SNAPSHOT
>       /_/
> Using Scala version 2.12.16 (OpenJDK 64-Bit Server VM, Java 17.0.3)
> Type in expressions to have them evaluated.
> Type :help for more information.
> scala> :paste -raw
> // Entering paste mode (ctrl-D to finish)
> package org.apache.spark.deploy
> object Download {
>   SparkSubmitUtils.resolveMavenCoordinates(
>     "org.apache.logging.log4j:log4j-api:2.17.2",
>     SparkSubmitUtils.buildIvySettings(None, Some("/tmp/ivy")),
>     transitive = true)
> }
> // Exiting paste mode, now interpreting.
> scala> org.apache.spark.deploy.Download
> ===== https://ipv6.repo1.maven.org/maven2/
> =====https://maven-central.storage-download.googleapis.com/maven2/
> :: loading settings :: url = jar:file:/Users/dongjoon/APACHE/spark/assembly/target/scala-2.12/jars/ivy-2.5.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
> Ivy Default Cache set to: /tmp/ivy/cache
> The jars for the packages stored in: /tmp/ivy/jars
> org.apache.logging.log4j#log4j-api added as a dependency
> :: resolving dependencies :: org.apache.spark#spark-submit-parent-f47b503f-897e-4b92-95da-3806c32c220f;1.0
>         confs: [default]
> :: resolution report :: resolve 95ms :: artifacts dl 0ms
>         :: modules in use:
>         ---------------------------------------------------------------------
>         |                  |            modules            ||   artifacts   |
>         |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>         ---------------------------------------------------------------------
>         |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
>         ---------------------------------------------------------------------
> :: problems summary ::
> :::: WARNINGS
>                 module not found: org.apache.logging.log4j#log4j-api;2.17.2
>         ==== local-m2-cache: tried
>           file:/Users/dongjoon/.m2/repository/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.pom
>           -- artifact org.apache.logging.log4j#log4j-api;2.17.2!log4j-api.jar:
>           file:/Users/dongjoon/.m2/repository/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.jar
>         ==== local-ivy-cache: tried
>           /tmp/ivy/local/org.apache.logging.log4j/log4j-api/2.17.2/ivys/ivy.xml
>           -- artifact org.apache.logging.log4j#log4j-api;2.17.2!log4j-api.jar:
>           /tmp/ivy/local/org.apache.logging.log4j/log4j-api/2.17.2/jars/log4j-api.jar
>         ==== ipv6: tried
>           https://ipv6.repo1.maven.org/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.pom
>           -- artifact org.apache.logging.log4j#log4j-api;2.17.2!log4j-api.jar:
>           https://ipv6.repo1.maven.org/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.jar
>         ==== central: tried
>           https://maven-central.storage-download.googleapis.com/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.pom
>           -- artifact org.apache.logging.log4j#log4j-api;2.17.2!log4j-api.jar:
>           https://maven-central.storage-download.googleapis.com/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.jar
>         ==== spark-packages: tried
>           https://maven-central.storage-download.googleapis.com/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.pom
>           -- artifact org.apache.logging.log4j#log4j-api;2.17.2!log4j-api.jar:
>           https://maven-central.storage-download.googleapis.com/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.jar
>                 ::::::::::::::::::::::::::::::::::::::::::::::
>                 ::          UNRESOLVED DEPENDENCIES         ::
>                 ::::::::::::::::::::::::::::::::::::::::::::::
>                 :: org.apache.logging.log4j#log4j-api;2.17.2: not found
>                 ::::::::::::::::::::::::::::::::::::::::::::::
> :::: ERRORS
>         Server access error at url https://ipv6.repo1.maven.org/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.pom (java.net.SocketException: Network is unreachable)
>         Server access error at url https://ipv6.repo1.maven.org/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.jar (java.net.SocketException: Network is unreachable)
>         Server access error at url https://maven-central.storage-download.googleapis.com/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.pom (java.net.SocketException: Network is unreachable)
>         Server access error at url https://maven-central.storage-download.googleapis.com/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.jar (java.net.SocketException: Network is unreachable)
>         Server access error at url https://maven-central.storage-download.googleapis.com/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.pom (java.net.SocketException: Network is unreachable)
>         Server access error at url https://maven-central.storage-download.googleapis.com/maven2/org/apache/logging/log4j/log4j-api/2.17.2/log4j-api-2.17.2.jar (java.net.SocketException: Network is unreachable)
> :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
> java.lang.RuntimeException: [unresolved dependency: org.apache.logging.log4j#log4j-api;2.17.2: not found]
>   at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1466)
>   at org.apache.spark.deploy.Download$.<init>(<pastie>:4)
>   at org.apache.spark.deploy.Download$.<clinit>(<pastie>)
>   ... 47 elided
> {code}


