Posted to dev@spark.apache.org by Akhil Das <ak...@sigmoidanalytics.com> on 2015/06/24 20:07:21 UTC

Re: IPv6 support

+Dev list

Thanks
Best Regards

On Wed, Jun 24, 2015 at 11:26 PM, Kevin Liu <ke...@fb.com> wrote:

>  Continuing this thread beyond standalone and on to clusters: does anyone
> have experience successfully running any Spark cluster on IPv6-*only*
> (not dual-stack) machines? More companies are moving to IPv6, and some,
> such as Facebook, are allocating new clusters only on IPv6-only networks,
> so this is getting more relevant.
>
>  YARN still doesn’t support IPv6 per
> http://wiki.apache.org/hadoop/HadoopIPv6
>
>  Mesos is questionable per
> https://issues.apache.org/jira/browse/MESOS-1027; did anyone get it
> working?
>
>  Standalone: even though the setup below worked in single-node mode, it
> failed with the following when I tried to connect to a remote master; nor
> did it work with the IPv6 address directly, as in "./bin/spark-shell
> --master spark://[2401:db00:2030:709b:face:0:9:0]:7078"
> client side:
>
> [root@dispark002.ash3 ~/spark-1.4.0-bin-hadoop2.6]# ./bin/spark-shell
> --master spark://dispark001:7078
>
> log4j:WARN No appenders could be found for logger
> (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
>
> log4j:WARN Please initialize the log4j system properly.
>
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
>
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
>
> 15/06/24 10:34:03 INFO SecurityManager: Changing view acls to: root
>
> 15/06/24 10:34:03 INFO SecurityManager: Changing modify acls to: root
>
> 15/06/24 10:34:03 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(root); users
> with modify permissions: Set(root)
>
> 15/06/24 10:34:03 INFO HttpServer: Starting HTTP Server
>
> 15/06/24 10:34:03 INFO Utils: Successfully started service 'HTTP class
> server' on port 49189.
>
> Welcome to
>
>       ____              __
>
>      / __/__  ___ _____/ /__
>
>     _\ \/ _ \/ _ `/ __/  '_/
>
>    /___/ .__/\_,_/_/ /_/\_\   version 1.4.0
>
>       /_/
>
>
>  Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java
> 1.8.0_25)
>
> Type in expressions to have them evaluated.
>
> Type :help for more information.
>
> 15/06/24 10:34:05 INFO SparkContext: Running Spark version 1.4.0
>
> 15/06/24 10:34:05 INFO SecurityManager: Changing view acls to: root
>
> 15/06/24 10:34:05 INFO SecurityManager: Changing modify acls to: root
>
> 15/06/24 10:34:05 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(root); users
> with modify permissions: Set(root)
>
> 15/06/24 10:34:06 INFO Slf4jLogger: Slf4jLogger started
>
> 15/06/24 10:34:06 INFO Remoting: Starting remoting
>
> 15/06/24 10:34:06 INFO Remoting: Remoting started; listening on addresses
> :[akka.tcp://sparkDriver@dispark002:59150]
>
> 15/06/24 10:34:06 INFO Utils: Successfully started service 'sparkDriver'
> on port 59150.
>
> 15/06/24 10:34:06 INFO SparkEnv: Registering MapOutputTracker
>
> 15/06/24 10:34:06 INFO SparkEnv: Registering BlockManagerMaster
>
> 15/06/24 10:34:06 INFO DiskBlockManager: Created local directory at
> /tmp/spark-b4248e03-80c2-4d54-b3af-5044c8228f68/blockmgr-bb240921-31bf-48da-b96a-7120f118d002
>
> 15/06/24 10:34:06 INFO MemoryStore: MemoryStore started with capacity
> 265.1 MB
>
> 15/06/24 10:34:06 INFO HttpFileServer: HTTP File server directory is
> /tmp/spark-b4248e03-80c2-4d54-b3af-5044c8228f68/httpd-a7cbeb43-aefd-4da8-8df2-89a528b35c9e
>
> 15/06/24 10:34:06 INFO HttpServer: Starting HTTP Server
>
> 15/06/24 10:34:06 INFO Utils: Successfully started service 'HTTP file
> server' on port 57293.
>
> 15/06/24 10:34:06 INFO SparkEnv: Registering OutputCommitCoordinator
>
> 15/06/24 10:34:06 INFO Utils: Successfully started service 'SparkUI' on
> port 4040.
>
> 15/06/24 10:34:06 INFO SparkUI: Started SparkUI at
> http://[2401:db00:2030:709b:face:0:f:0]:4040
>
> 15/06/24 10:34:06 INFO AppClient$ClientActor: Connecting to master
> akka.tcp://sparkMaster@dispark001:7078/user/Master...
>
> 15/06/24 10:34:06 INFO SparkDeploySchedulerBackend: Connected to Spark
> cluster with app ID app-20150624103406-0004
>
> 15/06/24 10:34:06 INFO Utils: Successfully started service
> 'org.apache.spark.network.netty.NettyBlockTransferService' on port 64775.
>
> 15/06/24 10:34:06 INFO NettyBlockTransferService: Server created on 64775
>
> 15/06/24 10:34:06 ERROR SparkContext: Error initializing SparkContext.
>
> java.lang.AssertionError: assertion failed: Expected hostname
>
> at scala.Predef$.assert(Predef.scala:179)
>
> at org.apache.spark.util.Utils$.checkHost(Utils.scala:882)
>
> at org.apache.spark.storage.BlockManagerId.<init>(BlockManagerId.scala:48)
>
> at org.apache.spark.storage.BlockManagerId$.apply(BlockManagerId.scala:107)
>
> at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:188)
>
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)
>
> at
> org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
>
> at $line3.$read$$iwC$$iwC.<init>(<console>:9)
>
> at $line3.$read$$iwC.<init>(<console>:18)
>
> at $line3.$read.<init>(<console>:20)
>
> at $line3.$read$.<init>(<console>:24)
>
> at $line3.$read$.<clinit>(<console>)
>
> at $line3.$eval$.<init>(<console>:7)
>
> at $line3.$eval$.<clinit>(<console>)
>
> at $line3.$eval.$print(<console>)
>
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
> at java.lang.reflect.Method.invoke(Method.java:483)
>
> at
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>
> at
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>
> at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>
> at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>
> at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>
> at
> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>
> at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>
> at
> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
>
> at
> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
>
> at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>
> at
> org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
>
> at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>
> at
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>
> at
> org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
>
> at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>
> at
> org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
>
> at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>
> at
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>
> at
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>
> at
> org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>
> at
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>
> at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>
> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>
> at org.apache.spark.repl.Main$.main(Main.scala:31)
>
> at org.apache.spark.repl.Main.main(Main.scala)
>
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
> at java.lang.reflect.Method.invoke(Method.java:483)
>
> at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
>  server side:
>
> 15/06/24 10:34:06 INFO Master: Registering app Spark shell
>
> 15/06/24 10:34:06 INFO Master: Registered app Spark shell with ID
> app-20150624103406-0004
>
> 15/06/24 10:34:06 INFO Master: Received unregister request from
> application app-20150624103406-0004
>
> 15/06/24 10:34:06 INFO Master: Removing app app-20150624103406-0004
>
> 15/06/24 10:34:07 INFO Master: akka.tcp://sparkDriver@dispark002:59150
> got disassociated, removing it.
>
>
>
>   From: Kevin Liu <ke...@fb.com>
> Date: Wednesday, June 17, 2015 at 11:21 AM
> To: Akhil Das <ak...@sigmoidanalytics.com>
>
> Cc: "user@spark.apache.org" <us...@spark.apache.org>
> Subject: Re: IPv6 support
>
>   You, Sir, are a genius. Thank you so much, it works now…
>
>  Still wondering why I have to do this for IPv6-only machines when the
> default just works on dual-stack machines, but I am happy enough for now.
>
>  Kevin
>
>   From: Akhil Das <ak...@sigmoidanalytics.com>
> Date: Wednesday, June 17, 2015 at 12:33 AM
> To: Kevin Liu <ke...@fb.com>
> Cc: "user@spark.apache.org" <us...@spark.apache.org>
> Subject: Re: IPv6 support
>
>    If you look at this,
>
>   15/06/16 22:25:14 INFO Executor: Starting executor ID driver on host
> localhost
>
> 15/06/16 22:25:14 ERROR SparkContext: Error initializing SparkContext.
>
> java.lang.AssertionError: assertion failed: Expected hostname
>
>  your spark.driver.host
> <https://github.com/apache/spark/blob/3c0156899dc1ec1f7dfe6d7c8af47fa6dc7d00bf/core/src/main/scala/org/apache/spark/util/RpcUtils.scala#L33>
> is being set to localhost, which fails the host.indexOf(':') == -1 check
> <https://github.com/apache/spark/blob/branch-1.4/core/src/main/scala/org/apache/spark/util/Utils.scala#L882>.
> Try setting your spark.driver.host to dispark001.ash3 (on whichever
> machine you are running the code).
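>
> (For illustration, the linked check boils down to roughly the following; a
> paraphrased sketch of branch-1.4's Utils.checkHost, not the exact source:
>
> def checkHost(host: String, message: String = ""): Unit =
>   assert(host.indexOf(':') == -1, message)
>
> checkHost("dispark001.ash3")                // passes: hostnames have no ':'
> checkHost("2401:db00:2030:709b:face:0:9:0") // fails: IPv6 literals contain ':'
>
> so any value containing a colon, raw IPv6 literals included, trips the
> "Expected hostname" assertion.)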
>
>
>
>
>  Thanks
> Best Regards
>
> On Wed, Jun 17, 2015 at 10:57 AM, Kevin Liu <ke...@fb.com> wrote:
>
>>  Thanks Akhil, it seems that what you said should have fixed it.
>>
>>  1) There was a known issue directly related to this, but it has been
>> fixed in 1.4: https://issues.apache.org/jira/browse/SPARK-6440
>> 2) Now, with 1.4, I see the following errors even after setting
>> SPARK_MASTER_IP. Your description seems to be right on; any more thoughts?
>>
>>   [root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]# ping6
>> dispark001.ash3
>>
>> PING dispark001.ash3(dispark001.ash3.facebook.com) 56 data bytes
>>
>> 64 bytes from dispark001.ash3.facebook.com: icmp_seq=1 ttl=64 time=0.012
>> ms
>>
>> 64 bytes from dispark001.ash3.facebook.com: icmp_seq=2 ttl=64 time=0.021
>> ms
>>
>> 64 bytes from dispark001.ash3.facebook.com: icmp_seq=3 ttl=64 time=0.011
>> ms
>>
>> ^C
>>
>> --- dispark001.ash3 ping statistics ---
>>
>> 3 packets transmitted, 3 received, 0% packet loss, time 2113ms
>>
>> rtt min/avg/max/mdev = 0.011/0.014/0.021/0.006 ms
>>
>> [root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]# export
>> SPARK_MASTER_IP="dispark001.ash3"
>>
>> [root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]# ./bin/run-example
>> SparkPi 10
>>
>> Using Spark's default log4j profile:
>> org/apache/spark/log4j-defaults.properties
>>
>> 15/06/16 22:25:13 INFO SparkContext: Running Spark version 1.4.0
>>
>> 15/06/16 22:25:13 WARN NativeCodeLoader: Unable to load native-hadoop
>> library for your platform... using builtin-java classes where applicable
>>
>> 15/06/16 22:25:13 INFO SecurityManager: Changing view acls to: root
>>
>> 15/06/16 22:25:13 INFO SecurityManager: Changing modify acls to: root
>>
>> 15/06/16 22:25:13 INFO SecurityManager: SecurityManager: authentication
>> disabled; ui acls disabled; users with view permissions: Set(root); users
>> with modify permissions: Set(root)
>>
>> 15/06/16 22:25:13 INFO Slf4jLogger: Slf4jLogger started
>>
>> 15/06/16 22:25:13 INFO Remoting: Starting remoting
>>
>> 15/06/16 22:25:13 INFO Remoting: Remoting started; listening on addresses
>> :[akka.tcp://sparkDriver@2401:db00:2030:709b:face:0:9:0:50916]
>>
>> 15/06/16 22:25:13 INFO Utils: Successfully started service 'sparkDriver'
>> on port 50916.
>>
>> 15/06/16 22:25:13 INFO SparkEnv: Registering MapOutputTracker
>>
>> 15/06/16 22:25:13 INFO SparkEnv: Registering BlockManagerMaster
>>
>> 15/06/16 22:25:13 INFO DiskBlockManager: Created local directory at
>> /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66/blockmgr-4bcbf896-03ad-4b11-8db5-a6eaa5f0222b
>>
>> 15/06/16 22:25:13 INFO MemoryStore: MemoryStore started with capacity
>> 265.1 MB
>>
>> 15/06/16 22:25:13 INFO HttpFileServer: HTTP File server directory is
>> /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66/httpd-2b6a5162-0686-4cc5-accb-1bb66fddf705
>>
>> 15/06/16 22:25:13 INFO HttpServer: Starting HTTP Server
>>
>> 15/06/16 22:25:13 INFO Utils: Successfully started service 'HTTP file
>> server' on port 35895.
>>
>> 15/06/16 22:25:13 INFO SparkEnv: Registering OutputCommitCoordinator
>>
>> 15/06/16 22:25:13 INFO Utils: Successfully started service 'SparkUI' on
>> port 4040.
>>
>> 15/06/16 22:25:13 INFO SparkUI: Started SparkUI at
>> http://[2401:db00:2030:709b:face:0:9:0]:4040
>>
>> 15/06/16 22:25:14 INFO SparkContext: Added JAR
>> file:/root/spark-1.4.0-bin-hadoop2.6/lib/spark-examples-1.4.0-hadoop2.6.0.jar
>> at
>> http://[2401:db00:2030:709b:face:0:9:0]:35895/jars/spark-examples-1.4.0-hadoop2.6.0.jar
>> with timestamp 1434518714122
>>
>> 15/06/16 22:25:14 INFO Executor: Starting executor ID driver on host
>> localhost
>>
>> 15/06/16 22:25:14 ERROR SparkContext: Error initializing SparkContext.
>>
>> java.lang.AssertionError: assertion failed: Expected hostname
>>
>> at scala.Predef$.assert(Predef.scala:179)
>>
>> at org.apache.spark.util.Utils$.checkHost(Utils.scala:882)
>>
>> at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:35)
>>
>> at org.apache.spark.executor.Executor.<init>(Executor.scala:413)
>>
>> at
>> org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:53)
>>
>> at
>> org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:103)
>>
>> at
>> org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
>>
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
>>
>> at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
>>
>> at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
>>
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>
>> at java.lang.reflect.Method.invoke(Method.java:483)
>>
>> at
>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>>
>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>>
>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>>
>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>>
>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>> 15/06/16 22:25:14 INFO SparkUI: Stopped Spark web UI at
>> http://[2401:db00:2030:709b:face:0:9:0]:4040
>>
>> 15/06/16 22:25:14 INFO DAGScheduler: Stopping DAGScheduler
>>
>> 15/06/16 22:25:14 ERROR SparkContext: Error stopping SparkContext after
>> init error.
>>
>> java.lang.NullPointerException
>>
>> at
>> org.apache.spark.scheduler.local.LocalBackend.stop(LocalBackend.scala:107)
>>
>> at
>> org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:416)
>>
>> at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1404)
>>
>> at org.apache.spark.SparkContext.stop(SparkContext.scala:1642)
>>
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:565)
>>
>> at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
>>
>> at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
>>
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>
>> at java.lang.reflect.Method.invoke(Method.java:483)
>>
>> at
>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>>
>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>>
>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>>
>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>>
>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>> Exception in thread "main" java.lang.AssertionError: assertion failed:
>> Expected hostname
>>
>> at scala.Predef$.assert(Predef.scala:179)
>>
>> at org.apache.spark.util.Utils$.checkHost(Utils.scala:882)
>>
>> at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:35)
>>
>> at org.apache.spark.executor.Executor.<init>(Executor.scala:413)
>>
>> at
>> org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:53)
>>
>> at
>> org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:103)
>>
>> at
>> org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)
>>
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
>>
>> at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)
>>
>> at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
>>
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>
>> at java.lang.reflect.Method.invoke(Method.java:483)
>>
>> at
>> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
>>
>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
>>
>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
>>
>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
>>
>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>
>> 15/06/16 22:25:14 INFO DiskBlockManager: Shutdown hook called
>>
>> 15/06/16 22:25:14 INFO Utils: path =
>> /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66/blockmgr-4bcbf896-03ad-4b11-8db5-a6eaa5f0222b,
>> already present as root for deletion.
>>
>> 15/06/16 22:25:14 INFO Utils: Shutdown hook called
>>
>> 15/06/16 22:25:14 INFO Utils: Deleting directory
>> /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66
>>
>> [root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]#
>>
>>
>>   From: Akhil Das <ak...@sigmoidanalytics.com>
>> Date: Monday, May 25, 2015 at 9:33 AM
>> To: Kevin Liu <ke...@fb.com>
>> Cc: "user@spark.apache.org" <us...@spark.apache.org>
>> Subject: Re: IPv6 support
>>
>>    Hi Kevin,
>>
>>  Did you try adding a host name for the IPv6 address? I have a few IPv6
>> boxes; Spark failed for me when I used just the IPv6 addresses, but it
>> works fine when I use the host names.
>>
>>  Here's an entry in my /etc/hosts:
>>
>>  2607:5300:0100:0200:0000:0000:0000:0a4d hacked.work
>>
>>
>>  My spark-env.sh file:
>>
>>  export SPARK_MASTER_IP="hacked.work"
>>
>>
>>  Here's the master listening on my v6:
>>
>>   [image: Inline image 1]
>>
>>
>>  The Master UI with running spark-shell:
>>
>>   [image: Inline image 2]
>>
>>
>>  I even ran a simple sc.parallelize(1 to 100).collect().
>>
>>
>>
>>  Thanks
>> Best Regards
>>
>> On Wed, May 20, 2015 at 11:09 PM, Kevin Liu <ke...@fb.com> wrote:
>>
>>> Hello, I have to work with IPv6-only servers, and when I installed the
>>> 1.3.1 hadoop 2.6 build, I couldn't get the example to run due to IPv6
>>> issues (errors below). I tried to add the
>>> -Djava.net.preferIPv6Addresses=true setting but it still doesn't work. A
>>> search on Spark's support for IPv6 is inconclusive. Can someone help
>>> clarify the current status for IPv6?
>>>
>>> Thanks
>>> Kevin
>>>
>>>
>>> -- errors --
>>>
>>> 15/05/20 10:17:30 INFO Executor: Fetching
>>> http://2401:db00:2030:709b:face:0:9:0:51453/jars/spark-examples-1.3.1-hadoop2.6.0.jar
>>> with timestamp 1432142250197
>>> 15/05/20 10:17:30 INFO Executor: Fetching
>>> http://2401:db00:2030:709b:face:0:9:0:51453/jars/spark-examples-1.3.1-hadoop2.6.0.jar
>>> with timestamp 1432142250197
>>> 15/05/20 10:17:30 ERROR Executor: Exception in task 5.0 in stage 0.0 (TID 5)
>>> java.net.MalformedURLException: For input string:
>>> "db00:2030:709b:face:0:9:0:51453"
>>>         at java.net.URL.<init>(URL.java:620)
>>>         at java.net.URL.<init>(URL.java:483)
>>>         at java.net.URL.<init>(URL.java:432)
>>>         at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:603)
>>>         at org.apache.spark.util.Utils$.fetchFile(Utils.scala:431)
>>>         at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:374)
>>>         at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:366)
>>>         at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
>>>         at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>>>         at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>>>         at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
>>>         at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
>>>         at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
>>>         at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
>>>         at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:366)
>>>         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:184)
>>>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>         at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.lang.NumberFormatException: For input string:
>>> "db00:2030:709b:face:0:9:0:51453"
>>>         at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>>>         at java.lang.Integer.parseInt(Integer.java:580)
>>>         at java.lang.Integer.parseInt(Integer.java:615)
>>>         at java.net.URLStreamHandler.parseURL(URLStreamHandler.java:216)
>>>         at java.net.URL.<init>(URL.java:615)
>>>         ... 18 more
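>>>
>>> A side note on the parse failure above: java.net.URL only accepts IPv6
>>> literals bracketed per RFC 2732; unbracketed, everything after the first
>>> ':' in the authority is read as the port number, hence the
>>> NumberFormatException on "db00:2030:709b:face:0:9:0:51453". A minimal
>>> Scala sketch (the jar name is a placeholder):
>>>
>>> import java.net.URL
>>> // Unbracketed: the port parser chokes -> MalformedURLException
>>> new URL("http://2401:db00:2030:709b:face:0:9:0:51453/jars/example.jar")
>>> // Bracketed per RFC 2732, the same address parses cleanly
>>> new URL("http://[2401:db00:2030:709b:face:0:9:0]:51453/jars/example.jar")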
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>

Re: IPv6 support

Posted by Ruslan Dautkhanov <da...@gmail.com>.
Per Cloudera, "ipv6 is not supported"
http://www.cloudera.com/content/cloudera/en/documentation/cdh5/v5-0-0/PDF/CDH5-Requirements-and-Supported-Versions.pdf

From my experience, ZooKeeper is unstable on dual-stack IPv4+IPv6 hosts; I
had to disable IPv6 before ZK started working.
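
One way to pin the JVM itself to IPv4 instead of disabling IPv6 at the OS
level (a sketch, assuming the stock zkServer.sh, which honors JVMFLAGS):

  export JVMFLAGS="-Djava.net.preferIPv4Stack=true $JVMFLAGS"

java.net.preferIPv4Stack=true restricts the JVM to IPv4 sockets, which
avoids the dual-stack flakiness.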



-- 
Ruslan Dautkhanov

On Mon, Jun 29, 2015 at 1:19 PM, Kevin Liu <ke...@fb.com> wrote:

>  Yes.
>
>  #spark.master                     spark://dispark001:7078
> spark.driver.host       dispark001
>
>  Without spark.master specified, it worked. However, when spark.master is
> uncommented, it fails with the error below. (I simplified the test by
> running everything on a single machine, dispark001.)
>
>  Thanks
>
>   From: Akhil Das <ak...@sigmoidanalytics.com>
> Date: Thursday, June 25, 2015 at 12:25 AM
> To: Kevin Liu <ke...@fb.com>, dev <de...@spark.apache.org>
> Cc: "user@spark.apache.org" <us...@spark.apache.org>
> Subject: Re: IPv6 support
>
>    It's the BlockManager hostname
> <https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/storage/BlockManager.scala#L190>
> getting messed up this time; do you have spark.driver.host set in the
> conf/spark-defaults.conf file?
>
>  Thanks
> Best Regards
>

Re: IPv6 support

Posted by Kevin Liu <ke...@fb.com>.
Yes.

#spark.master                     spark://dispark001:7078
spark.driver.host       dispark001

Without spark.master specified, it worked. However, when spark.master is uncommented, it fails with the error below. (I simplified the test by running everything on a single machine, dispark001.)

Thanks

From: Akhil Das <ak...@sigmoidanalytics.com>
Date: Thursday, June 25, 2015 at 12:25 AM
To: Kevin Liu <ke...@fb.com>, dev <de...@spark.apache.org>
Cc: "user@spark.apache.org" <us...@spark.apache.org>
Subject: Re: IPv6 support

It's the BlockManager hostname <https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/storage/BlockManager.scala#L190> getting messed up this time; do you have spark.driver.host set in the conf/spark-defaults.conf file?

Thanks
Best Regards

On Wed, Jun 24, 2015 at 11:37 PM, Akhil Das <ak...@sigmoidanalytics.com>> wrote:
+Dev list

Thanks
Best Regards

On Wed, Jun 24, 2015 at 11:26 PM, Kevin Liu <ke...@fb.com>> wrote:
Continuing this thread beyond standalone - onto clusters, does anyone have experience successfully running any Spark cluster on IPv6 only (not dual stack) machines? More companies are moving to IPv6 and some such as Facebook are only allocating new clusters on IPv6 only network, so this is getting more relevant.

YARN still doesn’t support IPv6 per http://wiki.apache.org/hadoop/HadoopIPv6<https://urldefense.proofpoint.com/v1/url?u=http://wiki.apache.org/hadoop/HadoopIPv6&k=ZVNjlDMF0FElm4dQtryO4A%3D%3D%0A&r=YviX1%2F1vaAZK%2BrqaSzu%2FMg%3D%3D%0A&m=nqUWryDWO2r6Fj9i0OEy07UOTVrxUso3DR8zhj1PO1k%3D%0A&s=1ad213f27da032cb3e6a765bd947c9722513c909f83d3ed977b9815ac9a739c0>

Mesos is questionable per https://issues.apache.org/jira/browse/MESOS-1027 , did anyone get it working?

Standalone: Even though the setup below worked in single-node mode, it failed with the following when I tried to connect to a remote master; nor did it work with the IPv6 address directly, like "./bin/spark-shell --master spark://[2401:db00:2030:709b:face:0:9:0]:7078"
client side:

[root@dispark002.ash3 ~/spark-1.4.0-bin-hadoop2.6]# ./bin/spark-shell --master spark://dispark001:7078

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).

log4j:WARN Please initialize the log4j system properly.

log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

15/06/24 10:34:03 INFO SecurityManager: Changing view acls to: root

15/06/24 10:34:03 INFO SecurityManager: Changing modify acls to: root

15/06/24 10:34:03 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)

15/06/24 10:34:03 INFO HttpServer: Starting HTTP Server

15/06/24 10:34:03 INFO Utils: Successfully started service 'HTTP class server' on port 49189.

Welcome to

      ____              __

     / __/__  ___ _____/ /__

    _\ \/ _ \/ _ `/ __/  '_/

   /___/ .__/\_,_/_/ /_/\_\   version 1.4.0

      /_/


Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_25)

Type in expressions to have them evaluated.

Type :help for more information.

15/06/24 10:34:05 INFO SparkContext: Running Spark version 1.4.0

15/06/24 10:34:05 INFO SecurityManager: Changing view acls to: root

15/06/24 10:34:05 INFO SecurityManager: Changing modify acls to: root

15/06/24 10:34:05 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)

15/06/24 10:34:06 INFO Slf4jLogger: Slf4jLogger started

15/06/24 10:34:06 INFO Remoting: Starting remoting

15/06/24 10:34:06 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@dispark002:59150]

15/06/24 10:34:06 INFO Utils: Successfully started service 'sparkDriver' on port 59150.

15/06/24 10:34:06 INFO SparkEnv: Registering MapOutputTracker

15/06/24 10:34:06 INFO SparkEnv: Registering BlockManagerMaster

15/06/24 10:34:06 INFO DiskBlockManager: Created local directory at /tmp/spark-b4248e03-80c2-4d54-b3af-5044c8228f68/blockmgr-bb240921-31bf-48da-b96a-7120f118d002

15/06/24 10:34:06 INFO MemoryStore: MemoryStore started with capacity 265.1 MB

15/06/24 10:34:06 INFO HttpFileServer: HTTP File server directory is /tmp/spark-b4248e03-80c2-4d54-b3af-5044c8228f68/httpd-a7cbeb43-aefd-4da8-8df2-89a528b35c9e

15/06/24 10:34:06 INFO HttpServer: Starting HTTP Server

15/06/24 10:34:06 INFO Utils: Successfully started service 'HTTP file server' on port 57293.

15/06/24 10:34:06 INFO SparkEnv: Registering OutputCommitCoordinator

15/06/24 10:34:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.

15/06/24 10:34:06 INFO SparkUI: Started SparkUI at http://[2401:db00:2030:709b:face:0:f:0]:4040

15/06/24 10:34:06 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@dispark001:7078/user/Master...

15/06/24 10:34:06 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20150624103406-0004

15/06/24 10:34:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 64775.

15/06/24 10:34:06 INFO NettyBlockTransferService: Server created on 64775

15/06/24 10:34:06 ERROR SparkContext: Error initializing SparkContext.

java.lang.AssertionError: assertion failed: Expected hostname

at scala.Predef$.assert(Predef.scala:179)

at org.apache.spark.util.Utils$.checkHost(Utils.scala:882)

at org.apache.spark.storage.BlockManagerId.<init>(BlockManagerId.scala:48)

at org.apache.spark.storage.BlockManagerId$.apply(BlockManagerId.scala:107)

at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:188)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:502)

at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)

at $line3.$read$$iwC$$iwC.<init>(<console>:9)

at $line3.$read$$iwC.<init>(<console>:18)

at $line3.$read.<init>(<console>:20)

at $line3.$read$.<init>(<console>:24)

at $line3.$read$.<clinit>(<console>)

at $line3.$eval$.<init>(<console>:7)

at $line3.$eval$.<clinit>(<console>)

at $line3.$eval.$print(<console>)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)

at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)

at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)

at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)

at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)

at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)

at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)

at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)

at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)

at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)

at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)

at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)

at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)

at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)

at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)

at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)

at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)

at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)

at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)

at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)

at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)

at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)

at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)

at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)

at org.apache.spark.repl.Main$.main(Main.scala:31)

at org.apache.spark.repl.Main.main(Main.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)

at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)

at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)

at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)

at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)

at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)



server side:

15/06/24 10:34:06 INFO Master: Registering app Spark shell

15/06/24 10:34:06 INFO Master: Registered app Spark shell with ID app-20150624103406-0004

15/06/24 10:34:06 INFO Master: Received unregister request from application app-20150624103406-0004

15/06/24 10:34:06 INFO Master: Removing app app-20150624103406-0004

15/06/24 10:34:07 INFO Master: akka.tcp://sparkDriver@dispark002:59150 got disassociated, removing it.



From: Kevin Liu <ke...@fb.com>
Date: Wednesday, June 17, 2015 at 11:21 AM
To: Akhil Das <ak...@sigmoidanalytics.com>
Cc: "user@spark.apache.org" <us...@spark.apache.org>
Subject: Re: IPv6 support

You Sir - are a genius. Thank you so much, works now…

Still wondering why I have to do this for IPv6 only machines when the default just works on dual-stack machines, but I am happy enough for now.

Kevin

From: Akhil Das <ak...@sigmoidanalytics.com>
Date: Wednesday, June 17, 2015 at 12:33 AM
To: Kevin Liu <ke...@fb.com>
Cc: "user@spark.apache.org" <us...@spark.apache.org>
Subject: Re: IPv6 support

If you look at this,


15/06/16 22:25:14 INFO Executor: Starting executor ID driver on host localhost

15/06/16 22:25:14 ERROR SparkContext: Error initializing SparkContext.

java.lang.AssertionError: assertion failed: Expected hostname

your spark.driver.host <https://github.com/apache/spark/blob/3c0156899dc1ec1f7dfe6d7c8af47fa6dc7d00bf/core/src/main/scala/org/apache/spark/util/RpcUtils.scala#L33> is being set to localhost, which fails the host.indexOf(':') == -1 check<https://github.com/apache/spark/blob/branch-1.4/core/src/main/scala/org/apache/spark/util/Utils.scala#L882>. Try setting spark.driver.host to dispark001.ash3 (on whichever machine you are running the code).
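
(For context, the failing check is essentially this one-liner in Utils.scala on branch-1.4, paraphrased from the source linked above; an IPv6 literal always contains ':', so a bare address can never pass it:)

// Sketch of org.apache.spark.util.Utils.checkHost as of Spark 1.4;
// callers pass "Expected hostname" as the assertion message.
def checkHost(host: String, message: String = "") {
  assert(host.indexOf(':') == -1, message)
}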




Thanks
Best Regards

On Wed, Jun 17, 2015 at 10:57 AM, Kevin Liu <ke...@fb.com> wrote:
Thanks Akhil, it seems that what you said should have fixed it.

1) There was a known issue directly related to below, but it has been fixed in 1.4 https://issues.apache.org/jira/browse/SPARK-6440
2) Now, with 1.4, I see the following errors even after setting SPARK_MASTER_IP. Your description seems to be right on; any more thoughts?


[root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]# ping6 dispark001.ash3

PING dispark001.ash3(dispark001.ash3.facebook.com) 56 data bytes

64 bytes from dispark001.ash3.facebook.com: icmp_seq=1 ttl=64 time=0.012 ms

64 bytes from dispark001.ash3.facebook.com: icmp_seq=2 ttl=64 time=0.021 ms

64 bytes from dispark001.ash3.facebook.com: icmp_seq=3 ttl=64 time=0.011 ms

^C

--- dispark001.ash3 ping statistics ---

3 packets transmitted, 3 received, 0% packet loss, time 2113ms

rtt min/avg/max/mdev = 0.011/0.014/0.021/0.006 ms

[root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]# export SPARK_MASTER_IP="dispark001.ash3"

[root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]# ./bin/run-example SparkPi 10

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

15/06/16 22:25:13 INFO SparkContext: Running Spark version 1.4.0

15/06/16 22:25:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

15/06/16 22:25:13 INFO SecurityManager: Changing view acls to: root

15/06/16 22:25:13 INFO SecurityManager: Changing modify acls to: root

15/06/16 22:25:13 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)

15/06/16 22:25:13 INFO Slf4jLogger: Slf4jLogger started

15/06/16 22:25:13 INFO Remoting: Starting remoting

15/06/16 22:25:13 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@2401:db00:2030:709b:face:0:9:0:50916]

15/06/16 22:25:13 INFO Utils: Successfully started service 'sparkDriver' on port 50916.

15/06/16 22:25:13 INFO SparkEnv: Registering MapOutputTracker

15/06/16 22:25:13 INFO SparkEnv: Registering BlockManagerMaster

15/06/16 22:25:13 INFO DiskBlockManager: Created local directory at /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66/blockmgr-4bcbf896-03ad-4b11-8db5-a6eaa5f0222b

15/06/16 22:25:13 INFO MemoryStore: MemoryStore started with capacity 265.1 MB

15/06/16 22:25:13 INFO HttpFileServer: HTTP File server directory is /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66/httpd-2b6a5162-0686-4cc5-accb-1bb66fddf705

15/06/16 22:25:13 INFO HttpServer: Starting HTTP Server

15/06/16 22:25:13 INFO Utils: Successfully started service 'HTTP file server' on port 35895.

15/06/16 22:25:13 INFO SparkEnv: Registering OutputCommitCoordinator

15/06/16 22:25:13 INFO Utils: Successfully started service 'SparkUI' on port 4040.

15/06/16 22:25:13 INFO SparkUI: Started SparkUI at http://[2401:db00:2030:709b:face:0:9:0]:4040

15/06/16 22:25:14 INFO SparkContext: Added JAR file:/root/spark-1.4.0-bin-hadoop2.6/lib/spark-examples-1.4.0-hadoop2.6.0.jar at http://[2401:db00:2030:709b:face:0:9:0]:35895/jars/spark-examples-1.4.0-hadoop2.6.0.jar with timestamp 1434518714122

15/06/16 22:25:14 INFO Executor: Starting executor ID driver on host localhost

15/06/16 22:25:14 ERROR SparkContext: Error initializing SparkContext.

java.lang.AssertionError: assertion failed: Expected hostname

at scala.Predef$.assert(Predef.scala:179)

at org.apache.spark.util.Utils$.checkHost(Utils.scala:882)

at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:35)

at org.apache.spark.executor.Executor.<init>(Executor.scala:413)

at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:53)

at org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:103)

at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)

at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)

at org.apache.spark.examples.SparkPi.main(SparkPi.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)

at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)

at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)

at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)

at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)

at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

15/06/16 22:25:14 INFO SparkUI: Stopped Spark web UI at http://[2401:db00:2030:709b:face:0:9:0]:4040

15/06/16 22:25:14 INFO DAGScheduler: Stopping DAGScheduler

15/06/16 22:25:14 ERROR SparkContext: Error stopping SparkContext after init error.

java.lang.NullPointerException

at org.apache.spark.scheduler.local.LocalBackend.stop(LocalBackend.scala:107)

at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:416)

at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1404)

at org.apache.spark.SparkContext.stop(SparkContext.scala:1642)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:565)

at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)

at org.apache.spark.examples.SparkPi.main(SparkPi.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)

at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)

at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)

at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)

at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)

at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Exception in thread "main" java.lang.AssertionError: assertion failed: Expected hostname

at scala.Predef$.assert(Predef.scala:179)

at org.apache.spark.util.Utils$.checkHost(Utils.scala:882)

at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:35)

at org.apache.spark.executor.Executor.<init>(Executor.scala:413)

at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:53)

at org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:103)

at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:141)

at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)

at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:28)

at org.apache.spark.examples.SparkPi.main(SparkPi.scala)

at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:483)

at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)

at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)

at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)

at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)

at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

15/06/16 22:25:14 INFO DiskBlockManager: Shutdown hook called

15/06/16 22:25:14 INFO Utils: path = /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66/blockmgr-4bcbf896-03ad-4b11-8db5-a6eaa5f0222b, already present as root for deletion.

15/06/16 22:25:14 INFO Utils: Shutdown hook called

15/06/16 22:25:14 INFO Utils: Deleting directory /tmp/spark-325ae594-5b00-42e2-bd47-70c92c4e5f66

[root@dispark001.ash3 ~/spark-1.4.0-bin-hadoop2.6]#


From: Akhil Das <ak...@sigmoidanalytics.com>
Date: Monday, May 25, 2015 at 9:33 AM
To: Kevin Liu <ke...@fb.com>
Cc: "user@spark.apache.org" <us...@spark.apache.org>
Subject: Re: IPv6 support

Hi Kevin,

Did you try adding a host name for the IPv6 address? I have a few IPv6 boxes; Spark failed for me when I used just the IPv6 addresses, but it works fine when I use the host names.

Here's an entry in my /etc/hosts:

2607:5300:0100:0200:0000:0000:0000:0a4d hacked.work

My spark-env.sh file:

export SPARK_MASTER_IP="hacked.work"
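
(With the host name wired up like this, the usual launch commands work unchanged; an illustrative session, assuming the default standalone master port 7077:)

./sbin/start-master.sh
./bin/spark-shell --master spark://hacked.work:7077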

Here's the master listening on my v6:

[Inline image 1]

The Master UI with running spark-shell:

[Inline image 2]

I even ran a simple sc.parallelize(1 to 100).collect().



Thanks
Best Regards

On Wed, May 20, 2015 at 11:09 PM, Kevin Liu <ke...@fb.com> wrote:
Hello, I have to work with IPv6 only servers and when I installed the
1.3.1 hadoop 2.6 build, I couldn't get the example to run due to IPv6
issues (errors below). I tried to add the
-Djava.net.preferIPv6Addresses=true setting but it still doesn't work. A
search on Spark's support for IPv6 is inconclusive. Can someone help
clarify the current status for IPv6?
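
(For reference, one common way to apply such a JVM flag to both the driver and the executors is via conf/spark-defaults.conf; the entries below are purely illustrative of where the flag goes, since the flag itself did not help here:)

spark.driver.extraJavaOptions     -Djava.net.preferIPv6Addresses=true
spark.executor.extraJavaOptions   -Djava.net.preferIPv6Addresses=true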

Thanks
Kevin


-- errors --

15/05/20 10:17:30 INFO Executor: Fetching
http://2401:db00:2030:709b:face:0:9:0:51453/jars/spark-examples-1.3.1-hadoop2.6.0.jar with timestamp 1432142250197
15/05/20 10:17:30 INFO Executor: Fetching
http://2401:db00:2030:709b:face:0:9:0:51453/jars/spark-examples-1.3.1-hadoop2.6.0.jar with timestamp 1432142250197
15/05/20 10:17:30 ERROR Executor: Exception in task 5.0 in stage 0.0 (TID 5)
java.net.MalformedURLException: For input string:
"db00:2030:709b:face:0:9:0:51453"
        at java.net.URL.<init>(URL.java:620)
        at java.net.URL.<init>(URL.java:483)
        at java.net.URL.<init>(URL.java:432)
        at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:603)
        at org.apache.spark.util.Utils$.fetchFile(Utils.scala:431)
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:374)
        at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:366)
        at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
        at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
        at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
        at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
        at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
        at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:366)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:184)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NumberFormatException: For input string:
"db00:2030:709b:face:0:9:0:51453"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:580)
        at java.lang.Integer.parseInt(Integer.java:615)
        at java.net.URLStreamHandler.parseURL(URLStreamHandler.java:216)
        at java.net.URL.<init>(URL.java:615)
        ... 18 more
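
(The parse failure is easy to reproduce outside Spark: java.net.URL treats everything after the first ':' in the authority as a port number unless the IPv6 literal is bracketed per RFC 2732. A minimal Scala REPL illustration, using the address from the trace and a hypothetical jar path:)

import java.net.URL

// Unbracketed, the host is taken as "2401" and the remainder as the port,
// which yields exactly the NumberFormatException shown above:
// new URL("http://2401:db00:2030:709b:face:0:9:0:51453/jars/x.jar")  // MalformedURLException

// Bracketed, the same address parses cleanly:
val ok = new URL("http://[2401:db00:2030:709b:face:0:9:0]:51453/jars/x.jar")
println(ok.getHost)  // [2401:db00:2030:709b:face:0:9:0]
println(ok.getPort)  // 51453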












Re: IPv6 support

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
It's the BlockManager hostname
<https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/storage/BlockManager.scala#L190>
messing up this time. Do you have spark.driver.host set in the
conf/spark-defaults.conf file?

Thanks
Best Regards
