Posted to dev@spark.apache.org by Kousuke Saruta <sa...@oss.nttdata.co.jp> on 2016/07/28 23:15:58 UTC

Re: ERROR: java.net.UnknownHostException

Hi Miki,

What version of Spark are you using?
If your version is newer than 1.4, you might be hitting SPARK-11227.
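
As far as I understand, SPARK-11227 is about the HDFS HA settings (dfs.nameservices and friends) not being visible where the HDFS client is created, so the logical nameservice name (here "hdpha") cannot be resolved. As a rough illustration only (the NameNode hosts below are placeholders, not your cluster's actual addresses), one way to make those settings visible is to pass them through "spark.hadoop."-prefixed properties, which Spark copies into the Hadoop Configuration:

  import org.apache.spark.{SparkConf, SparkContext}

  // Placeholder host names and ports; substitute the real NameNodes of the "hdpha" nameservice.
  val conf = new SparkConf()
    .setAppName("ha-read-example")
    // Properties prefixed with "spark.hadoop." are copied into the Hadoop Configuration.
    .set("spark.hadoop.dfs.nameservices", "hdpha")
    .set("spark.hadoop.dfs.ha.namenodes.hdpha", "nn1,nn2")
    .set("spark.hadoop.dfs.namenode.rpc-address.hdpha.nn1", "namenode1.example.com:8020")
    .set("spark.hadoop.dfs.namenode.rpc-address.hdpha.nn2", "namenode2.example.com:8020")
    .set("spark.hadoop.dfs.client.failover.proxy.provider.hdpha",
         "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider")

  val sc = new SparkContext(conf)
  val lines = sc.textFile("hdfs://hdpha/path/to/input")
  println(lines.count())

Shipping a correct hdfs-site.xml (or pointing HADOOP_CONF_DIR at one) to the executors achieves the same thing; the snippet above just spells out which keys have to be resolvable.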

- Kousuke

On 2016/07/28 18:34, Miki Shingo wrote:

> To anyone who has knowledge of this:
>
> I have faced the following error when trying to use an HA configuration.
> (java.net.UnknownHostException)
>
> Below is the error for reference.
>
> 16/07/27 22:42:56 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, dphmuyarn1107.hadoop.local): java.lang.IllegalArgumentException: java.net.UnknownHostException: hdpha
>          at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:411)
>          at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:311)
>          at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
>          at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:678)
>          at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
>          at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:150)
>          at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2653)
>          at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:92)
>          at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2687)
>          at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2669)
>          at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
>          at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:170)
>          at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:656)
>          at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:438)
>          at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:411)
>          at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$33.apply(SparkContext.scala:1038)
>          at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$33.apply(SparkContext.scala:1038)
>          at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:178)
>          at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:178)
>          at scala.Option.map(Option.scala:145)
>          at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:178)
>          at org.apache.spark.rdd.HadoopRDD$$anon$1.<init>(HadoopRDD.scala:216)
>          at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:212)
>          at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:101)
>          at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:313)
>          at org.apache.spark.rdd.RDD.iterator(RDD.scala:277)
>          at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>          at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:313)
>          at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:69)
>          at org.apache.spark.rdd.RDD.iterator(RDD.scala:275)
>          at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>          at org.apache.spark.scheduler.Task.run(Task.scala:89)
>          at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
>          at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>          at java.lang.Thread.run(Thread.java:745)
> Caused by: java.net.UnknownHostException: hdpha
>          ... 36 more
>
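> In case it helps, the read that hits this is just a plain Hadoop text read against the logical nameservice, roughly like the following (the path is a placeholder, not the actual job; sc is the SparkContext):
>
>   val rdd = sc.textFile("hdfs://hdpha/path/to/input")
>   rdd.cache()
>   println(rdd.count())
>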
> Thanks & Regards
>
>    Miki
>


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org