Posted to user@spark.apache.org by swetha <sw...@gmail.com> on 2015/08/19 07:06:04 UTC

Failed to fetch block error

Hi,

I see the following error in my Spark job even after allocating around 100 cores
and 16G of memory. Has anyone run into this problem before?

15/08/18 21:51:23 ERROR shuffle.RetryingBlockFetcher: Failed to fetch block
input-0-1439959114400, and will not retry (0 retries)
java.lang.RuntimeException: java.io.FileNotFoundException:
/data1/spark/spark-aed30958-2ee1-4eb7-984e-6402fb0a0503/blockmgr-ded36b52-ccc7-48dc-ba05-65bb21fc4136/34/input-0-1439959114400
(Too many open files)
	at java.io.RandomAccessFile.open(Native Method)
	at java.io.RandomAccessFile.<init>(RandomAccessFile.java:241)
	at org.apache.spark.storage.DiskStore.getBytes(DiskStore.scala:110)
	at org.apache.spark.storage.DiskStore.getBytes(DiskStore.scala:134)
	at org.apache.spark.storage.BlockManager.doGetLocal(BlockManager.scala:511)
	at org.apache.spark.storage.BlockManager.getBlockData(BlockManager.scala:302)
	at org.apache.spark.network.netty.NettyBlockRpcServer$$anonfun$2.apply(NettyBlockRpcServer.scala:57)
	at org.apache.spark.network.netty.NettyBlockRpcServer$$anonfun$2.apply(NettyBlockRpcServer.scala:57)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Failed-to-fetch-block-error-tp24335.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


RE: Failed to fetch block error

Posted by java8964 <ja...@hotmail.com>.
From the log, it looks like the OS user who is running Spark cannot open any more files.
Check your ulimit setting for that user:

$ ulimit -a
open files                      (-n) 65536
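For reference, a minimal sketch of how to inspect and raise the open-files limit on a typical Linux machine. The user name `sparkuser` and the value 65536 are illustrative, not taken from the thread; substitute your own values:

```shell
# Show the current soft limit on open file descriptors for this shell/user.
ulimit -n

# Show the hard limit (the ceiling a non-root user can raise the soft limit to).
ulimit -Hn

# Raise the soft limit for the current session, if the hard limit allows it.
ulimit -n 65536 2>/dev/null || echo "hard limit too low; raise it in /etc/security/limits.conf"

# To count descriptors held by a running Spark executor, substitute its PID:
#   ls /proc/<executor-pid>/fd | wc -l

# To make the change permanent, add lines like these to
# /etc/security/limits.conf and re-login (values are illustrative):
#   sparkuser  soft  nofile  65536
#   sparkuser  hard  nofile  65536
```

Note that `ulimit` only affects the shell it runs in and its children, so the Spark daemons must be (re)started from a session that already has the higher limit.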
