Posted to user@spark.apache.org by Dibyendu Bhattacharya <di...@gmail.com> on 2016/07/13 06:26:49 UTC

Any Idea about this error : IllegalArgumentException: File segment length cannot be negative ?

In a Spark Streaming job, I see a batch that failed with the following error. I
haven't seen anything like this before.

This happened during the shuffle for one batch (it hasn't reoccurred since).
I'm just curious to know what can cause this error. I am running Spark
1.5.1

Regards,
Dibyendu


Job aborted due to stage failure: Task 2801 in stage 9421.0 failed 4
times, most recent failure: Lost task 2801.3 in stage 9421.0:
java.lang.IllegalArgumentException: requirement failed: File segment
length cannot be negative (got -68321)
	at scala.Predef$.require(Predef.scala:233)
	at org.apache.spark.storage.FileSegment.<init>(FileSegment.scala:28)
	at org.apache.spark.storage.DiskBlockObjectWriter.fileSegment(DiskBlockObjectWriter.scala:216)
	at org.apache.spark.util.collection.ExternalSorter.writePartitionedFile(ExternalSorter.scala:684)
	at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:80)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:88)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
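
For context on where this throws: per the stack trace, the exception comes from a precondition check in the FileSegment constructor, which records an (offset, length) window into an on-disk shuffle file. The sketch below is a simplified illustration of that invariant (hypothetical class name, not Spark's exact source); a negative length implies the writer's committed position ended up before the segment's start offset.

```scala
import java.io.File

// Simplified sketch of the invariant that fails in
// org.apache.spark.storage.FileSegment (illustrative, not Spark's source).
// A segment is an (offset, length) window into a shuffle file on disk;
// length < 0 means the recorded end position is before the start offset.
class FileSegmentSketch(val file: File, val offset: Long, val length: Long) {
  require(offset >= 0, s"File segment offset cannot be negative (got $offset)")
  require(length >= 0, s"File segment length cannot be negative (got $length)")
}
```

With a negative length such as the -68321 in the error above, the `require` raises the same `java.lang.IllegalArgumentException: requirement failed: File segment length cannot be negative` seen in the trace.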

Re: Any Idea about this error : IllegalArgumentException: File segment length cannot be negative ?

Posted by Victor Tso-Guillen <vt...@paxata.com>.
Along with Priya's email slightly earlier than this one, we are also seeing
this happen on Spark 1.5.2.

On Wed, Jul 13, 2016 at 1:26 AM Dibyendu Bhattacharya <
dibyendu.bhattachary@gmail.com> wrote:
