Posted to user@spark.apache.org by buntu <bu...@gmail.com> on 2014/07/23 10:52:21 UTC

spark-shell -- running into ArrayIndexOutOfBoundsException

I'm using spark-shell locally on a dataset of about 900MB. I initially ran
into a "java.lang.OutOfMemoryError: GC overhead limit exceeded" error and,
after some research, set SPARK_DRIVER_MEMORY to 4g.
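
For reference, I'm setting it before launching the shell:

  export SPARK_DRIVER_MEMORY=4g
  bin/spark-shell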

Now I run into an ArrayIndexOutOfBoundsException instead. Please let me know
if there is some way to fix this:

ERROR Executor: Exception in task ID 8
java.lang.ArrayIndexOutOfBoundsException: 1
	at $line14.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
	at $line14.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$1.next(Iterator.scala:853)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:158)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.Task.run(Task.scala:51)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
14/07/23 01:39:12 ERROR Executor: Exception in task ID 2
java.lang.ArrayIndexOutOfBoundsException: 3
	at $line14.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
	at $line14.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$1.next(Iterator.scala:853)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:158)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.Task.run(Task.scala:51)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
14/07/23 01:39:12 WARN TaskSetManager: Lost TID 8 (task 1.0:8)
14/07/23 01:39:12 WARN TaskSetManager: Loss was due to java.lang.ArrayIndexOutOfBoundsException
java.lang.ArrayIndexOutOfBoundsException: 1
	at $line14.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
	at $line14.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$1.next(Iterator.scala:853)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:158)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.Task.run(Task.scala:51)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
14/07/23 01:39:12 ERROR TaskSetManager: Task 1.0:8 failed 1 times; aborting job
14/07/23 01:39:12 WARN TaskSetManager: Loss was due to java.lang.ArrayIndexOutOfBoundsException
java.lang.ArrayIndexOutOfBoundsException: 3
	at $line14.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
	at $line14.$read$$iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$1.next(Iterator.scala:853)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
	at scala.collection.Iterator$class.foreach(Iterator.scala:727)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:158)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
	at org.apache.spark.scheduler.Task.run(Task.scala:51)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
14/07/23 01:39:12 ERROR Executor: Exception in task ID 9
org.apache.spark.TaskKilledException
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
14/07/23 01:39:12 WARN TaskSetManager: Loss was due to org.apache.spark.TaskKilledException
org.apache.spark.TaskKilledException
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
14/07/23 01:39:13 ERROR Executor: Exception in task ID 10
org.apache.spark.TaskKilledException
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:178)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
14/07/23 01:39:13 WARN TaskSetManager: Task 1 was killed.
14/07/23 01:39:13 WARN TaskSetManager: Task 6 was killed.
14/07/23 01:39:13 WARN TaskSetManager: Task 0 was killed.
14/07/23 01:39:13 WARN TaskSetManager: Task 3 was killed.
14/07/23 01:39:13 WARN TaskSetManager: Task 4 was killed.
14/07/23 01:39:13 WARN TaskSetManager: Task 7 was killed.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 1.0:8 failed 1 times, most recent failure: Exception failure in TID 8 on host localhost: java.lang.ArrayIndexOutOfBoundsException: 1
        $iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
        $iwC$$iwC$$iwC$$iwC$$anonfun$2.apply(<console>:21)
        scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        scala.collection.Iterator$$anon$1.next(Iterator.scala:853)
        scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:158)
        org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
        org.apache.spark.scheduler.Task.run(Task.scala:51)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1033)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1017)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1015)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1015)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:633)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:633)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:633)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1207)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
	at akka.actor.ActorCell.invoke(ActorCell.scala:456)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
	at akka.dispatch.Mailbox.run(Mailbox.scala:219)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)


Thanks!




Re: spark-shell -- running into ArrayIndexOutOfBoundsException

Posted by buntu <bu...@gmail.com>.
Turns out this was an issue with the number of fields being read: one of the
fields may be missing from some rows of the raw data file, causing this error.
Michael Armbrust pointed it out in another thread.
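
For anyone who hits the same thing, here's a minimal defensive-parsing sketch
(assuming the six-field, tab-delimited layout described later in this thread;
the Point field names and date format are stand-ins):

  import java.text.SimpleDateFormat
  import java.util.Date

  case class Point(day: String, f1: String, f2: Int, f3: Int, f4: Int, f5: String)
  val df = new SimpleDateFormat("yyyy-MM-dd")

  val points = sc.textFile("rawdata/*")
    .map(_.split("\t", -1))   // the -1 limit keeps trailing empty fields
    .filter(_.length >= 6)    // skip rows that are missing fields
    .map(p => Point(df.format(new Date(p(0).trim.toLong * 1000L)),
                    p(1), p(2).trim.toInt, p(3).trim.toInt, p(4).trim.toInt, p(5)))

Note that split("\t") with no limit silently drops trailing empty strings, so
a row whose last field is empty comes back shorter than expected;
split("\t", -1) preserves them.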




Re: spark-shell -- running into ArrayIndexOutOfBoundsException

Posted by buntu <bu...@gmail.com>.
Just wanted to add more info: I was using Spark SQL, reading in the
tab-delimited raw data files and converting the timestamp to a Date format:

  sc.textFile("rawdata/*").map(_.split("\t")).map(p =>
    Point(df.format(new Date(p(0).trim.toLong * 1000L)),
          p(1), p(2).trim.toInt, p(3).trim.toInt, p(4).trim.toInt, p(5)))
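
(A quick sanity check for short rows, as a sketch: something like

  sc.textFile("rawdata/*").map(_.split("\t", -1)).filter(_.length != 6).count()

should return 0 if every row really has six fields.)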

Then I register it as a table, and when I run a simple query like
select count(*) from the table, I get the ArrayIndexOutOfBoundsException.
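
For completeness, the remaining steps look roughly like this (a sketch,
assuming Spark 1.0.x Spark SQL; "points" is a stand-in name for the mapped
RDD and the registered table):

  import org.apache.spark.sql.SQLContext
  val sqlContext = new SQLContext(sc)
  import sqlContext.createSchemaRDD   // implicit conversion from case-class RDDs

  points.registerAsTable("points")
  sqlContext.sql("select count(*) from points").collect()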

I bumped SPARK_DRIVER_MEMORY up to 8g, but that still didn't help:
  export SPARK_DRIVER_MEMORY=8g

Let me know if I'm missing any steps. Thanks!


