Posted to user@spark.apache.org by me <me...@rsi2m.dev> on 2023/04/05 11:14:47 UTC

Troubleshooting ArrayIndexOutOfBoundsException in long running Spark application

 
 
Dear Apache Spark users,
 
 

I have a long-running Spark application that encounters an ArrayIndexOutOfBoundsException roughly once every two weeks. The exception does not disrupt the operation of my app, but I'm still concerned about it and would like to find the root cause.

 

Here's some additional information about my setup:

 

- Spark is running in standalone mode
- Spark version: 3.3.1
- Scala version: 2.12.15
- I'm using Spark Structured Streaming

 

Here's the relevant error message:

java.lang.ArrayIndexOutOfBoundsException: Index 59 out of bounds for length 16

I've reviewed the code and searched online, but I'm still unable to find a solution. The full stack trace is available here: https://gist.github.com/rsi2m/ae54eccac93ae602d04d383e56c1a737

I would appreciate any insights or suggestions on how to resolve this issue. Thank you in advance for your help.
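For reference, the message is the JVM's standard bounds-check diagnostic: some code asked for element 59 of an array that only holds 16 elements (often a symptom of a record or buffer claiming more fields than were actually allocated). A minimal, Spark-free Java sketch of what produces this exact message (the 16-slot buffer here is purely illustrative, not my actual code path):

```java
public class OobDemo {
    public static void main(String[] args) {
        int[] buffer = new int[16]; // hypothetical 16-slot buffer
        try {
            // Out-of-range read: index 59 on a length-16 array,
            // mirroring the message seen in the Spark logs.
            int v = buffer[59];
            System.out.println(v);
        } catch (ArrayIndexOutOfBoundsException e) {
            // Prints: Index 59 out of bounds for length 16
            System.out.println(e.getMessage());
        }
    }
}
```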

 

Best regards,
 rsi2m