Posted to user@spark.apache.org by Bill Jay <bi...@gmail.com> on 2014/11/03 22:52:15 UTC

Spark streaming job failed due to "java.util.concurrent.TimeoutException"

Hi all,

I have a Spark Streaming job that consumes data from Kafka and performs some
simple operations on the data. The job runs on an EMR cluster with 10 nodes.
The batch interval is 1 minute, and it takes around 10 seconds to generate
the results, which are inserted into a MySQL database. However, after running
for more than 2 days, the job failed, and the log contained a long list of
the following error:
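
For context, the job is structured roughly like the sketch below. The
ZooKeeper address, consumer group, topic, the count-by-key aggregation, and
the MySQL connection details are simplified placeholders standing in for the
real ones:

import java.sql.DriverManager

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, StreamingContext}
import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaToMySqlJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaToMySqlJob")
    // 1-minute batch interval, as described above
    val ssc = new StreamingContext(conf, Minutes(1))

    // Receiver-based Kafka stream; the ZooKeeper quorum, consumer group and
    // topic name here are placeholders
    val lines = KafkaUtils.createStream(
      ssc, "zk-host:2181", "consumer-group", Map("events" -> 2)).map(_._2)

    // Stand-in for the "simple operations": a per-batch count by key
    val counts = lines.map(line => (line, 1L)).reduceByKey(_ + _)

    // Insert each batch's results into MySQL, one JDBC connection per partition
    counts.foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        // Connection URL, table and credentials are placeholders
        val conn = DriverManager.getConnection(
          "jdbc:mysql://db-host:3306/mydb", "user", "password")
        val stmt = conn.prepareStatement(
          "INSERT INTO counts (key_col, cnt) VALUES (?, ?)")
        partition.foreach { case (k, v) =>
          stmt.setString(1, k)
          stmt.setLong(2, v)
          stmt.executeUpdate()
        }
        stmt.close()
        conn.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}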


java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]

Does anyone know the reason? Thanks!

Bill