Posted to user@spark.apache.org by Cosmin Posteuca <co...@gmail.com> on 2017/06/15 12:40:16 UTC

Spark doesn't run all code when submitted in yarn-cluster mode.

Hi,

I have the following problem:

After the SparkSession is initialized I create a task:

 val task = new Runnable { ... }

where I call a REST API and, from its response, read some data from the
internet/ES/Hive. This task is scheduled to run every 5 seconds with the
Akka scheduler:

scheduler.schedule(Duration(0, TimeUnit.SECONDS), Duration(5, TimeUnit.SECONDS), task)
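
For clarity, here is a minimal self-contained sketch of how I wire this up
(the object name, app name and ActorSystem name are simplified placeholders,
not my exact code):

    import java.util.concurrent.TimeUnit

    import scala.concurrent.duration.Duration

    import akka.actor.ActorSystem
    import org.apache.spark.sql.SparkSession

    object ScheduledApp {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("scheduled-tasks")
          .getOrCreate()

        val system = ActorSystem("scheduler-system")
        import system.dispatcher // implicit ExecutionContext used by the scheduler

        // task that calls a REST API and, based on the response,
        // reads some data from internet/ES/Hive
        val task = new Runnable {
          override def run(): Unit = {
            // ... REST call and reads go here ...
          }
        }

        // run immediately, then every 5 seconds
        system.scheduler.schedule(
          Duration(0, TimeUnit.SECONDS),
          Duration(5, TimeUnit.SECONDS),
          task)
      }
    }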

My problem is that when I run the code in yarn-cluster mode, the SparkSession
is closed immediately after initialization, and the application does not wait
for the tasks that will be scheduled later. In yarn-client mode everything is
fine: the SparkSession is not closed automatically and the tasks run every
5 seconds.

What is the problem when I start the Spark application in yarn-cluster mode?
How does submission work when I use yarn-cluster?
How can I resolve this problem and make it work in yarn-cluster mode?
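
Is the expected fix to keep the driver's main thread blocked after scheduling
the task? Something like the sketch below is what I have in mind (just a guess
on my part; system is the ActorSystem from the sketch above):

    import scala.concurrent.Await
    import scala.concurrent.duration.Duration

    // block the driver's main thread so main() does not return right after
    // scheduling; terminating the ActorSystem would release it
    Await.result(system.whenTerminated, Duration.Inf)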

P.S. I use Spark 2.1.1 and akka-actor_2.11 2.4.17.

Thanks,
Cosmin