Posted to user@spark.apache.org by Mohammad Tariq <do...@gmail.com> on 2017/04/18 00:41:34 UTC

Application not found in RM

Dear fellow Spark users,

*Use case :* I have written a small Java client which launches multiple
Spark jobs through *SparkLauncher* and captures the jobs' metrics over the
course of their execution.

*Issue :* Sometimes the client fails with -
*Caused by:
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.yarn.exceptions.ApplicationNotFoundException):
Application with id 'application_APP_ID' doesn't exist in RM.*

I am using *YarnClient.getApplicationReport(ApplicationId appId)* to get the
desired metrics. I force the threads to sleep for some time so that the
applications actually get started before I query for these metrics. Most
of the time it works. However, I feel this is not the correct approach.
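To illustrate, here is a minimal, self-contained sketch of the bounded polling I have in mind instead of a fixed sleep (the helper name and parameters are illustrative; in the real client the supplier would wrap *YarnClient.getApplicationReport(appId)* and return null when *ApplicationNotFoundException* is thrown):

```java
import java.util.function.Supplier;

public class RetryPoll {
    /**
     * Poll {@code fetch} until it returns non-null, up to maxAttempts,
     * sleeping delayMillis between tries. A null result from fetch means
     * "not visible in the RM yet".
     */
    public static <T> T pollUntilAvailable(Supplier<T> fetch,
                                           int maxAttempts,
                                           long delayMillis) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            T result = fetch.get();            // null while not yet available
            if (result != null) {
                return result;
            }
            if (attempt < maxAttempts) {
                try {
                    Thread.sleep(delayMillis); // back off before retrying
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return null;               // give up if interrupted
                }
            }
        }
        return null;                           // attempts exhausted; caller handles timeout
    }
}
```

Alternatively, since the jobs are started via *SparkLauncher*, *SparkLauncher.startApplication* returns a *SparkAppHandle* whose listener is notified once the application has an ID, which might avoid blind polling entirely.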

What would be the ideal way to handle such a situation?

Thank you so much for your valuable time!





Tariq, Mohammad
about.me/mti