Posted to issues@spark.apache.org by "bharath kumar (JIRA)" <ji...@apache.org> on 2018/02/23 16:14:00 UTC
[jira] [Created] (SPARK-23497) Sparklyr applications don't disconnect the Spark driver in client mode
bharath kumar created SPARK-23497:
-------------------------------------
Summary: Sparklyr applications don't disconnect the Spark driver in client mode
Key: SPARK-23497
URL: https://issues.apache.org/jira/browse/SPARK-23497
Project: Spark
Issue Type: Improvement
Components: Spark Core, YARN
Affects Versions: 2.1.0
Reporter: bharath kumar
Hello,
When we use sparklyr to connect to the YARN cluster manager in client mode or cluster mode, the Spark driver does not shut down unless we explicitly call spark_disconnect(sc) in the code.
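A minimal sketch of that pattern, assuming a standard sparklyr setup against Spark 2.1 on YARN (the master string and the surrounding script are illustrative):

    library(sparklyr)

    # Connect to Spark on YARN in client mode (Spark 2.1-era master string;
    # assumes SPARK_HOME points at the cluster's Spark installation)
    sc <- spark_connect(master = "yarn-client")

    # ... run work against sc ...

    # Without this call the driver JVM keeps running and holds its YARN resources
    spark_disconnect(sc)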
Does it make sense to add a timeout feature so the driver exits after a certain amount of time in client or cluster mode? I think this only happens with connections from sparklyr to YARN. Sometimes the driver stays there for weeks and holds a minimum amount of resources the whole time.
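Until a timeout like that exists, one workaround on the R side is to make the disconnect unconditional; a sketch using base R's tryCatch (the body is a placeholder):

    library(sparklyr)

    sc <- spark_connect(master = "yarn-client")
    tryCatch({
      # ... interactive or batch work against sc ...
    }, finally = {
      # Runs whether the work succeeds or errors, so the driver always exits
      spark_disconnect(sc)
    })

Inside a function, on.exit(spark_disconnect(sc)) achieves the same effect.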
*More Details:*
YARN: 2.7.0
Spark: 2.1.0
R version: Microsoft R Open 3.4.2
RStudio version: rstudio-server-1.1.414-1.x86_64
yarn application -status application_id
18/01/22 09:08:45 INFO client.MapRZKBasedRMFailoverProxyProvider: Updated RM address to resourcemanager.com/resourcemanager:8032
Application Report :
Application-Id : application_id
Application-Name : sparklyr
Application-Type : SPARK
User : userid
Queue : root.queuename
Start-Time : 1516245523965
Finish-Time : 0
Progress : 0%
State : RUNNING
Final-State : UNDEFINED
Tracking-URL : N/A
RPC Port : -1
AM Host : N/A
Aggregate Resource Allocation : 266468 MB-seconds, 59 vcore-seconds
Diagnostics : N/A
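For drivers that are already stuck in this state, the application can be cleaned up manually with the standard YARN CLI (the id below is a placeholder):

    yarn application -kill application_id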
[http://spark.rstudio.com/]
I can provide more details if required.
Thanks,
Bharath