Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:12:34 UTC

[jira] [Resolved] (SPARK-23502) Support async init of spark context during spark-shell startup

     [ https://issues.apache.org/jira/browse/SPARK-23502?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-23502.
----------------------------------
    Resolution: Incomplete

> Support async init of spark context during spark-shell startup
> --------------------------------------------------------------
>
>                 Key: SPARK-23502
>                 URL: https://issues.apache.org/jira/browse/SPARK-23502
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Shell
>    Affects Versions: 2.0.0
>            Reporter: Sital Kedia
>            Priority: Minor
>              Labels: bulk-closed
>
> Currently, whenever a user starts the spark shell, we initialize the spark context before returning the prompt to the user. In environments where spark context initialization takes several seconds, waiting on the prompt is a poor user experience. Instead of blocking on spark context initialization, we can initialize it in the background and return the prompt to the user as soon as possible. Note that even though the prompt is returned early, we still need to wait for the spark context initialization to complete before any query is executed. 
> Note that the scala interpreter already performs very similar async initialization in order to return the prompt to the user faster - https://github.com/scala/scala/blob/v2.12.2/src/repl/scala/tools/nsc/interpreter/ILoop.scala#L414. We would emulate that behavior for Spark. 
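The proposal boils down to a standard async-initialization pattern: start the expensive setup on a background thread, hand control back to the user immediately, and join on the pending result at the first point that actually needs it. A minimal sketch of that pattern in Java follows; the class and method names are hypothetical stand-ins, not Spark's actual implementation.

```java
import java.util.concurrent.CompletableFuture;

// Hypothetical sketch of the proposed pattern: initialize the
// context in the background, show the prompt immediately, and
// block on the pending future only when the first query runs.
public class AsyncInitSketch {

    // Stand-in for the expensive SparkContext initialization.
    static String initContext() {
        try {
            Thread.sleep(100); // simulate a slow startup
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return "context-ready";
    }

    public static void main(String[] args) {
        // Kick off initialization without blocking the prompt.
        CompletableFuture<String> context =
                CompletableFuture.supplyAsync(AsyncInitSketch::initContext);

        // The REPL prompt can be shown right away.
        System.out.println("prompt shown immediately");

        // Before the first query executes, wait for init to finish.
        String ctx = context.join();
        System.out.println(ctx);
    }
}
```

The key design point, mirrored from ILoop, is that `join()` is only reached on the first command, so a user who spends a few seconds typing never observes the initialization delay at all.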



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org