Posted to issues@spark.apache.org by "Jianfei Wang (JIRA)" <ji...@apache.org> on 2016/09/15 07:54:22 UTC
[jira] [Commented] (SPARK-17552) Doubt about the double Synchronized in Object SparkSession.getOrCreate()
[ https://issues.apache.org/jira/browse/SPARK-17552?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15492684#comment-15492684 ]
Jianfei Wang commented on SPARK-17552:
--------------------------------------
Of course they are not the same lock, but only one thread can enter the method at a time, because the method belongs to the companion object.
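A minimal sketch of the locking semantics in question (class and method names here are hypothetical, not Spark's actual code): in Scala, a bare `synchronized` inside a method locks the enclosing `this`, while `SomeObject.synchronized` locks that object's singleton monitor. These are distinct locks unless the method itself is defined on the singleton, which is the crux of this discussion.

```scala
// Hypothetical Builder mirroring the shape of getOrCreate().
class Builder {
  def build(): String = synchronized {   // locks this Builder instance
    // Two different Builder instances hold different instance locks,
    // so they do not block each other here.
    Builder.synchronized {               // locks the shared companion object
      // All threads, across all Builder instances, serialize here.
      "built"
    }
  }
}

object Builder  // companion object: a single, global monitor

object LockDemo {
  def main(args: Array[String]): Unit = {
    val a = new Builder
    val b = new Builder
    println(a.build())
    println(b.build())
  }
}
```

If `build()` were instead defined directly on `object Builder`, the outer `synchronized` would already lock the companion object, and the inner `Builder.synchronized` would be a reentrant acquisition of the same monitor.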
> Doubt about the double Synchronized in Object SparkSession.getOrCreate()
> ------------------------------------------------------------------------
>
> Key: SPARK-17552
> URL: https://issues.apache.org/jira/browse/SPARK-17552
> Project: Spark
> Issue Type: Question
> Components: Spark Core, SQL
> Affects Versions: 2.0.0
> Reporter: Jianfei Wang
> Priority: Trivial
> Labels: features
>
> Because getOrCreate() is a method of object SparkSession, only one thread can enter the first synchronized block at a time, so I think the second synchronized is unnecessary. Of course, there may be other reasons; please discuss here. Thank you!
> {code}
> def getOrCreate(): SparkSession = synchronized {
>   // Get the session from current thread's active session.
>   var session = activeThreadSession.get()
>   if ((session ne null) && !session.sparkContext.isStopped) {
>     options.foreach { case (k, v) => session.conf.set(k, v) }
>     if (options.nonEmpty) {
>       logWarning("Use an existing SparkSession, some configuration may not take effect.")
>     }
>     return session
>   }
>
>   // Global synchronization so we will only set the default session once.
>   SparkSession.synchronized {
>     // some code here
>   }
>
>   return session
> }
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org