Posted to issues@spark.apache.org by "Nikita Gorbachevski (Jira)" <ji...@apache.org> on 2019/10/22 10:20:00 UTC
[jira] [Commented] (SPARK-29550) Enhance locking in session catalog
[ https://issues.apache.org/jira/browse/SPARK-29550?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16956902#comment-16956902 ]
Nikita Gorbachevski commented on SPARK-29550:
---------------------------------------------
Working on this.
> Enhance locking in session catalog
> ----------------------------------
>
> Key: SPARK-29550
> URL: https://issues.apache.org/jira/browse/SPARK-29550
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, SQL
> Affects Versions: 2.4.4
> Reporter: Nikita Gorbachevski
> Priority: Minor
>
> In my streaming application ``spark.streaming.concurrentJobs`` is set to 50, which is used as the size of the underlying thread pool. I create/alter tables/views automatically at runtime; in order to do that I invoke ``create ... if not exists`` operations on the driver on each batch. At some point I noticed that most of the batch time was spent on the driver rather than on the executors. I took a thread dump and found that most of the threads were blocked on SessionCatalog, waiting for a lock.
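> A minimal sketch of the situation (not my actual job, just a way to reproduce the contention; table names, schema and pool size are illustrative):
> {code:scala}
> import java.util.concurrent.Executors
> import org.apache.spark.sql.SparkSession
>
> object ConcurrentDdlSketch {
>   def main(args: Array[String]): Unit = {
>     val spark = SparkSession.builder()
>       .appName("ddl-contention-sketch")
>       .master("local[*]")
>       .getOrCreate()
>
>     // With spark.streaming.concurrentJobs = 50 up to 50 batch jobs run at
>     // once; each issues DDL on the driver first, and all of those calls
>     // funnel through the single SessionCatalog lock.
>     val pool = Executors.newFixedThreadPool(50)
>     (1 to 50).foreach { i =>
>       pool.submit(new Runnable {
>         override def run(): Unit =
>           spark.sql(s"CREATE TABLE IF NOT EXISTS events_$i (id BIGINT) USING parquet")
>       })
>     }
>     pool.shutdown()
>   }
> }
> {code}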
> The existing implementation of SessionCatalog uses a single lock, which almost all methods take in order to guard the ``currentDb`` and ``tempViews`` variables. I propose to enhance the locking behaviour of SessionCatalog by (see the sketch after the list):
> # Employing a ReadWriteLock, which allows read operations to execute concurrently.
> # Replacing ``synchronized`` blocks with the corresponding read or write lock.
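> A rough sketch of the ReadWriteLock idea (field and method names are illustrative, not the real SessionCatalog internals):
> {code:scala}
> import java.util.concurrent.locks.ReentrantReadWriteLock
> import scala.collection.mutable
>
> class SessionCatalogSketch {
>   private val lock = new ReentrantReadWriteLock()
>   private var currentDb: String = "default"
>   private val tempViews = new mutable.HashMap[String, AnyRef]()
>
>   private def withReadLock[T](body: => T): T = {
>     lock.readLock().lock()
>     try body finally lock.readLock().unlock()
>   }
>
>   private def withWriteLock[T](body: => T): T = {
>     lock.writeLock().lock()
>     try body finally lock.writeLock().unlock()
>   }
>
>   // Readers no longer block each other, unlike with synchronized.
>   def getCurrentDatabase: String = withReadLock(currentDb)
>   def getTempView(name: String): Option[AnyRef] = withReadLock(tempViews.get(name))
>
>   // Writers still take the exclusive lock.
>   def setCurrentDatabase(db: String): Unit = withWriteLock { currentDb = db }
>   def createTempView(name: String, plan: AnyRef): Unit = withWriteLock { tempViews.put(name, plan) }
> }
> {code}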
> Also it may be possible to go even further and split the lock so that ``currentDb`` and ``tempViews`` are guarded independently, but I'm not sure whether that is feasible from the implementation point of view. Perhaps someone can help me with this.
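> For that lock-splitting direction, something like the following might work, though I have not checked every compound operation in SessionCatalog (names are again illustrative):
> {code:scala}
> import java.util.concurrent.ConcurrentHashMap
>
> class LockFreeCatalogSketch {
>   // A single reference: @volatile gives safe publication without any lock.
>   @volatile private var currentDb: String = "default"
>
>   // Per-entry operations on ConcurrentHashMap are atomic, so lookups and
>   // single-view updates need no catalog-wide lock at all.
>   private val tempViews = new ConcurrentHashMap[String, AnyRef]()
>
>   def getCurrentDatabase: String = currentDb
>   def setCurrentDatabase(db: String): Unit = currentDb = db
>
>   def getTempView(name: String): Option[AnyRef] = Option(tempViews.get(name))
>
>   def createTempView(name: String, plan: AnyRef, overrideIfExists: Boolean): Unit =
>     if (overrideIfExists) tempViews.put(name, plan)
>     else if (tempViews.putIfAbsent(name, plan) != null)
>       throw new IllegalStateException(s"Temporary view '$name' already exists")
> }
> {code}
> The caveat is that any operation which must read or update ``currentDb`` and ``tempViews`` together atomically would still need some coordination, which is why I'm not sure this works everywhere.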
--
This message was sent by Atlassian Jira
(v8.3.4#803005)