Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/03/19 03:39:00 UTC
[jira] [Assigned] (SPARK-34800) Use fine-grained lock in SessionCatalog.tableExists
[ https://issues.apache.org/jira/browse/SPARK-34800?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-34800:
------------------------------------
Assignee: (was: Apache Spark)
> Use fine-grained lock in SessionCatalog.tableExists
> ---------------------------------------------------
>
> Key: SPARK-34800
> URL: https://issues.apache.org/jira/browse/SPARK-34800
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.1.1
> Reporter: rongchuan.jin
> Priority: Major
>
> We have modified the underlying Hive metastore so that each Hive database is placed in its own shard for performance. However, we found that the synchronized block in SessionCatalog limits concurrency, and we would like to fix that.
> Related jstack trace like following:
> {code:java}
> "http-nio-7070-exec-257" #19961734 daemon prio=5 os_prio=0 tid=0x00007f45f4ce1000 nid=0x1a85e6 waiting for monitor entry [0x00007f45949df000]
> java.lang.Thread.State: BLOCKED (on object monitor)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415)
> - waiting to lock <0x000000011d983d90> (a org.apache.spark.sql.catalyst.catalog.SessionCatalog)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.requireTableExists(SessionCatalog.scala:185)
> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.getTableMetadata(SessionCatalog.scala:430)
> at org.apache.spark.sql.DdlOperation$.getTableDesc(SourceFile:123)
> at org.apache.spark.sql.DdlOperation.getTableDesc(SourceFile)
> ...
> {code}
> We fixed this as discussed on the dev mailing list: [http://mail-archives.apache.org/mod_mbox/spark-dev/202103.mbox/browser]
>
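The jstack trace above shows threads blocked on the SessionCatalog's object monitor: every tableExists call synchronizes on the whole catalog, so lookups against unrelated tables serialize. A minimal sketch of the fine-grained alternative, assuming a per-table lock map (all names here are hypothetical; the actual SPARK-34800 patch may differ):

```scala
import java.util.concurrent.ConcurrentHashMap

// Sketch: guard each qualified table name with its own lock object,
// instead of synchronizing on the catalog-wide monitor.
class FineGrainedCatalog {
  private val locks = new ConcurrentHashMap[String, Object]()

  // computeIfAbsent atomically creates at most one lock per "db.table" key.
  private def lockFor(db: String, table: String): Object =
    locks.computeIfAbsent(s"$db.$table", _ => new Object)

  // Only callers probing the same table contend; lookups on other
  // tables (e.g. in other metastore shards) proceed in parallel.
  def tableExists(db: String, table: String): Boolean =
    lockFor(db, table).synchronized {
      externalCatalogTableExists(db, table)
    }

  // Stand-in for the external catalog / metastore round trip.
  private def externalCatalogTableExists(db: String, table: String): Boolean =
    db == "default" && table == "t1"
}
```

The trade-off is an unbounded lock map (one entry per distinct table ever probed); a bounded cache or striped locking would address that if it matters.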
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org