Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/02/09 21:24:00 UTC
[jira] [Commented] (SPARK-23275) hive/tests have been failing when run locally on the laptop (Mac) with OOM
[ https://issues.apache.org/jira/browse/SPARK-23275?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16358971#comment-16358971 ]
Apache Spark commented on SPARK-23275:
--------------------------------------
User 'liufengdb' has created a pull request for this issue:
https://github.com/apache/spark/pull/20562
> hive/tests have been failing when run locally on the laptop (Mac) with OOM
> ---------------------------------------------------------------------------
>
> Key: SPARK-23275
> URL: https://issues.apache.org/jira/browse/SPARK-23275
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.1
> Reporter: Dilip Biswal
> Assignee: Dilip Biswal
> Priority: Major
> Fix For: 2.3.0
>
>
> hive tests have been failing when run locally (macOS) after a recent change on trunk. After the tests run for some time, they fail with an OOM: "Error: unable to create new native thread".
> I noticed the thread count climbs to 2000+ before these OOM errors start. Most of the threads appear to belong to the connection pool of the Hive metastore (BoneCP-xxxxx-xxxx). This behavior change started after the following change to HiveClientImpl.reset():
> {code}
> def reset(): Unit = withHiveState {
>   try {
>     // code
>   } finally {
>     runSqlHive("USE default") // ===> this is causing the issue
>   }
> }
> {code}
> I am proposing to temporarily back out part of the fix made for SPARK-23000 to resolve this issue while we work out the exact reason for this sudden increase in thread count.
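For readers unfamiliar with the failure mode, here is a minimal, language-agnostic sketch (in Python, not Spark or BoneCP code; the `FakePool` class is hypothetical) of how repeatedly creating a connection pool without closing the previous one leaks live threads until the OS refuses to create new native threads:

```python
import threading

class FakePool:
    """Toy stand-in for a connection pool with one keep-alive thread
    that blocks until close() is called."""
    def __init__(self):
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._stop.wait, daemon=True)
        self._thread.start()

    def close(self):
        # Releasing the thread is exactly what the leaky pattern forgets to do.
        self._stop.set()

leaked = []
before = threading.active_count()
for _ in range(100):
    # Each "reset" builds a fresh pool; none of the old ones are closed,
    # so every iteration leaves one more thread alive.
    leaked.append(FakePool())
print(threading.active_count() - before)  # prints 100
```

With ~20 such pool threads per metastore connection cycle, reaching 2000+ threads, and eventually the per-process thread limit, takes only a sustained test run.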
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org