Posted to issues@spark.apache.org by "Henry Min (JIRA)" <ji...@apache.org> on 2017/03/08 23:32:38 UTC

[jira] [Commented] (SPARK-6936) SQLContext.sql() caused deadlock in multi-thread env

    [ https://issues.apache.org/jira/browse/SPARK-6936?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15902167#comment-15902167 ] 

Henry Min commented on SPARK-6936:
----------------------------------

This issue seems to have been fixed in version 1.5.0. The information here is extremely important. Can anyone provide details of the merge and point me to the merged source code? A similar issue still exists on the latest version.

> SQLContext.sql() caused deadlock in multi-thread env
> ----------------------------------------------------
>
>                 Key: SPARK-6936
>                 URL: https://issues.apache.org/jira/browse/SPARK-6936
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0
>         Environment: JDK 1.8.x, RedHat
> Linux version 2.6.32-431.23.3.el6.x86_64 (mockbuild@x86-027.build.eng.bos.redhat.com) (gcc version 4.4.7 20120313 (Red Hat 4.4.7-4) (GCC) ) #1 SMP Wed Jul 16 06:12:23 EDT 2014
>            Reporter: Paul Wu
>            Assignee: Michael Armbrust
>              Labels: deadlock, sql, threading
>             Fix For: 1.5.0
>
>
> Running the same query from more than one thread with SQLContext.sql may lead to deadlock. Here is a way to reproduce it (since this is a multi-threading issue, reproduction may not be deterministic):
> 1. Register a relatively big table.
> 2. Create two different classes; in each, run the same query in a method, collect the results into a set, and print the set size.
> 3. Create two threads, each using an object of one of the classes in its run method, and start the threads. In my tests, a deadlock occurred within just a few runs. A sketch of this setup appears below.
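> The following is a minimal Java sketch of the setup above (Spark 1.3-era API), not the reporter's original code. The class names (DeadlockRepro, QueryTaskA/B, Rec), the table name "big_table", the row count, and the query are all illustrative assumptions.
> {code:java}
> import java.io.Serializable;
> import java.util.ArrayList;
> import java.util.HashSet;
> import java.util.List;
> import java.util.Set;
>
> import org.apache.spark.SparkConf;
> import org.apache.spark.api.java.JavaSparkContext;
> import org.apache.spark.sql.Row;
> import org.apache.spark.sql.SQLContext;
>
> public class DeadlockRepro {
>     static SQLContext sqlContext;
>
>     // Simple JavaBean used to populate the test table (step 1).
>     public static class Rec implements Serializable {
>         private int id;
>         public Rec() { }
>         public Rec(int id) { this.id = id; }
>         public int getId() { return id; }
>         public void setId(int id) { this.id = id; }
>     }
>
>     // Step 2: two deliberately distinct classes that each run the same
>     // query, collect the rows into a set, and print the set size.
>     static class QueryTaskA implements Runnable {
>         public void run() {
>             Set<Row> rows = new HashSet<Row>();
>             for (Row r : sqlContext.sql("SELECT * FROM big_table").collect()) {
>                 rows.add(r);
>             }
>             System.out.println("A saw " + rows.size() + " rows");
>         }
>     }
>
>     static class QueryTaskB implements Runnable {
>         public void run() {
>             Set<Row> rows = new HashSet<Row>();
>             for (Row r : sqlContext.sql("SELECT * FROM big_table").collect()) {
>                 rows.add(r);
>             }
>             System.out.println("B saw " + rows.size() + " rows");
>         }
>     }
>
>     public static void main(String[] args) throws InterruptedException {
>         JavaSparkContext sc = new JavaSparkContext(
>                 new SparkConf().setAppName("SPARK-6936-repro").setMaster("local[*]"));
>         sqlContext = new SQLContext(sc);
>
>         // Step 1: register a relatively big table.
>         List<Rec> data = new ArrayList<Rec>();
>         for (int i = 0; i < 100000; i++) {
>             data.add(new Rec(i));
>         }
>         sqlContext.createDataFrame(sc.parallelize(data), Rec.class)
>                 .registerTempTable("big_table");
>
>         // Step 3: run one object of each class on its own thread.
>         Thread t1 = new Thread(new QueryTaskA());
>         Thread t2 = new Thread(new QueryTaskB());
>         t1.start();
>         t2.start();
>         t1.join();  // on affected versions this may hang once the deadlock hits
>         t2.join();
>         sc.stop();
>     }
> }
> {code}
> On an affected version (1.3.0), repeated runs should eventually hang at the join() calls; on 1.5.0 and later both threads are expected to finish and print their counts.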


