Posted to reviews@spark.apache.org by nicerobot <gi...@git.apache.org> on 2016/03/07 21:30:50 UTC

[GitHub] spark pull request: [SPARK-10548] [SPARK-10563] [SQL] Fix concurre...

Github user nicerobot commented on the pull request:

    https://github.com/apache/spark/pull/8710#issuecomment-193436138
  
    We are still experiencing this. See [SPARK-10548](https://issues.apache.org/jira/browse/SPARK-10548).
    
    I've verified that we are indeed running a version of Spark that includes the [SPARK-10548](https://issues.apache.org/jira/browse/SPARK-10548) fix, yet the issue is still reproducible. In fact, if you add the following to the test case:
    
        println(null != sc.getLocalProperty("spark.sql.execution.id"))
        df.count()
    
    you can anticipate when your thread will throw the exception: it fails whenever the property is already set before the action runs.
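
    For context, a minimal sketch (plain Scala, no Spark required) of the mechanism behind this: SparkContext keeps local properties in an `InheritableThreadLocal`, so a thread spawned while `spark.sql.execution.id` is set inherits that id and can later trip the concurrent-execution check. The object and property names below are illustrative, not Spark's actual internals:

    ```scala
    import java.util.Properties

    object InheritedPropertyDemo {
      // Mirrors SparkContext.localProperties: child threads inherit a copy
      // of the parent's properties at thread-creation time.
      val props = new InheritableThreadLocal[Properties] {
        override def childValue(parent: Properties): Properties =
          parent.clone().asInstanceOf[Properties]
        override def initialValue(): Properties = new Properties()
      }

      // Returns what a freshly spawned child thread sees for the given key.
      def seenByChild(key: String): String = {
        var inherited: String = null
        val t = new Thread(new Runnable {
          def run(): Unit = inherited = props.get().getProperty(key)
        })
        t.start()
        t.join()
        inherited
      }

      def main(args: Array[String]): Unit = {
        // Parent thread sets an execution id, as SQLExecution does around an action.
        props.get().setProperty("spark.sql.execution.id", "1")
        // A child thread created *now* inherits the stale id.
        println(seenByChild("spark.sql.execution.id"))
      }
    }
    ```

    Running this prints `1`: the child thread observes the parent's execution id even though it never set one itself, which is exactly the state that makes the check in the parent comment print `true` just before the exception.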


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org