Posted to dev@hive.apache.org by "Chao (JIRA)" <ji...@apache.org> on 2014/09/30 20:15:33 UTC

[jira] [Resolved] (HIVE-7525) Research to find out if it's possible to submit Spark jobs concurrently using shared SparkContext [Spark Branch]

     [ https://issues.apache.org/jira/browse/HIVE-7525?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chao resolved HIVE-7525.
------------------------
    Resolution: Fixed

> Research to find out if it's possible to submit Spark jobs concurrently using shared SparkContext [Spark Branch]
> ----------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-7525
>                 URL: https://issues.apache.org/jira/browse/HIVE-7525
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Xuefu Zhang
>            Assignee: Chao
>
> Refer to HIVE-7503 and SPARK-2688. Find out whether it's possible to submit multiple Spark jobs concurrently using a shared SparkContext. SparkClient's code can be modified for this test. Here is the process:
> 1. Transform rdd1 into rdd2 using some transformation.
> 2. Call rdd2.cache() to persist it in memory.
> 3. In two threads, run the following concurrently:
>     Thread a. rdd2 -> rdd3; rdd3.foreach()
>     Thread b. rdd2 -> rdd4; rdd4.foreach()
> It would also be nice to find out how job monitoring and error reporting behave in this setup.
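
A minimal, self-contained sketch of the test described above (not taken from the JIRA and not Hive's actual SparkClient code; the object name ConcurrentJobTest, the local[4] master, and the specific map/filter operations are illustrative assumptions) might look like this in Scala:

    import org.apache.spark.{SparkConf, SparkContext}

    object ConcurrentJobTest {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("ConcurrentJobTest").setMaster("local[4]")
        val sc = new SparkContext(conf)

        // Step 1: transform rdd1 into rdd2 using some transformation.
        val rdd1 = sc.parallelize(1 to 1000000)
        val rdd2 = rdd1.map(_ * 2)

        // Step 2: persist rdd2 in memory so both downstream jobs can reuse it.
        rdd2.cache()

        // Step 3: two threads each derive a new RDD from rdd2 and trigger an action,
        // so two jobs are submitted concurrently on the shared SparkContext.
        val threadA = new Thread(new Runnable {
          def run(): Unit = {
            val rdd3 = rdd2.filter(_ % 3 == 0)   // rdd2 -> rdd3 (illustrative transformation)
            rdd3.foreach(_ => ())                // action: submits the first job
          }
        })
        val threadB = new Thread(new Runnable {
          def run(): Unit = {
            val rdd4 = rdd2.filter(_ % 5 == 0)   // rdd2 -> rdd4 (illustrative transformation)
            rdd4.foreach(_ => ())                // action: submits the second, concurrent job
          }
        })

        threadA.start(); threadB.start()
        threadA.join(); threadB.join()

        sc.stop()
      }
    }

Since Spark's scheduler is designed to be thread-safe for job submission, the two foreach actions should run as separate concurrent jobs that share the cached rdd2, which is the behavior this ticket set out to verify.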



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)