Posted to dev@hive.apache.org by "Xuefu Zhang (JIRA)" <ji...@apache.org> on 2014/12/20 00:51:13 UTC
[jira] [Updated] (HIVE-9178) Create a separate API for remote Spark
Context RPC other than job submission [Spark Branch]
[ https://issues.apache.org/jira/browse/HIVE-9178?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xuefu Zhang updated HIVE-9178:
------------------------------
Summary: Create a separate API for remote Spark Context RPC other than job submission [Spark Branch] (was: Create a separate API for remote Spark Context RPC other than job submission)
> Create a separate API for remote Spark Context RPC other than job submission [Spark Branch]
> -------------------------------------------------------------------------------------------
>
> Key: HIVE-9178
> URL: https://issues.apache.org/jira/browse/HIVE-9178
> Project: Hive
> Issue Type: Sub-task
> Components: Spark
> Reporter: Xuefu Zhang
>
> Based on discussions in HIVE-8972, it seems to make sense to create a separate API for RPCs such as addJar and getExecutorCounter. These operations differ from query submission in that they don't need to be queued in the backend and can be executed right away.
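A minimal sketch of the proposed split, in Java. All names here (SparkClientSketch, submitJob, the queue/list fields) are illustrative assumptions, not the actual Hive or spark-client API; the point is only the two paths: submissions are enqueued for the backend, while control RPCs take effect immediately.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

// Hypothetical client showing two call paths:
//  - submitJob: queued, to be drained later by the backend scheduler
//  - addJar / getExecutorCount: immediate RPCs, no queueing
public class SparkClientSketch {
    private final Queue<String> jobQueue = new ArrayDeque<>();
    private final List<String> jars = new ArrayList<>();

    // Path 1: query/job submission -- only enqueued here, executed later.
    public void submitJob(String job) {
        jobQueue.add(job);
    }

    // Path 2: control RPC -- applied right away, bypassing the job queue.
    public void addJar(String jarPath) {
        jars.add(jarPath);
    }

    // Path 2: another immediate RPC; a real client would ask the
    // remote Spark context, here we return a placeholder value.
    public int getExecutorCount() {
        return 4;
    }

    public static void main(String[] args) {
        SparkClientSketch client = new SparkClientSketch();
        client.addJar("/tmp/udf.jar");   // takes effect immediately
        client.submitJob("query-1");     // merely enqueued, not yet run
        System.out.println("jars=" + client.jars.size()
                + " queued=" + client.jobQueue.size()
                + " executors=" + client.getExecutorCount());
    }
}
```

The design point is that mixing both call kinds into one submission API would force addJar to wait behind queued queries, even though it has no reason to.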
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)