Posted to issues@spark.apache.org by "Prashant Sharma (JIRA)" <ji...@apache.org> on 2014/09/01 09:50:21 UTC

[jira] [Commented] (SPARK-3306) Addition of external resource dependency in executors

    [ https://issues.apache.org/jira/browse/SPARK-3306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14117138#comment-14117138 ] 

Prashant Sharma commented on SPARK-3306:
----------------------------------------

I am not 100% sure why you are referring to Spark executors; my feeling is that there is no need to touch the executors to support something like that. You could probably add a new Spark conf option, and by default all Spark conf options are propagated to the executors.

I will let you close this issue if you are convinced.
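
For the conf-option route, a rough sketch of what that could look like (the key spark.externalResource.url and the JDBC URL are made-up placeholders, not existing Spark settings; this only relies on "spark."-prefixed options being shipped to executors with the application):

import org.apache.spark.{SparkConf, SparkContext, SparkEnv}

object ExternalResourceConfSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("external-resource-conf-sketch")
      // Hypothetical key; any "spark."-prefixed option is propagated to executors.
      .set("spark.externalResource.url", "jdbc:postgresql://db-host/mydb")
    val sc = new SparkContext(conf)

    val seen = sc.parallelize(1 to 4, 2).mapPartitions { iter =>
      // Executor side: read the propagated option back from the executor's SparkConf.
      val url = SparkEnv.get.conf.get("spark.externalResource.url")
      iter.map(i => s"element $i sees $url")
    }.collect()

    seen.foreach(println)
    sc.stop()
  }
}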

> Addition of external resource dependency in executors
> -----------------------------------------------------
>
>                 Key: SPARK-3306
>                 URL: https://issues.apache.org/jira/browse/SPARK-3306
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>            Reporter: Yan
>
> Currently, Spark executors only support static, read-only external resources such as side files and jar files. With emerging disparate data sources, there is a need to support more versatile external resources, such as connections to data sources, to facilitate efficient data access to those sources. For one, the JDBCRDD, with some modifications, could benefit from this feature by reusing JDBC connections previously established within the same Spark context.
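
(A rough sketch of how the connection reuse described above is sometimes approximated today without any new executor support: a lazy singleton object that is initialized at most once per executor JVM and shared by the tasks running in it. ConnectionHolder and the JDBC URL/credentials are illustrative placeholders, not part of JdbcRDD or Spark.)

import java.sql.{Connection, DriverManager}

object ConnectionHolder {
  // Initialized lazily, at most once per executor JVM, and reused by every
  // task that later runs in that JVM.
  lazy val connection: Connection =
    DriverManager.getConnection("jdbc:postgresql://db-host/mydb", "user", "password")
}

// Usage from a job, reusing the executor-local connection per partition
// instead of opening a new one for every task:
//   rdd.mapPartitions { rows =>
//     val stmt = ConnectionHolder.connection.createStatement()
//     ...
//   }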



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org