Posted to issues@spark.apache.org by "Sun Rui (JIRA)" <ji...@apache.org> on 2015/07/01 09:54:04 UTC
[jira] [Commented] (SPARK-6833) Extend `addPackage` so that any given R file can be sourced in the worker before functions are run.
[ https://issues.apache.org/jira/browse/SPARK-6833?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14609698#comment-14609698 ]
Sun Rui commented on SPARK-6833:
--------------------------------
I tested with --files and it works, so it seems we can close this issue.
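For reference, the --files approach verified above can be sketched as follows. The file names helper.R and job.R are hypothetical; --files distributes the listed files into each executor's working directory, where worker-side code can then source() them:

```shell
# Hypothetical sketch: ship helper.R to every executor via --files,
# then run the SparkR driver script job.R.
spark-submit --files helper.R job.R
```

Inside job.R, code that runs on the workers could then call `source("helper.R")`, since the shipped file lands in each executor's working directory.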
> Extend `addPackage` so that any given R file can be sourced in the worker before functions are run.
> ---------------------------------------------------------------------------------------------------
>
> Key: SPARK-6833
> URL: https://issues.apache.org/jira/browse/SPARK-6833
> Project: Spark
> Issue Type: New Feature
> Components: SparkR
> Reporter: Shivaram Venkataraman
> Priority: Minor
>
> Similar to how extra Python files or packages can be specified (in zip / egg formats), it would be good to support the ability to add extra R files to the executor's working directory.
> One thing that needs to be investigated is whether this will just work out of the box using the spark-submit flag --files?
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)