Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2014/08/05 07:44:13 UTC

[jira] [Comment Edited] (SPARK-2157) Can't write tight firewall rules for Spark

    [ https://issues.apache.org/jira/browse/SPARK-2157?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14085835#comment-14085835 ] 

Andrew Or edited comment on SPARK-2157 at 8/5/14 5:42 AM:
----------------------------------------------------------

[~aash] I have built my changes on top of yours in my PR. It would be good if you could take a look.


was (Author: andrewor):
[~aash] I have built my changes on top of yours in my PR

> Can't write tight firewall rules for Spark
> ------------------------------------------
>
>                 Key: SPARK-2157
>                 URL: https://issues.apache.org/jira/browse/SPARK-2157
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Andrew Ash
>            Assignee: Andrew Ash
>            Priority: Critical
>
> In order to run Spark in places with strict firewall rules, you need to be able to specify every port that's used between all parts of the stack.
> Per the [network activity section of the docs|http://spark.apache.org/docs/latest/spark-standalone.html#configuring-ports-for-network-security] most of the ports are configurable, but there are a few ports that aren't configurable.
> We need to make every port individually configurable to a fixed value, so that Spark can run in highly locked-down environments.
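For context, the ports that the linked docs describe as already configurable are typically pinned in conf/spark-defaults.conf (or via environment variables in spark-env.sh for the standalone master/worker). The sketch below is illustrative only; the exact property names and which ports are configurable depend on the Spark version, so verify against the docs for your release:

```
# conf/spark-defaults.conf -- illustrative sketch of pinning ports for
# firewall rules (property names per Spark 1.0-era standalone docs;
# verify against your version, as not all ports were configurable yet)
spark.driver.port   51000   # driver <-> executors (random by default)
spark.ui.port       4040    # application web UI

# conf/spark-env.sh -- standalone cluster daemons
# SPARK_MASTER_PORT=7077
# SPARK_WORKER_PORT=51100
```

The issue is precisely that, beyond settings like these, some connections (e.g. ephemeral ports chosen at runtime) could not be pinned, making tight firewall rules impossible to write.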



--
This message was sent by Atlassian JIRA
(v6.2#6252)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org