Posted to issues@spark.apache.org by "Jeffrey Turpin (JIRA)" <ji...@apache.org> on 2015/04/01 23:01:52 UTC
[jira] [Commented] (SPARK-6373) Add SSL/TLS for the Netty based BlockTransferService
[ https://issues.apache.org/jira/browse/SPARK-6373?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14391452#comment-14391452 ]
Jeffrey Turpin commented on SPARK-6373:
---------------------------------------
Hey Aaron,
Sorry for the delay... I have cleaned things up a bit and refactored the implementation to be more in line with our earlier conversation... Have a look at https://github.com/turp1twin/spark/commit/d976a7ab9b57e26fc180d649fd084a6acb9d027e and let me know your thoughts...
Jeff
> Add SSL/TLS for the Netty based BlockTransferService
> -----------------------------------------------------
>
> Key: SPARK-6373
> URL: https://issues.apache.org/jira/browse/SPARK-6373
> Project: Spark
> Issue Type: New Feature
> Components: Block Manager, Shuffle
> Affects Versions: 1.2.1
> Reporter: Jeffrey Turpin
>
> Add support for secure communications (SSL/TLS) in the Netty based BlockTransferService and the ExternalShuffleClient. This ticket will hopefully start the conversation around potential designs... Below is a reference to a WIP prototype which implements this functionality (prototype)... I have attempted to disrupt as little code as possible and tried to follow the current code structure (for the most part) in the areas I modified. I also studied how Hadoop achieves encrypted shuffle (http://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/EncryptedShuffle.html)
> https://github.com/turp1twin/spark/commit/024b559f27945eb63068d1badf7f82e4e7c3621c
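For readers unfamiliar with the mechanics behind a proposal like this: a Netty-based transport typically gets TLS by handing a JSSE `SSLEngine` to Netty's `SslHandler` at the head of the channel pipeline. The sketch below shows only the standard-JDK side of that, building a server-mode engine; the class and method names are illustrative and are not taken from the linked prototype, and a real deployment would initialize the context from a configured keystore rather than the defaults used here.

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLEngine;

// Illustrative sketch (not the actual Spark patch): build an SSLEngine for
// the server side of a block-transfer connection. Netty's SslHandler can
// wrap an engine like this to encrypt all channel traffic.
public class TlsTransportSketch {
    static SSLEngine serverEngine() throws Exception {
        SSLContext ctx = SSLContext.getInstance("TLS");
        // null, null, null => JDK default key managers, trust managers, and
        // SecureRandom; a real shuffle service would load its keystore here.
        ctx.init(null, null, null);
        SSLEngine engine = ctx.createSSLEngine();
        engine.setUseClientMode(false);  // this end accepts connections
        engine.setNeedClientAuth(false); // mutual auth could be enabled
        return engine;
    }

    public static void main(String[] args) throws Exception {
        SSLEngine engine = serverEngine();
        System.out.println("useClientMode=" + engine.getUseClientMode());
    }
}
```

The client side of the transfer would mirror this with `setUseClientMode(true)`, which is roughly the split Hadoop's encrypted shuffle makes between its shuffle server and fetchers.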
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org