Posted to issues@spark.apache.org by "Shridhar Ramachandran (JIRA)" <ji...@apache.org> on 2016/11/03 20:09:58 UTC

[jira] [Commented] (SPARK-15377) Enabling SASL Spark 1.6.1

    [ https://issues.apache.org/jira/browse/SPARK-15377?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15634088#comment-15634088 ] 

Shridhar Ramachandran commented on SPARK-15377:
-----------------------------------------------

It is likely that you haven't enabled spark.authenticate=true in the YARN configuration. Excerpted from YarnShuffleService.java:
{noformat}
 * The service also optionally supports authentication. This ensures that executors from one
 * application cannot read the shuffle files written by those from another. This feature can be
 * enabled by setting `spark.authenticate` in the Yarn configuration before starting the NM.
 * Note that the Spark application must also set `spark.authenticate` manually and, unlike in
 * the case of the service port, will not inherit this setting from the Yarn configuration. This
 * is because an application running on the same Yarn cluster may choose to not use the external
 * shuffle service, in which case its setting of `spark.authenticate` should be independent of
 * the service's.
{noformat}
 You can do this by setting that flag to true in core-site.xml on each NodeManager host, then restarting the NodeManagers.
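A minimal sketch of the two sides of the setting, assuming a standard Hadoop layout (the exact core-site.xml path and whether you use spark-defaults.conf or --conf will vary by deployment):

{noformat}
<!-- core-site.xml on each NodeManager host; restart the NM after editing -->
<property>
  <name>spark.authenticate</name>
  <value>true</value>
</property>
{noformat}

As the excerpt above notes, the application does not inherit this from the Yarn configuration, so the Spark job must also set it explicitly, e.g. "spark.authenticate true" in spark-defaults.conf or "--conf spark.authenticate=true" on spark-submit.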

> Enabling SASL Spark 1.6.1
> -------------------------
>
>                 Key: SPARK-15377
>                 URL: https://issues.apache.org/jira/browse/SPARK-15377
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core, YARN
>    Affects Versions: 1.6.1
>            Reporter: Fabian Tan
>
> Hi there,
> I wonder if anyone has gotten SASL to work with Spark 1.6.1 on YARN?
> At this point in time, I can't confirm whether this is a bug, but it's definitely reproducible.
> Basically, the Spark documentation states that you only need 3 parameters enabled:
> spark.authenticate.enableSaslEncryption=true
> spark.network.sasl.serverAlwaysEncrypt=true
> spark.authenticate=true
> http://spark.apache.org/docs/latest/security.html
> However, upon launching my Spark job with --master yarn and --deploy-mode client, I see the following in my Spark executor logs:
> 6/05/17 06:50:51 ERROR client.TransportClientFactory: Exception while bootstrapping client after 29 ms
> java.lang.RuntimeException: java.lang.IllegalArgumentException: Unknown message type: -22
>         at org.apache.spark.network.shuffle.protocol.BlockTransferMessage$Decoder.fromByteBuffer(BlockTransferMessage.java:67)
>         at org.apache.spark.network.shuffle.ExternalShuffleBlockHandler.receive(ExternalShuffleBlockHandler.java:71)
>         at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149)
>         at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102)
>         at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
>         at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
>         at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
>         at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:254)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
>         at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
>         at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86)
>         at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333)
>         at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319)
>         at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787)
>         at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130)
>         at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
>         at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
>         at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
>         at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
>         at java.lang.Thread.run(Thread.java:745)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org