Posted to issues@spark.apache.org by "Yishan Jiang (JIRA)" <ji...@apache.org> on 2017/08/24 09:35:00 UTC

[jira] [Commented] (SPARK-21495) DIGEST-MD5: Out of order sequencing of messages from server

    [ https://issues.apache.org/jira/browse/SPARK-21495?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16139813#comment-16139813 ] 

Yishan Jiang commented on SPARK-21495:
--------------------------------------

I met the same issue. I tried changing spark.authenticate.secret to something as simple as "aaa" and it works well, so it is most likely that the authentication cannot handle a complicated secret. If this is blocking you, change it to a simple secret as a workaround.
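
For example, a minimal conf/spark-defaults.conf along these lines worked for me (the secret value here is only an example; pick your own):

    # Enable RPC authentication with a deliberately simple shared secret
    spark.authenticate        true
    spark.authenticate.secret aaa

This is only a workaround; the original complicated secret still fails.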

> DIGEST-MD5: Out of order sequencing of messages from server
> -----------------------------------------------------------
>
>                 Key: SPARK-21495
>                 URL: https://issues.apache.org/jira/browse/SPARK-21495
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle, Spark Core
>    Affects Versions: 1.6.1
>         Environment: OS: RedHat 7.1 64bit
> Spark: 1.6.1
>            Reporter: Xin Yu Pan
>
> We hit an issue when enabling authentication and SASL encryption; see the parameters marked in bold (between asterisks) in the following list.
> spark.local.dir /tmp/xpan-spark-161
> spark.eventLog.dir file:///home/xpan/spark-conf/event
> spark.eventLog.enabled true
> spark.history.fs.logDirectory file:/home/xpan/spark-conf/event
> spark.history.ui.port 18085
> spark.history.fs.cleaner.enabled true
> spark.history.fs.cleaner.interval 1d
> spark.history.fs.cleaner.maxAge 14d
> spark.dynamicAllocation.enabled false
> spark.shuffle.service.enabled false
> spark.shuffle.service.port 7448
> spark.shuffle.reduceLocality.enabled false
> spark.master.port 7087
> spark.master.rest.port 6077
> spark.executor.extraJavaOptions -Djava.security.egd=file:/dev/./urandom
> *spark.authenticate true
> spark.authenticate.secret 5828d44b-f9b9-4033-b1f5-21d1e3273ec2
> spark.authenticate.enableSaslEncryption false
> spark.network.sasl.serverAlwaysEncrypt false*
> We run the simple SparkPi example, and there are exception messages even though the application completes.
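> (A sketch of how such a run might be submitted; the master URL, examples jar path, and argument are placeholders inferred from the settings above:)
> ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
>   --master spark://cws-75:7087 \
>   --conf spark.authenticate=true \
>   --conf spark.authenticate.secret=5828d44b-f9b9-4033-b1f5-21d1e3273ec2 \
>   --conf spark.authenticate.enableSaslEncryption=false \
>   lib/spark-examples-1.6.1-hadoop2.6.0.jar 100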
> # cat spark-1.6.1-bin-hadoop2.6/logs/spark-xpan-org.apache.spark.deploy.ExternalShuffleService-1-cws-75.out.1
> ... ...
> 17/07/20 02:57:30 INFO spark.SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users with view permissions: Set(xpan); users with modify permissions: Set(xpan)
> 17/07/20 02:57:31 INFO deploy.ExternalShuffleService: Starting shuffle service on port 7448 with useSasl = true
> 17/07/20 02:58:04 INFO shuffle.ExternalShuffleBlockResolver: Registered executor AppExecId{appId=app-20170720025800-0000, execId=0} with ExecutorShuffleInfo{localDirs=[/tmp/xpan-spark-161/spark-8e4885a3-c463-4dfb-a396-04e16b65fd1e/executor-be15fcd0-c946-4c83-ba25-3b20bbce5b0e/blockmgr-0fd2658a-ce15-4d56-901c-4c746161bbe0], subDirsPerLocalDir=64, shuffleManager=org.apache.spark.shuffle.sort.SortShuffleManager}
> 17/07/20 02:58:11 INFO security.sasl: DIGEST41:Unmatched MACs
> 17/07/20 02:58:11 WARN server.TransportChannelHandler: Exception in connection from /172.29.10.77:50616
> io.netty.handler.codec.DecoderException: javax.security.sasl.SaslException: DIGEST-MD5: Out of order sequencing of messages from server. Got: 125 Expected: 123
> 	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:99)
> 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
> 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
> 	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86)
> 	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
> 	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
> 	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
> 	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
> 	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
> 	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
> 	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
> 	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
> 	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
> 	at java.lang.Thread.run(Thread.java:785)
> Caused by: javax.security.sasl.SaslException: DIGEST-MD5: Out of order sequencing of messages from server. Got: 125 Expected: 123
> 	at com.ibm.security.sasl.digest.DigestMD5Base$DigestPrivacy.unwrap(DigestMD5Base.java:1535)
> 	at com.ibm.security.sasl.digest.DigestMD5Base.unwrap(DigestMD5Base.java:231)
> 	at org.apache.spark.network.sasl.SparkSaslServer.unwrap(SparkSaslServer.java:149)
> 	at org.apache.spark.network.sasl.SaslEncryption$DecryptionHandler.decode(SaslEncryption.java:127)
> 	at org.apache.spark.network.sasl.SaslEncryption$DecryptionHandler.decode(SaslEncryption.java:102)
> 	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:89)
> 	... 13 more
> 17/07/20 02:58:11 ERROR server.TransportRequestHandler: Error sending result ChunkFetchSuccess{streamChunkId=StreamChunkId{streamId=908084716000, chunkIndex=1}, buffer=FileSegmentManagedBuffer{file=/tmp/xpan-spark-161/spark-8e4885a3-c463-4dfb-a396-04e16b65fd1e/executor-be15fcd0-c946-4c83-ba25-3b20bbce5b0e/blockmgr-0fd2658a-ce15-4d56-901c-4c746161bbe0/0c/shuffle_0_17_0.data, offset=1893612, length=302981}} to /172.29.10.77:50616; closing connection
> java.nio.channels.ClosedChannelException


