Posted to common-issues@hadoop.apache.org by "Hadoop QA (JIRA)" <ji...@apache.org> on 2018/11/13 11:50:01 UTC

[jira] [Commented] (HADOOP-15681) AuthenticationFilter should generate valid date format for Set-Cookie header regardless of default Locale

    [ https://issues.apache.org/jira/browse/HADOOP-15681?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16685080#comment-16685080 ] 

Hadoop QA commented on HADOOP-15681:
------------------------------------

| (x) *{color:red}-1 overall{color}* |
\\
\\
|| Vote || Subsystem || Runtime || Comment ||
| {color:blue}0{color} | {color:blue} reexec {color} | {color:blue}  0m 22s{color} | {color:blue} Docker mode activated. {color} |
|| || || || {color:brown} Prechecks {color} ||
| {color:green}+1{color} | {color:green} @author {color} | {color:green}  0m  0s{color} | {color:green} The patch does not contain any @author tags. {color} |
| {color:red}-1{color} | {color:red} test4tests {color} | {color:red}  0m  0s{color} | {color:red} The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. {color} |
|| || || || {color:brown} trunk Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green} 22m  8s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 16m 37s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 25s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green}  0m 36s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 13m 11s{color} | {color:green} branch has no errors when building and testing our client artifacts. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  0m 40s{color} | {color:green} trunk passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  0m 28s{color} | {color:green} trunk passed {color} |
|| || || || {color:brown} Patch Compile Tests {color} ||
| {color:green}+1{color} | {color:green} mvninstall {color} | {color:green}  0m 25s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} compile {color} | {color:green} 15m  8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javac {color} | {color:green} 15m  8s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} checkstyle {color} | {color:green}  0m 25s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} mvnsite {color} | {color:green}  0m 34s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} whitespace {color} | {color:green}  0m  0s{color} | {color:green} The patch has no whitespace issues. {color} |
| {color:green}+1{color} | {color:green} shadedclient {color} | {color:green} 12m  9s{color} | {color:green} patch has no errors when building and testing our client artifacts. {color} |
| {color:green}+1{color} | {color:green} findbugs {color} | {color:green}  0m 50s{color} | {color:green} the patch passed {color} |
| {color:green}+1{color} | {color:green} javadoc {color} | {color:green}  0m 26s{color} | {color:green} the patch passed {color} |
|| || || || {color:brown} Other Tests {color} ||
| {color:green}+1{color} | {color:green} unit {color} | {color:green}  3m 13s{color} | {color:green} hadoop-auth in the patch passed. {color} |
| {color:green}+1{color} | {color:green} asflicense {color} | {color:green}  0m 41s{color} | {color:green} The patch does not generate ASF License warnings. {color} |
| {color:black}{color} | {color:black} {color} | {color:black} 89m 12s{color} | {color:black} {color} |
\\
\\
|| Subsystem || Report/Notes ||
| Docker | Client=17.05.0-ce Server=17.05.0-ce Image:yetus/hadoop:8f97d6f |
| JIRA Issue | HADOOP-15681 |
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12936239/HADOOP-15681.patch |
| Optional Tests |  dupname  asflicense  compile  javac  javadoc  mvninstall  mvnsite  unit  shadedclient  findbugs  checkstyle  |
| uname | Linux 5e7cf1396bb7 3.13.0-144-generic #193-Ubuntu SMP Thu Mar 15 17:03:53 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /testptch/patchprocess/precommit/personality/provided.sh |
| git revision | trunk / e7b63ba |
| maven | version: Apache Maven 3.3.9 |
| Default Java | 1.8.0_181 |
| findbugs | v3.1.0-RC1 |
|  Test Results | https://builds.apache.org/job/PreCommit-HADOOP-Build/15510/testReport/ |
| Max. process+thread count | 339 (vs. ulimit of 10000) |
| modules | C: hadoop-common-project/hadoop-auth U: hadoop-common-project/hadoop-auth |
| Console output | https://builds.apache.org/job/PreCommit-HADOOP-Build/15510/console |
| Powered by | Apache Yetus 0.8.0   http://yetus.apache.org |


This message was automatically generated.



> AuthenticationFilter should generate valid date format for Set-Cookie header regardless of default Locale
> ---------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-15681
>                 URL: https://issues.apache.org/jira/browse/HADOOP-15681
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: security
>    Affects Versions: 3.2.0
>            Reporter: Cao Manh Dat
>            Priority: Minor
>         Attachments: HADOOP-15681.patch
>
>
> Hi guys,
> When I tried to set up Hadoop Kerberos authentication for Solr (with HTTP/2), I encountered this exception:
> {code}
> java.lang.IllegalArgumentException: null
> 	at org.eclipse.jetty.http2.hpack.Huffman.octetsNeeded(Huffman.java:435) ~[http2-hpack-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.hpack.Huffman.octetsNeeded(Huffman.java:409) ~[http2-hpack-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.hpack.HpackEncoder.encodeValue(HpackEncoder.java:368) ~[http2-hpack-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.hpack.HpackEncoder.encode(HpackEncoder.java:302) ~[http2-hpack-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.hpack.HpackEncoder.encode(HpackEncoder.java:179) ~[http2-hpack-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.generator.HeadersGenerator.generateHeaders(HeadersGenerator.java:72) ~[http2-common-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.generator.HeadersGenerator.generate(HeadersGenerator.java:56) ~[http2-common-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.generator.Generator.control(Generator.java:80) ~[http2-common-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.HTTP2Session$ControlEntry.generate(HTTP2Session.java:1163) ~[http2-common-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.HTTP2Flusher.process(HTTP2Flusher.java:184) ~[http2-common-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.IteratingCallback.processing(IteratingCallback.java:241) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.IteratingCallback.iterate(IteratingCallback.java:224) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.HTTP2Session.frame(HTTP2Session.java:685) ~[http2-common-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.HTTP2Session.frames(HTTP2Session.java:657) ~[http2-common-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.HTTP2Stream.headers(HTTP2Stream.java:107) ~[http2-common-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.server.HttpTransportOverHTTP2.sendHeadersFrame(HttpTransportOverHTTP2.java:235) ~[http2-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.server.HttpTransportOverHTTP2.send(HttpTransportOverHTTP2.java:134) ~[http2-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpChannel.sendResponse(HttpChannel.java:790) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpChannel.write(HttpChannel.java:846) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpOutput.write(HttpOutput.java:240) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpOutput.write(HttpOutput.java:216) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpOutput.close(HttpOutput.java:298) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpWriter.close(HttpWriter.java:49) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.ResponseWriter.close(ResponseWriter.java:163) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.Response.closeOutput(Response.java:1038) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ErrorHandler.generateAcceptableResponse(ErrorHandler.java:178) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ErrorHandler.doError(ErrorHandler.java:142) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ErrorHandler.handle(ErrorHandler.java:78) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.Response.sendError(Response.java:655) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at javax.servlet.http.HttpServletResponseWrapper.sendError(HttpServletResponseWrapper.java:158) ~[javax.servlet-api-3.1.0.jar:3.1.0]
> 	at javax.servlet.http.HttpServletResponseWrapper.sendError(HttpServletResponseWrapper.java:158) ~[javax.servlet-api-3.1.0.jar:3.1.0]
> 	at org.apache.hadoop.security.authentication.server.AuthenticationFilter.doFilter(AuthenticationFilter.java:580) ~[hadoop-auth-2.7.4.jar:?]
> 	at org.apache.solr.security.DelegationTokenKerberosFilter.doFilter(DelegationTokenKerberosFilter.java:134) ~[java/:?]
> 	at org.apache.solr.security.KerberosPlugin.doAuthenticate(KerberosPlugin.java:270) ~[java/:?]
> 	at org.apache.solr.servlet.SolrDispatchFilter.authenticateRequest(SolrDispatchFilter.java:452) ~[java/:?]
> 	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:347) ~[java/:?]
> 	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:324) ~[java/:?]
> 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:140) ~[java/:?]
> 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1317) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473) ~[jetty-servlet-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1219) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:674) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.Server.handle(Server.java:531) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:352) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:293) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:132) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.HTTP2Connection.produce(HTTP2Connection.java:178) ~[http2-common-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http2.server.HTTP2ServerConnection.onOpen(HTTP2ServerConnection.java:148) ~[http2-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.io.AbstractEndPoint.upgrade(AbstractEndPoint.java:440) ~[jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpConnection.onCompleted(HttpConnection.java:385) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpChannelOverHttp.upgrade(HttpChannelOverHttp.java:481) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpChannelOverHttp.headerComplete(HttpChannelOverHttp.java:372) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http.HttpParser.handleHeaderContentMessage(HttpParser.java:594) ~[jetty-http-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http.HttpParser.parseFields(HttpParser.java:1219) ~[jetty-http-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:1508) ~[jetty-http-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpConnection.parseRequestBuffer(HttpConnection.java:360) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:250) ~[jetty-server-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:281) ~[jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:102) ~[jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118) ~[jetty-io-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:132) ~[jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:762) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:680) [jetty-util-9.4.11.v20180605.jar:9.4.11.v20180605]
> 	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
> {code}
> This error comes from Jetty's HPACK encoder when it tries to compress this header:
> {code}
>       Set-Cookie: hadoop.auth=; Path=/; Domain=127.0.0.1; Expires=Déar, 01-Ean-1970 00:00:00 GMT; HttpOnly
> {code}
> "Déar" means Thursday in the Irish locale, and Jetty's HPACK encoder can't encode the non-ASCII character. That header is generated by org.apache.hadoop.security.authentication.server.AuthenticationFilter.createAuthCookie().
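> For illustration (an editor-added sketch, not the Hadoop source), the non-ASCII value can be reproduced with a {{SimpleDateFormat}} created without an explicit Locale, so it falls back to the JVM default; whether the exact strings "Déar"/"Ean" appear depends on the JDK's Irish locale data:
> {code}
> import java.text.SimpleDateFormat;
> import java.util.Date;
> import java.util.Locale;
> import java.util.TimeZone;
>
> public class LocaleCookieDateRepro {
>   public static void main(String[] args) {
>     // Simulate a server whose JVM default locale is Irish (ga).
>     Locale.setDefault(new Locale("ga"));
>
>     // A formatter created without an explicit Locale uses the default one,
>     // so day/month names may come out in Irish and the Set-Cookie value
>     // is no longer US-ASCII, which is what trips Jetty's HPACK encoder.
>     SimpleDateFormat df = new SimpleDateFormat("EEE, dd-MMM-yyyy HH:mm:ss zzz");
>     df.setTimeZone(TimeZone.getTimeZone("GMT"));
>     System.out.println("Expires=" + df.format(new Date(0L)));
>   }
> }
> {code}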
> I posted this problem to the Jetty community, and Greg said (https://github.com/eclipse/jetty.project/issues/2815):
> {quote}
> I'm pretty sure that unicode characters are not legal for HTTP field values, as RFC7230 says:
>    Historically, HTTP has allowed field content with text in the
>    ISO-8859-1 charset [ISO-8859-1], supporting other charsets only
>    through use of [RFC2047] encoding.  In practice, most HTTP header
>    field values use only a subset of the US-ASCII charset [USASCII].
>    Newly defined header fields SHOULD limit their field values to
>    US-ASCII octets.  A recipient SHOULD treat other octets in field
>    content (obs-text) as opaque data.
> So I don't think that header is legal... but it should not fail in hpack, whose RFC says it should treat fields as opaque octets!
> {quote}
> Therefore I think preventing {{AuthenticationFilter}} from generating Unicode characters would be a good idea.
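> As a hedged sketch of that direction (a hypothetical helper, not the actual HADOOP-15681 patch), the Expires attribute can be built with an explicit {{Locale.US}} formatter so the cookie value stays US-ASCII regardless of the server's default locale:
> {code}
> import java.text.SimpleDateFormat;
> import java.util.Date;
> import java.util.Locale;
> import java.util.TimeZone;
>
> /** Hypothetical helper: locale-independent Expires value for Set-Cookie. */
> final class CookieExpires {
>   private CookieExpires() {}
>
>   static String format(long expiresMillis) {
>     // Locale.US guarantees ASCII day/month names ("Thu", "Jan"),
>     // independent of the JVM's default locale.
>     SimpleDateFormat df =
>         new SimpleDateFormat("EEE, dd-MMM-yyyy HH:mm:ss zzz", Locale.US);
>     df.setTimeZone(TimeZone.getTimeZone("GMT"));
>     return df.format(new Date(expiresMillis));
>   }
>
>   public static void main(String[] args) {
>     // Expired cookie used to clear hadoop.auth, as in the header above.
>     System.out.println("hadoop.auth=; Path=/; Expires=" + format(0L) + "; HttpOnly");
>   }
> }
> {code}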



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org