Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2020/12/01 17:38:00 UTC

[jira] [Resolved] (SPARK-33611) Decode Query parameters of the redirect URL for reverse proxy

     [ https://issues.apache.org/jira/browse/SPARK-33611?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Gengliang Wang resolved SPARK-33611.
------------------------------------
    Fix Version/s: 3.0.2
                   3.1.0
       Resolution: Fixed

Issue resolved by pull request 30552
[https://github.com/apache/spark/pull/30552]

> Decode Query parameters of the redirect URL for reverse proxy
> -------------------------------------------------------------
>
>                 Key: SPARK-33611
>                 URL: https://issues.apache.org/jira/browse/SPARK-33611
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 3.0.0, 3.1.0
>            Reporter: Gengliang Wang
>            Assignee: Gengliang Wang
>            Priority: Major
>             Fix For: 3.1.0, 3.0.2
>
>
> When running Spark with reverse proxy enabled, the query parameters of a request URL can be encoded twice: once by the browser and once more by the reverse proxy (e.g. Nginx).
> In Spark's stage page, the URL of "/taskTable" contains the query parameter order[0][dir]. After being encoded twice, the parameter becomes `order%255B0%255D%255Bdir%255D`, and a single decode yields `order%5B0%5D%5Bdir%5D` instead of `order[0][dir]`. As a result, there is a NullPointerException from https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/status/api/v1/StagesResource.scala#L176
> Beyond that, other query parameters may also behave unexpectedly after being encoded twice.
> We should decode the query parameters to fix the problem.
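The double-encoding effect described in the issue can be reproduced with plain JDK URL codecs (a minimal sketch, not Spark's actual reverse-proxy code path):

```java
import java.net.URLDecoder;
import java.net.URLEncoder;

public class DoubleEncodingDemo {
    public static void main(String[] args) throws Exception {
        String param = "order[0][dir]";

        // First encoding, done by the browser: [ -> %5B, ] -> %5D
        String once = URLEncoder.encode(param, "UTF-8");

        // Second encoding, done by the reverse proxy (e.g. Nginx): % -> %25
        String twice = URLEncoder.encode(once, "UTF-8");

        // The server decodes only one layer, so the brackets stay escaped
        String decodedOnce = URLDecoder.decode(twice, "UTF-8");

        System.out.println(once);        // order%5B0%5D%5Bdir%5D
        System.out.println(twice);       // order%255B0%255D%255Bdir%255D
        System.out.println(decodedOnce); // order%5B0%5D%5Bdir%5D, not order[0][dir]
    }
}
```

A server that expects the literal parameter name `order[0][dir]` will therefore fail to find it, which is why an extra decode of the query string is needed on the proxied path.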



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org