Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2019/06/10 19:00:00 UTC

[jira] [Updated] (SPARK-27947) Enhance redactOptions to accept any Map type

     [ https://issues.apache.org/jira/browse/SPARK-27947?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-27947:
----------------------------------
    Summary: Enhance redactOptions to accept any Map type  (was: ParsedStatement subclass toString may throw ClassCastException)

> Enhance redactOptions to accept any Map type
> --------------------------------------------
>
>                 Key: SPARK-27947
>                 URL: https://issues.apache.org/jira/browse/SPARK-27947
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: John Zhuge
>            Priority: Minor
>
> In ParsedStatement.productIterator, `case mapArg: Map[_, _]` matches any Map type, so the subsequent `asInstanceOf[Map[String, String]]` leads to a ClassCastException when the map's values are not actually Strings.
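> The offending match looks roughly like the sketch below. This is a paraphrase of the pattern described above, not the exact Spark source; in particular, `StatementSketch` and the abstract `redactOptions` stand-in are hypothetical names used only for illustration:
> {code:java}
> // Paraphrased sketch of the kind of code described above (not the exact Spark source).
> abstract class StatementSketch extends Product {
>   // Hypothetical stand-in for the real redaction call; the actual receiver differs.
>   protected def redactOptions(options: Map[String, String]): Map[String, String]
>
>   // The unchecked pattern matches ANY Map, while the cast assumes Map[String, String],
>   // so a Map with non-String values later fails with a ClassCastException.
>   override def productIterator: Iterator[Any] = super.productIterator.map {
>     case mapArg: Map[_, _] =>
>       redactOptions(mapArg.asInstanceOf[Map[String, String]])
>     case other => other
>   }
> }{code}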
> The following test reproduces the issue:
> {code:java}
> case class TestStatement(p: Map[String, Int]) extends ParsedStatement {
>   override def output: Seq[Attribute] = Nil
>   override def children: Seq[LogicalPlan] = Nil
> }
> TestStatement(Map("abc" -> 1)).toString{code}
> Changing the code to `case mapArg: Map[String, String]` will not work due to type erasure. In fact, the compiler gives this warning:
> {noformat}
> Warning:(41, 18) non-variable type argument String in type pattern scala.collection.immutable.Map[String,String] (the underlying of Map[String,String]) is unchecked since it is eliminated by erasure
> case mapArg: Map[String, String] =>{noformat}
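> The summary change points at one possible fix direction: make redactOptions generic so it accepts any Map type and never needs an unchecked cast at the call site. The sketch below is only an illustration under that assumption; the signature, the extra regex parameter, and the placeholder text are not the final Spark API:
> {code:java}
> import scala.util.matching.Regex
>
> object RedactOptionsSketch {
>   // Hypothetical generic variant: keys of any type are matched on their string
>   // form, so callers no longer need asInstanceOf[Map[String, String]].
>   def redactOptions[K, V](options: Map[K, V], redactionPattern: Regex): Map[K, Any] =
>     options.map {
>       case (key, _) if redactionPattern.findFirstIn(key.toString).isDefined =>
>         key -> "*********(redacted)"
>       case kv => kv
>     }
>
>   def main(args: Array[String]): Unit = {
>     // A Map[String, Int] such as the one in TestStatement now works without
>     // throwing a ClassCastException.
>     println(redactOptions(Map("abc" -> 1, "my.password" -> 42), "(?i)secret|password".r))
>   }
> }{code}
> Returning Map[K, Any] rather than forcing String values keeps non-sensitive entries untouched, which is what a toString-style caller needs.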
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org