Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/11/05 11:57:00 UTC

[jira] [Resolved] (SPARK-29690) Spark Shell - Clear imports

     [ https://issues.apache.org/jira/browse/SPARK-29690?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-29690.
----------------------------------
    Resolution: Not A Problem

> Spark Shell  - Clear imports
> ----------------------------
>
>                 Key: SPARK-29690
>                 URL: https://issues.apache.org/jira/browse/SPARK-29690
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.2.0
>            Reporter: dinesh
>            Priority: Major
>
> I'm facing the problem below with the Spark shell. In a shell session:
>  # I imported the following: import scala.collection.immutable.HashMap
>  # Then I realized my mistake and imported the correct class: import java.util.HashMap
> But now I get the following error when running my code:
> <console>:34: error: reference to HashMap is ambiguous;
> it is imported twice in the same scope by
> import java.util.HashMap
> and import scala.collection.immutable.HashMap
>        val colMap = new HashMap[String, HashMap[String, String]]()
> I have a long-running Spark shell session that I do not want to close and reopen. Is there a way to clear the previous import and use the correct class?
> I know that we can also use the fully qualified name, like: val colMap = new java.util.HashMap[String, java.util.HashMap[String, String]]()
> But I'm looking for a way to clear an incorrectly loaded class.
>  
> I thought the Spark shell replays imports from history the same way the plain Scala REPL does. If so, the earlier HashMap import should be shadowed by the new import statement.
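
For anyone who lands here with the same error: Scala's import renaming lets both classes coexist in a single shell session, so there is no need to restart the shell or spell out fully qualified names everywhere. A minimal sketch, runnable in any Scala REPL; the JHashMap alias is an illustrative name, not something from the ticket:

    // Both imports remain in the session, but the rename removes the clash:
    import scala.collection.immutable.HashMap        // the accidental import
    import java.util.{HashMap => JHashMap}           // the intended class, under a new name

    // JHashMap now refers unambiguously to java.util.HashMap:
    val colMap = new JHashMap[String, JHashMap[String, String]]()
    colMap.put("outer", new JHashMap[String, String]())

After the rename, the simple name HashMap has only one binding left (the immutable Scala class), while JHashMap picks out the Java one, so neither use triggers the ambiguity error.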



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org