Posted to issues@spark.apache.org by "Joe Pallas (JIRA)" <ji...@apache.org> on 2018/04/01 01:27:00 UTC

[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

    [ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16421548#comment-16421548 ] 

Joe Pallas commented on SPARK-22393:
------------------------------------

The changes brought in by [https://github.com/apache/spark/pull/19846] don't seem to cover all the cases that the corresponding Scala 2.12 changes covered. Specifically, this sequence:
{code}
import scala.reflect.runtime.{universe => ru}
import ru.TypeTag
class C[T: TypeTag](value: T)
{code}
works correctly in the Scala 2.12 REPL with {{-Yrepl-class-based}}, but fails in spark-shell 2.3.0.
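
For what it's worth, collapsing the two lines into a single import removes the inter-import dependency entirely, so I would expect this variant to work even on 2.3.0 (a workaround sketch I haven't verified in spark-shell, not a fix):
{code}
// Single-step import from a stable path; no earlier REPL line is needed.
import scala.reflect.runtime.universe.TypeTag

class C[T: TypeTag](value: T)
{code}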

I don't understand the import-handling code well enough to pinpoint the problem, however. The REPL figures out that it needs the import for {{TypeTag}}, but it doesn't recognize that that import in turn depends on the previous one:
{noformat}
<console>:9: error: not found: value ru
import ru.TypeTag
       ^
{noformat}
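
My rough mental model of what goes wrong, with purely illustrative wrapper names (these are not the identifiers the REPL actually generates): a class-based REPL wraps each input line in a class and replays the imports the new line needs, but here the replay picks up {{import ru.TypeTag}} without the earlier import that brings {{ru}} into scope:
{code}
// Hypothetical sketch of class-based REPL wrapping; names are illustrative.
class Line1 {
  import scala.reflect.runtime.{universe => ru}   // user line 1
}

// What the error message suggests gets generated for the class definition:
// only the directly needed import is replayed, leaving `ru` unresolved:
//   class Line3 {
//     import ru.TypeTag                 // error: not found: value ru
//     class C[T: TypeTag](value: T)
//   }

// What a correct replay would have to produce: the dependency comes first.
class Line3 {
  import scala.reflect.runtime.{universe => ru}   // transitive import replayed
  import ru.TypeTag
  class C[T: TypeTag](value: T)
}
{code}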


> spark-shell can't find imported types in class constructors, extends clause
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-22393
>                 URL: https://issues.apache.org/jira/browse/SPARK-22393
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.0.2, 2.1.2, 2.2.0
>            Reporter: Ryan Williams
>            Assignee: Mark Petruska
>            Priority: Minor
>             Fix For: 2.3.0
>
>
> {code}
> $ spark-shell
> …
> scala> import org.apache.spark.Partition
> import org.apache.spark.Partition
> scala> class P(p: Partition)
> <console>:11: error: not found: type Partition
>        class P(p: Partition)
>                   ^
> scala> class P(val index: Int) extends Partition
> <console>:11: error: not found: type Partition
>        class P(val index: Int) extends Partition
>                                        ^
> {code}
> Any class that I {{import}} gives "not found: type ___" when used as a parameter to a class or in an extends clause; this applies to classes I import from JARs I provide via {{--jars}}, as well as to core Spark classes as above.
> This worked in 1.6.3 but has been broken since 2.0.0.


