Posted to issues@spark.apache.org by "Mark Petruska (JIRA)" <ji...@apache.org> on 2017/11/19 10:59:00 UTC

[jira] [Commented] (SPARK-22393) spark-shell can't find imported types in class constructors, extends clause

    [ https://issues.apache.org/jira/browse/SPARK-22393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16258441#comment-16258441 ] 

Mark Petruska commented on SPARK-22393:
---------------------------------------

Tested with a spark-shell built for Scala 2.11; each case below was run after first executing:

{code}
import org.apache.spark.Partition
{code}


|| Code || Result ||
| {code} class P(p: Partition) {code}  | {color:red} not found: type Partition {color} |
| {code} class P(x: Int) extends Partition {code} | {color:red} not found: type Partition {color} |
| {code} var p: Partition = _ {code}   | {color:green} OK {color} |
| {code} def a(p: Partition): Int = 0 {code} | {color:green} OK {color} |
| {code} def a(): Partition = p {code} | {color:green} OK {color} |
| {code}
object P {
  class P(p: Partition)
} {code} | {color:green} OK {color} |
| {code}
:paste
import org.apache.spark.Partition
class P(p: Partition)
{code} | {color:green} OK {color} |
| {code}
type Partition = org.apache.spark.Partition
class P(p: Partition)
{code} | {color:green} OK {color} |
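
For reference, an illustrative transcript of the type-alias workaround from the last row (the REPL echo lines are approximate):
{code}
scala> type Partition = org.apache.spark.Partition
defined type alias Partition

scala> class P(p: Partition)
defined class P
{code}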


To start spark-shell and the underlying Scala REPL with the _trace_, _info_ and _debug_ settings enabled:
{code}
$JAVA_HOME/bin/java \
  -cp "$SPARK_HOME/conf/:$SPARK_HOME/assembly/target/scala-2.11/jars/*" \
  -Dscala.usejavacp=true \
  -Dscala.repl.trace=true -Dscala.repl.info=true -Dscala.repl.debug=true \
  -Dscala.repl.prompt="spark>>>  " \
  -Xmx1g \
  org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name "Spark shell" spark-shell
{code}
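
The same system properties should also be settable through the regular launcher via {{--driver-java-options}} (a sketch, not verified here):
{code}
$SPARK_HOME/bin/spark-shell \
  --driver-java-options "-Dscala.repl.trace=true -Dscala.repl.info=true -Dscala.repl.debug=true"
{code}
With {{scala.repl.debug}} enabled the REPL prints the wrapper code it generates for each interpreted line, which should make it visible where the imported type stops being resolvable in the failing cases.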

> spark-shell can't find imported types in class constructors, extends clause
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-22393
>                 URL: https://issues.apache.org/jira/browse/SPARK-22393
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.0.2, 2.1.2, 2.2.0
>            Reporter: Ryan Williams
>            Priority: Minor
>
> {code}
> $ spark-shell
> …
> scala> import org.apache.spark.Partition
> import org.apache.spark.Partition
> scala> class P(p: Partition)
> <console>:11: error: not found: type Partition
>        class P(p: Partition)
>                   ^
> scala> class P(val index: Int) extends Partition
> <console>:11: error: not found: type Partition
>        class P(val index: Int) extends Partition
>                                        ^
> {code}
> Any class that I {{import}} gives a "not found: type ___" error when used as a parameter to a class or in an extends clause; this applies to classes imported from JARs provided via {{--jars}} as well as to core Spark classes, as shown above.
> This worked in 1.6.3 but has been broken since 2.0.0.


