Posted to dev@spark.apache.org by Jacek Laskowski <ja...@japila.pl> on 2016/04/05 02:16:39 UTC

error: reference to sql is ambiguous after import org.apache.spark._ in shell?

Hi Spark devs,

I'm unsure if what I'm seeing is correct. I'd appreciate any input
to...rest my nerves :-) I did `import org.apache.spark._` by mistake,
but since it's valid, I'm wondering why the Spark shell imports sql
at all since it's available after the import?!

(it's today's build)

scala> sql("SELECT * FROM dafa").show(false)
<console>:30: error: reference to sql is ambiguous;
it is imported twice in the same scope by
import org.apache.spark._
and import sqlContext.sql
       sql("SELECT * FROM dafa").show(false)
       ^

scala> :imports
 1) import sqlContext.implicits._  (52 terms, 31 are implicit)
 2) import sqlContext.sql          (1 terms)

scala> sc.version
res19: String = 2.0.0-SNAPSHOT
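
FWIW, a qualified call works around the clash (a minimal sketch against the
same test table as above):

scala> sqlContext.sql("SELECT * FROM dafa").show(false)

and doing the wildcard import with sql hidden avoids the clash in the first
place:

scala> import org.apache.spark.{sql => _, _}

but I'm still curious why the shell sets up the import at all.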

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski



Re: error: reference to sql is ambiguous after import org.apache.spark._ in shell?

Posted by Ted Yu <yu...@gmail.com>.
Looks like the import comes from
repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala :

      processLine("import sqlContext.sql")
