Posted to user@spark.apache.org by billchambers <wc...@ischool.berkeley.edu> on 2015/08/03 05:33:42 UTC

Cannot Import Package (spark-csv)

I am trying to import the spark-csv package while using the Scala Spark
shell. Spark 1.4.1, Scala 2.11

I am starting the shell with:

bin/spark-shell --packages com.databricks:spark-csv_2.11:1.1.0 --jars
../sjars/spark-csv_2.11-1.1.0.jar --master local


I then try to run



and I get the following error:



What am I doing wrong?



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-Import-Package-spark-csv-tp24109.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Cannot Import Package (spark-csv)

Posted by Burak Yavuz <br...@gmail.com>.
In addition, you do not need to use --jars with --packages. --packages will
get the jar for you.

Best,
Burak
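Burak's point can be sketched as a simplified launch command (my own sketch, assuming the same Spark 1.4.1 / Scala 2.11 setup and package coordinates from the original post):

```shell
# --packages alone is enough: spark-shell resolves the artifact from
# Maven Central (caching it locally under ~/.ivy2) and adds it to the
# shell's classpath, so the separate --jars flag is redundant.
bin/spark-shell --packages com.databricks:spark-csv_2.11:1.1.0 --master local
```

This requires a local Spark installation and network access on first run, so treat it as a configuration sketch rather than a verified invocation.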


Re: Cannot Import Package (spark-csv)

Posted by Burak Yavuz <br...@gmail.com>.
Hi, there was this issue for Scala 2.11.
https://issues.apache.org/jira/browse/SPARK-7944
It should be fixed on the master branch. You may be hitting that.

Best,
Burak


Re: Cannot Import Package (spark-csv)

Posted by Ted Yu <yu...@gmail.com>.
I tried the following command on the master branch:
bin/spark-shell --packages com.databricks:spark-csv_2.10:1.0.3 --jars
../spark-csv_2.10-1.0.3.jar --master local

I didn't reproduce the error with your command.

FYI


Re: Cannot Import Package (spark-csv)

Posted by Ted Yu <yu...@gmail.com>.
The command you ran and the error you got were not visible.

Mind sending them again?

Cheers


Re: Cannot Import Package (spark-csv)

Posted by billchambers <wc...@ischool.berkeley.edu>.
Sure, the commands are:

scala> val df =
sqlContext.read.format("com.databricks.spark.csv").option("header",
"true").load("cars.csv")

and I get the following error:

java.lang.RuntimeException: Failed to load class for data source:
com.databricks.spark.csv
  at scala.sys.package$.error(package.scala:27)
  at
org.apache.spark.sql.sources.ResolvedDataSource$.lookupDataSource(ddl.scala:220)
  at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:233)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:104)
  ... 49 elided
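The stack trace says the data source class could not be loaded, which points at a classpath problem rather than a bug in the read call itself. One quick diagnostic (my own sketch, not from the thread, assuming spark-csv 1.x exposes its entry point as `com.databricks.spark.csv.DefaultSource`) is to probe for that class from the shell:

```scala
// Hypothetical classpath probe: try to load spark-csv's entry-point
// class directly. If this reports "missing", the jar never reached the
// shell's classpath, matching the "Failed to load class for data
// source" error above.
val sparkCsvLoaded: Boolean =
  try {
    Class.forName("com.databricks.spark.csv.DefaultSource")
    true
  } catch {
    case _: ClassNotFoundException => false
  }

println(
  if (sparkCsvLoaded) "spark-csv found on classpath"
  else "spark-csv missing from classpath"
)
```

If the probe fails inside the shell, the problem is how the jar was supplied (the `--packages`/`--jars` flags or the Scala 2.11 issue Burak mentioned), not the `sqlContext.read` call.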



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-Import-Package-spark-csv-tp24109p24110.html