Posted to user@spark.apache.org by Jörn Franke <jo...@gmail.com> on 2017/04/06 13:12:34 UTC

Re: Error while reading the CSV

Is the library in your assembly jar?

> On 6. Apr 2017, at 15:06, nayan sharma <na...@gmail.com> wrote:
> 
> Hi All,
> I am getting error while loading CSV file.
> 
> val datacsv=sqlContext.read.format("com.databricks.spark.csv").option("header", "true").load("timeline.csv")
> java.lang.NoSuchMethodError: org.apache.commons.csv.CSVFormat.withQuote(Ljava/lang/Character;)Lorg/apache/commons/csv/CSVFormat;
> 
> 
> I have added the dependencies in sbt file 
> // Spark Additional Library - CSV Read as DF
> libraryDependencies += "com.databricks" %% "spark-csv" % "1.5.0"
> and starting the spark-shell with command
> 
> spark-shell --master yarn-client  --jars /opt/packages/xxxx-data-prepration/target/scala-2.10/xxxx-data-prepration-assembly-1.0.jar --name nayan 
> 
> 
> 
> Thanks for any help!!
> 
> 
> Thanks,
> Nayan
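
[Editor's note: Jörn's question — is the library actually on the classpath? — can also be probed at runtime. The `NoSuchMethodError` above often means an older commons-csv (which lacks `CSVFormat.withQuote`) is shadowing the version spark-csv needs. A minimal sketch; the `hasMethod` helper is mine, not from the thread:]

```scala
// Reflection probe: is a class with the given method visible on the current
// classpath? Handy for diagnosing NoSuchMethodError version conflicts like
// the CSVFormat.withQuote one above. (Hypothetical helper, not from the thread.)
def hasMethod(className: String, method: String): Boolean =
  try Class.forName(className).getMethods.exists(_.getName == method)
  catch { case _: ClassNotFoundException => false }

// In spark-shell one could then check, e.g.:
// hasMethod("org.apache.commons.csv.CSVFormat", "withQuote")
```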

Re: Error while reading the CSV

Posted by Yash Sharma <ya...@gmail.com>.
Sorry buddy, didn't get your question quite right.
Just to test, I created a Scala class with spark-csv and it seemed to work.

Don't know if that helps much, but here are the env details:
EMR 2.7.3
scalaVersion := "2.11.8"
Spark version 2.0.2



On Fri, 7 Apr 2017 at 17:51 nayan sharma <na...@gmail.com> wrote:

> Hi Yash,
> I know this will work perfectly, but here I wanted to read the CSV using
> the assembly jar file.
>
> Thanks,
> Nayan

Re: Error while reading the CSV

Posted by nayan sharma <na...@gmail.com>.
Hi Yash,
I know this will work perfectly, but here I wanted to read the CSV using the assembly jar file.

Thanks,
Nayan

> On 07-Apr-2017, at 10:02 AM, Yash Sharma <ya...@gmail.com> wrote:
> 
> Hi Nayan,
> I use the --packages with the spark shell and the spark submit. Could you please try that and let us know:
> Command:
> spark-submit --packages com.databricks:spark-csv_2.11:1.4.0


Re: Error while reading the CSV

Posted by Yash Sharma <ya...@gmail.com>.
Hi Nayan,
I use the --packages with the spark shell and the spark submit. Could you
please try that and let us know:
Command:

spark-submit --packages com.databricks:spark-csv_2.11:1.4.0
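
[Editor's note: the `_2.11` suffix in the coordinate must match the Scala binary version Spark was built against; with the Spark 1.6.2 / Scala 2.10 setup mentioned later in the thread it would be `spark-csv_2.10:1.5.0`. A sketch of deriving the coordinate; the helper name is mine:]

```scala
// Hypothetical helper: build the --packages coordinate from the cluster's
// Scala version, since the artifact suffix must match Spark's Scala binary
// version (spark-csv_2.11 will not load on a Scala 2.10 build of Spark).
def sparkCsvCoordinate(scalaFullVersion: String, csvVersion: String): String = {
  val binary = scalaFullVersion.split('.').take(2).mkString(".") // "2.10.5" -> "2.10"
  s"com.databricks:spark-csv_$binary:$csvVersion"
}
```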


On Fri, 7 Apr 2017 at 00:39 nayan sharma <na...@gmail.com> wrote:

> spark version 1.6.2
> scala version 2.10.5

Re: Error while reading the CSV

Posted by nayan sharma <na...@gmail.com>.
spark version 1.6.2
scala version 2.10.5

> On 06-Apr-2017, at 8:05 PM, Jörn Franke <jo...@gmail.com> wrote:
> 
> And which version does your Spark cluster use?


Re: Error while reading the CSV

Posted by Jörn Franke <jo...@gmail.com>.
And which version does your Spark cluster use?

> On 6. Apr 2017, at 16:11, nayan sharma <na...@gmail.com> wrote:
> 
> scalaVersion := "2.10.5"

Re: Error while reading the CSV

Posted by nayan sharma <na...@gmail.com>.
scalaVersion := "2.10.5"




> On 06-Apr-2017, at 7:35 PM, Jörn Franke <jo...@gmail.com> wrote:
> 
> Maybe your Spark is based on scala 2.11, but you compile it for 2.10 or the other way around?


Re: Error while reading the CSV

Posted by Jörn Franke <jo...@gmail.com>.
Maybe your Spark is built against Scala 2.11 but you compiled for 2.10, or the other way around?
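
[Editor's note: a consistent build for the versions given in the thread (Spark 1.6.2, Scala 2.10.5) might look like the sketch below — assuming the cluster's Spark really is a Scala 2.10 build:]

```scala
// build.sbt sketch (assumption: Spark 1.6.2 cluster built against Scala 2.10)
scalaVersion := "2.10.5"

// %% appends the Scala binary suffix, so this resolves to spark-csv_2.10
libraryDependencies += "com.databricks" %% "spark-csv" % "1.5.0"
```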

> On 6. Apr 2017, at 15:54, nayan sharma <na...@gmail.com> wrote:
> 
> In addition, I am using Spark version 1.6.2.
> Could the error be due to a Scala version or dependency mismatch? I am just guessing.
> 
> Thanks,
> Nayan

Re: Error while reading the CSV

Posted by nayan sharma <na...@gmail.com>.
In addition, I am using Spark version 1.6.2.
Could the error be due to a Scala version or dependency mismatch? I am just guessing.

Thanks,
Nayan

> On 06-Apr-2017, at 7:16 PM, nayan sharma <na...@gmail.com> wrote:
> 
> Hi Jorn,
> Thanks for replying.
> 
> jar -tf catalyst-data-prepration-assembly-1.0.jar | grep csv
> 
> After running this I found many classes under
> com/databricks/spark/csv/
> 
> Do I need to check for any specific class?
> 
> Regards,
> Nayan


Re: Error while reading the CSV

Posted by nayan sharma <na...@gmail.com>.
Hi Jorn,
Thanks for replying.

jar -tf catalyst-data-prepration-assembly-1.0.jar | grep csv

After running this I found many classes under
com/databricks/spark/csv/

Do I need to check for any specific class?

Regards,
Nayan
> On 06-Apr-2017, at 6:42 PM, Jörn Franke <jo...@gmail.com> wrote:
> 
> Is the library in your assembly jar?
> 