Posted to user@spark.apache.org by Dan Bikle <bi...@gmail.com> on 2016/09/23 20:26:31 UTC

databricks spark-csv: linking coordinates are what?

hello world-of-spark,

I am learning spark today.

I want to understand the spark code in this repo:

https://github.com/databricks/spark-csv

In the README.md I see this info:

Linking

You can link against this library in your program at the following
coordinates:
Scala 2.10

groupId: com.databricks
artifactId: spark-csv_2.10
version: 1.5.0

Scala 2.11

groupId: com.databricks
artifactId: spark-csv_2.11
version: 1.5.0

I want to know how I can use the above info.

The people who wrote spark-csv should give some kind of example, demo, or
context.

My understanding of Linking is limited.

I have some experience operating sbt which I learned from this URL:

http://spark.apache.org/docs/latest/quick-start.html#self-contained-applications

The above URL does not give me enough information so that I can link
spark-csv with spark.

Question:
How do I learn how to use the info in the Linking section of the README.md
of
https://github.com/databricks/spark-csv
??

Re: databricks spark-csv: linking coordinates are what?

Posted by Anastasios Zouzias <zo...@gmail.com>.
Hi Dan,

If you use Spark <= 1.6, you can also run

$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.10:1.5.0

to quickly link the spark-csv jars into the Spark shell. Otherwise, as Holden
suggested, you can link it via your Maven/sbt dependencies. The Spark
developers assume their users have a good working knowledge of Maven/sbt;
you may need to read up on these before jumping into Spark.

Best,
Anastasios

On Fri, Sep 23, 2016 at 10:26 PM, Dan Bikle <bi...@gmail.com> wrote:

> ...

-- 
-- Anastasios Zouzias

Re: databricks spark-csv: linking coordinates are what?

Posted by Holden Karau <ho...@pigscanfly.ca>.
So the good news is that the CSV library has been integrated into Spark 2.0,
so you don't need to use that package. On the other hand, if you're on an
older version, you can include it using the standard sbt or Maven package
configuration.
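For the Spark 2.0+ case, CSV reading is built into the DataFrameReader, so no extra package is needed. A sketch (the file path, options, and app name are illustrative assumptions):

```scala
// Spark 2.0+: CSV support is built in, no --packages flag required
import org.apache.spark.sql.SparkSession

object CsvExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("csv-example")
      .master("local[*]")        // local mode for experimenting
      .getOrCreate()

    // Replaces the old spark-csv "com.databricks.spark.csv" data source
    val df = spark.read
      .option("header", "true")       // treat first line as column names
      .option("inferSchema", "true")  // guess column types from the data
      .csv("people.csv")              // path is illustrative

    df.printSchema()
    spark.stop()
  }
}
```

On Spark 1.x with spark-csv linked, the equivalent call is `sqlContext.read.format("com.databricks.spark.csv")` with the same options.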

On Friday, September 23, 2016, Dan Bikle <bi...@gmail.com> wrote:

> ...

-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau