Posted to user@spark.apache.org by Ashok Kumar <as...@yahoo.com.INVALID> on 2016/02/19 16:26:19 UTC

install databricks csv package for spark

 Hi,
I downloaded the zipped csv libraries from databricks/spark-csv
databricks/spark-csv - CSV data source for Spark SQL and DataFrames
<https://github.com/databricks/spark-csv>


Now I have a directory created called spark-csv-master. I would like to use this in spark-shell with --packages, like below:
$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.11:1.3.0
Do I need to use mvn to build a jar first, or should it be added to the Spark CLASSPATH? What is needed here to make it work?
thanks
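
For reference, once spark-shell is up with the package on the classpath, spark-csv 1.x is used through the DataFrame reader API. A minimal sketch (the file name people.csv is just an illustrative placeholder):

// In spark-shell on Spark 1.x, sqlContext is created automatically.
// Trailing dots keep the chained call REPL-friendly.
val df = sqlContext.read.
  format("com.databricks.spark.csv").
  option("header", "true").        // treat the first line as column names
  option("inferSchema", "true").   // infer column types from the data
  load("people.csv")               // hypothetical example file

df.printSchema()
df.show(5)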

Re: install databricks csv package for spark

Posted by Ashok Kumar <as...@yahoo.com.INVALID>.

Great, thank you.

    On Friday, 19 February 2016, 15:33, Holden Karau <ho...@pigscanfly.ca> wrote:
 

 So with --packages passed to spark-shell or spark-submit, Spark will automatically fetch the requirements from Maven. If you want to use an explicit local jar, you can do that with the --jars syntax. You might find http://spark.apache.org/docs/latest/submitting-applications.html useful.
On Fri, Feb 19, 2016 at 7:26 AM, Ashok Kumar <as...@yahoo.com.invalid> wrote:

 Hi,
I downloaded the zipped csv libraries from databricks/spark-csv
databricks/spark-csv - CSV data source for Spark SQL and DataFrames
<https://github.com/databricks/spark-csv>


Now I have a directory created called spark-csv-master. I would like to use this in spark-shell with --packages, like below:
$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.11:1.3.0
Do I need to use mvn to build a jar first, or should it be added to the Spark CLASSPATH? What is needed here to make it work?
thanks




-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau

  

Re: install databricks csv package for spark

Posted by Holden Karau <ho...@pigscanfly.ca>.
So with --packages passed to spark-shell or spark-submit, Spark will
automatically fetch the requirements from Maven. If you want to use an
explicit local jar, you can do that with the --jars syntax. You might find
http://spark.apache.org/docs/latest/submitting-applications.html useful.
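
Concretely, the two routes look like this. A minimal sketch, assuming the
spark-csv-master checkout builds with sbt (the build command and jar path
are assumptions, and note that --jars, unlike --packages, does not resolve
transitive dependencies for you):

# Option 1: let Spark resolve the package and its dependencies from Maven
$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.11:1.3.0

# Option 2: build a jar from the downloaded source and pass it explicitly
cd spark-csv-master
sbt package    # assumed build step; emits target/scala-2.11/spark-csv_2.11-<version>.jar
$SPARK_HOME/bin/spark-shell --jars target/scala-2.11/spark-csv_2.11-1.3.0.jar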

On Fri, Feb 19, 2016 at 7:26 AM, Ashok Kumar <as...@yahoo.com.invalid>
wrote:

> Hi,
>
> I downloaded the zipped csv libraries from databricks/spark-csv
> <https://github.com/databricks/spark-csv>
>
>
> Now I have a directory created called spark-csv-master. I would like to
> use this in spark-shell with --packages, like below:
>
> $SPARK_HOME/bin/spark-shell --packages com.databricks:spark-csv_2.11:1.3.0
>
> Do I need to use mvn to build a jar first, or should it be added to the
> Spark CLASSPATH? What is needed here to make it work?
>
> thanks
>



-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau