Posted to issues@spark.apache.org by "Ozioma Ihekwoaba (JIRA)" <ji...@apache.org> on 2016/08/18 08:46:20 UTC
[jira] [Created] (SPARK-17126) Errors setting driver classpath in spark-defaults.conf on Windows 7
Ozioma Ihekwoaba created SPARK-17126:
----------------------------------------
Summary: Errors setting driver classpath in spark-defaults.conf on Windows 7
Key: SPARK-17126
URL: https://issues.apache.org/jira/browse/SPARK-17126
Project: Spark
Issue Type: Question
Components: Spark Shell, SQL
Affects Versions: 1.6.1
Environment: Windows 7
Reporter: Ozioma Ihekwoaba
I am having issues starting up the Spark shell with a local hive-site.xml on Windows 7.
I have a local Hive 2.1.0 instance on Windows using a MySQL metastore.
The Hive instance is working fine.
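For reference, the relevant metastore entries in my hive-site.xml look like this (host, database, and credentials are placeholders for what is on my machine):

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://localhost:3306/metastore</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>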
I copied hive-site.xml into the conf folder of my local Spark 1.6.1 instance and also copied mysql-connector-java-5.1.25-bin.jar into the lib folder.
I was expecting Spark to pick up jar files in the lib folder automatically, but found out Spark expects the spark.driver.extraClassPath and spark.executor.extraClassPath settings to resolve extra jars.
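For concreteness, these are the kinds of entries I have been adding to conf\spark-defaults.conf (the jar path below is the one on my machine; as I understand it, Windows uses ";" as the classpath separator rather than ":"):

    spark.driver.extraClassPath    C:\hadoop\spark\v161\lib\mysql-connector-java-5.1.25-bin.jar
    spark.executor.extraClassPath  C:\hadoop\spark\v161\lib\mysql-connector-java-5.1.25-bin.jar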
However, this has failed for me on Windows with a DatastoreDriverNotFoundException saying com.mysql.jdbc.Driver could not be found.
Here are some of the different classpath values I've tried:
C:/hadoop/spark/v161/lib/mysql-connector-java-5.1.25-bin.jar;C:/hadoop/spark/v161/lib/commons-csv-1.4.jar;C:/hadoop/spark/v161/lib/spark-csv_2.11-1.4.0.jar
".;C:\hadoop\spark\v161\lib\*"
None of these has worked so far.
Please, what is the correct way to set driver classpaths on Windows?
Also, what is the correct file path format on Windows?
I have this working fine on Linux, but my current engagement requires me to run Spark on a Windows box.
Is there a way for Spark to automatically resolve jars from the lib folder in all modes?
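For what it's worth, I understand the launch-time equivalent would be something like the following (I have not verified this on my box yet):

    bin\spark-shell --driver-class-path C:\hadoop\spark\v161\lib\mysql-connector-java-5.1.25-bin.jar ^
                    --jars C:\hadoop\spark\v161\lib\mysql-connector-java-5.1.25-bin.jar

If that is the recommended route on Windows, I am happy to switch to it.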
Thanks.
Ozzy