Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/11/21 11:58:58 UTC

[jira] [Resolved] (SPARK-18524) Cannot create dataframe on jdbc data source from spark 2.0.2

     [ https://issues.apache.org/jira/browse/SPARK-18524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-18524.
-------------------------------
    Resolution: Not A Problem

This is a Hadoop + Windows environment problem: it looks like you don't have winutils installed.
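For reference, a minimal sketch of the usual workaround, assuming winutils.exe is installed under C:\hadoop (that path, the app name, and the local master are assumptions, not details from this report):

// Usual Windows setup (paths below are assumptions):
//   1. Download a winutils.exe built for Hadoop 2.7.x and place it at C:\hadoop\bin\winutils.exe
//   2. Set HADOOP_HOME=C:\hadoop before launching spark-shell
//   3. Grant the Hive scratch dir the permissions the startup check expects:
//        C:\hadoop\bin\winutils.exe chmod -R 777 \tmp\hive

// If HADOOP_HOME cannot be set, this property works only when set before any
// Hadoop class is initialized (i.e. in a driver program, not mid spark-shell session):
System.setProperty("hadoop.home.dir", "C:\\hadoop")

// Spark 2.x: SparkSession replaces the deprecated SQLContext (hence the
// deprecation warning in the log below); the JDBC options are unchanged.
// The Sybase jConnect driver jar must also be on the classpath.
val spark = org.apache.spark.sql.SparkSession.builder()
  .appName("jdbc-read")
  .master("local[*]")
  .getOrCreate()

val jdbcDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:sybase:Tds:9.195.69.145:2048/master")
  .option("dbtable", "sysobjects")
  .option("user", "sa")
  .option("password", "******")
  .load()

After placing winutils.exe, restart spark-shell so HADOOP_HOME is picked up before any Hadoop class loads.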

> Cannot create dataframe on jdbc data source from spark 2.0.2
> -------------------------------------------------------------
>
>                 Key: SPARK-18524
>                 URL: https://issues.apache.org/jira/browse/SPARK-18524
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.0.2
>         Environment: windows 7.
>            Reporter: Som K
>
> Hi,
> I have installed Apache Spark 2.0.2 on Windows 7 and have SAP ASE (Sybase)
> as my JDBC data source. I want to create a DataFrame from a database table.
> However, when I run the command below, I get the following errors:
>  val sqlc = new org.apache.spark.sql.SQLContext(sc)
>   val jdbcDF = sqlc.read
>   .format("jdbc")
>   .option("url", "jdbc:sybase:Tds:9.195.69.145:2048/master")
>   .option("dbtable", "sysobjects")
>   .option("user", "sa")
>   .option("password", "******")
>   .load()
> Errors :
> scala> :paste
> // Entering paste mode (ctrl-D to finish)
>  val sqlc = new org.apache.spark.sql.SQLContext(sc)
>   val jdbcDF = sqlc.read
>   .format("jdbc")
>   .option("url", "jdbc:sybase:Tds:9.195.69.145:2048/master")
>   .option("dbtable", "sysobjects")
>   .option("user", "sa")
>   .option("password", "somnath")
>   .load()
> // Exiting paste mode, now interpreting.
> warning: there was one deprecation warning; re-run with -deprecation for details
> 16/11/21 15:29:05 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have mu
> tiple JAR versions of the same plugin in the classpath. The URL "file:/C:/somnath/Elasticsearch/spark-2.0.2-bin-hadoop2.7/bi
> /../jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at U
> L "file:/C:/somnath/Elasticsearch/spark-2.0.2-bin-hadoop2.7/jars/datanucleus-rdbms-3.2.9.jar."
> 16/11/21 15:29:05 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR v
> rsions of the same plugin in the classpath. The URL "file:/C:/somnath/Elasticsearch/spark-2.0.2-bin-hadoop2.7/jars/datanucle
> s-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/somnat
> /Elasticsearch/spark-2.0.2-bin-hadoop2.7/bin/../jars/datanucleus-core-3.2.10.jar."
> 16/11/21 15:29:05 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multip
> e JAR versions of the same plugin in the classpath. The URL "file:/C:/somnath/Elasticsearch/spark-2.0.2-bin-hadoop2.7/jars/d
> tanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:
> C:/somnath/Elasticsearch/spark-2.0.2-bin-hadoop2.7/bin/../jars/datanucleus-api-jdo-3.2.6.jar."
> java.lang.RuntimeException: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOExce
> tion: (null) entry in command string: null ls -F C:\tmp\hive
>         at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
>         at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
>         at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:609)
>         at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:634)
>         at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
>         at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
>         at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:189)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org