Posted to issues@spark.apache.org by "Ewan Leith (JIRA)" <ji...@apache.org> on 2015/09/02 16:44:46 UTC

[jira] [Created] (SPARK-10419) Add SQLServer JdbcDialect support for datetimeoffset types

Ewan Leith created SPARK-10419:
----------------------------------

             Summary: Add SQLServer JdbcDialect support for datetimeoffset types
                 Key: SPARK-10419
                 URL: https://issues.apache.org/jira/browse/SPARK-10419
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.4.1, 1.5.0
            Reporter: Ewan Leith
            Priority: Minor


When running JDBC reads against a Microsoft SQL Server database, if a table contains a datetimeoffset column, the following error is received:

sqlContext.read.jdbc("jdbc:sqlserver://127.0.0.1:1433;DatabaseName=testdb", "sampletable", prop)
java.sql.SQLException: Unsupported type -155
        at org.apache.spark.sql.jdbc.JDBCRDD$.org$apache$spark$sql$jdbc$JDBCRDD$$getCatalystType(JDBCRDD.scala:100)
        at org.apache.spark.sql.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:137)
        at org.apache.spark.sql.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:137)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.sql.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:136)
        at org.apache.spark.sql.jdbc.JDBCRelation.<init>(JDBCRelation.scala:128)
        at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:200)
        at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:130)

Based on the JdbcDialect code for DB2 and the Microsoft SQL Server documentation, we should probably treat datetimeoffset types as Strings:

https://technet.microsoft.com/en-us/library/bb630289%28v=sql.105%29.aspx

We've created a small addition to JdbcDialects.scala to do this conversion; I'll create a pull request for it.
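
For reference, a minimal sketch of the kind of dialect this would involve, modelled on the existing DB2 dialect (the object name and the typeName check are illustrative; the actual pull request may differ):

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types.{DataType, MetadataBuilder, StringType}

case object MsSqlServerDialect extends JdbcDialect {

  // Only claim JDBC URLs that point at SQL Server
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:sqlserver")

  // datetimeoffset arrives as the vendor-specific java.sql type code -155,
  // which JDBCRDD's getCatalystType doesn't recognise. Map it to a Catalyst
  // StringType, and return None for everything else so the default mappings
  // still apply.
  override def getCatalystType(
      sqlType: Int, typeName: String, size: Int, md: MetadataBuilder): Option[DataType] = {
    if (typeName.contains("datetimeoffset")) Some(StringType) else None
  }
}

// Register the dialect so sqlContext.read.jdbc picks it up automatically
JdbcDialects.registerDialect(MsSqlServerDialect)

With a dialect like this registered, the read above should return the datetimeoffset column as a string instead of throwing "Unsupported type -155".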



