Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2016/10/10 06:09:20 UTC

[jira] [Commented] (SPARK-9506) DataFrames Postgresql JDBC unable to support most of the Postgresql's Data Type

    [ https://issues.apache.org/jira/browse/SPARK-9506?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15561405#comment-15561405 ] 

Xiao Li commented on SPARK-9506:
--------------------------------

Please check the latest version; more data types are now supported. If you still hit a data type issue, please open a new JIRA listing the unsupported types.
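For reference, on Spark 2.x the DataFrameReader API is the supported way to load JDBC tables. A minimal sketch, where the URL, credentials, and table name are placeholders rather than values from this report:

    // Minimal JDBC read with the Spark 2.x DataFrameReader API.
    // "spark" is an existing SparkSession; URL, credentials, and table
    // name below are placeholders, not values from this report.
    val jdbcDF = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://localhost:5432/sample")
      .option("driver", "org.postgresql.Driver")
      .option("dbtable", "agent")
      .option("user", "postgres")
      .option("password", "xxx")
      .load()

    jdbcDF.printSchema()  // inspect how each PostgreSQL type was mapped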

> DataFrames Postgresql JDBC unable to support most of the Postgresql's Data Type
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-9506
>                 URL: https://issues.apache.org/jira/browse/SPARK-9506
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>            Reporter: Pangjiu
>         Attachments: code.PNG, log.PNG, tables_structures.PNG
>
>
> Hi All,
> I have an issue using the PostgreSQL JDBC driver with sqlContext for PostgreSQL data types such as abstime, character varying[], int2vector, and json.
> The exceptions are "Unsupported type 2003" and "Unsupported type 1111".
> Below is the code:
> Class.forName("org.postgresql.Driver").newInstance()
> val url = "jdbc:postgresql://localhost:5432/sample?user=postgres&password=xxx"
> val driver = "org.postgresql.Driver"
> // PostgreSQL quotes identifiers with double quotes, not MySQL-style backticks
> val output = sqlContext.load("jdbc", Map(
>   "url"     -> url,
>   "driver"  -> driver,
>   "dbtable" -> "(SELECT \"ID\", \"NAME\" FROM agent) AS tableA"
> ))
> I hope sqlContext can eventually support all of these data types.
> Thanks.
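
A common workaround for the "Unsupported type 2003"/"Unsupported type 1111" errors above is to cast the problematic columns to text inside the pushed-down subquery, so the driver reports a JDBC type Spark can map. A minimal sketch against the reporter's code; the "details" json column is hypothetical, not taken from the report:

    // Cast PostgreSQL-specific types (json, abstime, int2vector, arrays, ...)
    // to text in the subquery; Spark then reads those columns as strings.
    // "details" is a hypothetical json column, not from the original report.
    val casted = sqlContext.load("jdbc", Map(
      "url"     -> url,
      "driver"  -> driver,
      "dbtable" -> "(SELECT \"ID\", \"NAME\", details::text AS details FROM agent) AS tableA"
    ))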



