Posted to issues@spark.apache.org by "Sergey Bahchissaraitsev (JIRA)" <ji...@apache.org> on 2016/06/16 11:58:05 UTC
[jira] [Updated] (SPARK-15987) PostgreSQL CITEXT type JDBC support
[ https://issues.apache.org/jira/browse/SPARK-15987?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sergey Bahchissaraitsev updated SPARK-15987:
--------------------------------------------
Environment:
Ubuntu 14.04
PostgreSQL 9.3.9
was:
Ubuntu linux 14.04
PostgreSQL version 9.3.9
> PostgreSQL CITEXT type JDBC support
> -----------------------------------
>
> Key: SPARK-15987
> URL: https://issues.apache.org/jira/browse/SPARK-15987
> Project: Spark
> Issue Type: New Feature
> Components: SQL
> Affects Versions: 1.6.1
> Environment: Ubuntu 14.04
> PostgreSQL 9.3.9
> Reporter: Sergey Bahchissaraitsev
> Priority: Critical
> Labels: dataframe, jdbc, postgresql
>
> When trying to use a Spark DataFrame on a table with a CITEXT column, you get the following error:
> Exception in thread "main" java.sql.SQLException: Unsupported type 1111
> at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.org$apache$spark$sql$execution$datasources$jdbc$JDBCRDD$$getCatalystType(JDBCRDD.scala:102)
> at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:141)
> at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:141)
> at scala.Option.getOrElse(Option.scala:120)
> at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:140)
> at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
> at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:222)
> at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:208)
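A possible workaround (not part of the original report, and untested against this exact setup) is to register a custom JdbcDialect that maps CITEXT, which the PostgreSQL JDBC driver reports as java.sql.Types.OTHER (1111), to Catalyst's StringType before reading the table. This sketch assumes Spark 1.6.x on the classpath; the object name CitextDialect is made up for illustration:

```scala
import java.sql.Types
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types.{DataType, MetadataBuilder, StringType}

// Hypothetical dialect: treat PostgreSQL CITEXT columns as plain strings.
object CitextDialect extends JdbcDialect {
  // Only apply this dialect to PostgreSQL connections.
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:postgresql")

  // CITEXT comes back as Types.OTHER (1111) with typeName "citext";
  // map it to StringType so JDBCRDD.resolveTable no longer fails.
  override def getCatalystType(
      sqlType: Int,
      typeName: String,
      size: Int,
      md: MetadataBuilder): Option[DataType] =
    if (sqlType == Types.OTHER && typeName == "citext") Some(StringType)
    else None
}

// Register the dialect before calling sqlContext.read.jdbc(...):
JdbcDialects.registerDialect(CitextDialect)
```

Dialects registered this way take precedence over the built-in ones, so only the CITEXT mapping is overridden; all other PostgreSQL types still resolve through the default dialect.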
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)