Posted to issues@spark.apache.org by "Rene Treffer (JIRA)" <ji...@apache.org> on 2015/04/14 12:02:12 UTC

[jira] [Updated] (SPARK-6888) Make DriverQuirks editable

     [ https://issues.apache.org/jira/browse/SPARK-6888?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Rene Treffer updated SPARK-6888:
--------------------------------
    Flags: Patch

> Make DriverQuirks editable
> --------------------------
>
>                 Key: SPARK-6888
>                 URL: https://issues.apache.org/jira/browse/SPARK-6888
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Rene Treffer
>            Priority: Minor
>
> JDBC type conversion is currently handled by Spark with the help of DriverQuirks (org.apache.spark.sql.jdbc.DriverQuirks).
> However, some cases can't be resolved, e.g. MySQL "BIGINT UNSIGNED". (Other UNSIGNED conversions won't work either, but those could be resolved automatically by using the next larger type, as sketched below.)
> An invalid type conversion (e.g. loading an unsigned BIGINT with the highest bit set as a long value) causes the JDBC driver to throw an exception.
> The target type is determined automatically and is bound to the resulting DataFrame, where it is immutable.
> Alternative solutions:
> - Subqueries, which produce extra load on the server
> - SQLContext / jdbc methods with schema support
> - Making it possible to change the schema of DataFrames
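
For illustration, here is a minimal sketch of the kind of mapping an editable quirk could perform, following the "next larger type" idea above. The EditableQuirk trait and the type-name handling are assumptions made for this sketch, not the actual org.apache.spark.sql.jdbc.DriverQuirks API and not the attached patch; the point is simply that UNSIGNED columns get widened so values with the highest bit set still fit.

    import java.sql.Types
    import org.apache.spark.sql.types._

    // Hypothetical hook, NOT the existing (private) DriverQuirks API: a
    // user-supplied mapping from a JDBC column description to a Catalyst
    // type. Returning None falls back to Spark's built-in conversion.
    trait EditableQuirk {
      def getCatalystType(sqlType: Int, typeName: String, size: Int): Option[DataType]
    }

    // Widen MySQL UNSIGNED columns to the next larger type so values with
    // the highest bit set do not overflow the corresponding signed type.
    object MySQLUnsignedQuirk extends EditableQuirk {
      override def getCatalystType(sqlType: Int, typeName: String, size: Int): Option[DataType] = {
        if (!typeName.toUpperCase.contains("UNSIGNED")) {
          None                                              // signed column: keep the default mapping
        } else sqlType match {
          case Types.BIGINT   => Some(DecimalType(20, 0))   // unsigned 64-bit needs up to 20 decimal digits
          case Types.INTEGER  => Some(LongType)
          case Types.SMALLINT => Some(IntegerType)
          case Types.TINYINT  => Some(ShortType)
          case _              => None
        }
      }
    }

A registry-style hook along these lines (compare JdbcDialects.registerDialect in later Spark releases) would let users fix such mappings without patching Spark.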



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
