Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2016/11/14 09:18:58 UTC
[jira] [Updated] (SPARK-18419) [SPARK-18419][SQL] Fix `JDBCOptions.asConnectionProperties` to be case-insensitive
[ https://issues.apache.org/jira/browse/SPARK-18419?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-18419:
----------------------------------
Summary: [SPARK-18419][SQL] Fix `JDBCOptions.asConnectionProperties` to be case-insensitive (was: Fix JDBCOptions and DataSource to be case-insensitive for JDBCOptions keys)
> [SPARK-18419][SQL] Fix `JDBCOptions.asConnectionProperties` to be case-insensitive
> ----------------------------------------------------------------------------------
>
> Key: SPARK-18419
> URL: https://issues.apache.org/jira/browse/SPARK-18419
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Reporter: Dongjoon Hyun
> Priority: Minor
>
> This issue aims to fix the following two problems.
> **A. Fix `JDBCOptions.asConnectionProperties` to be case-insensitive.**
> `JDBCOptions.asConnectionProperties` is designed to filter out Spark-specific JDBC options, but it does not handle `CaseInsensitiveMap` correctly. For the example below, it incorrectly returns `Map("numpartitions" -> "10")`, and the assertion fails.
> {code}
> val options = new JDBCOptions(new CaseInsensitiveMap(Map(
>   "url" -> "jdbc:mysql://localhost:3306/temp",
>   "dbtable" -> "t1",
>   "numPartitions" -> "10")))
> assert(options.asConnectionProperties.isEmpty)
> {code}
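The intended behavior can be sketched without Spark: filter out Spark-only keys by comparing them case-insensitively, so a mixed-case key such as `numPartitions` is still recognized. This is a minimal stand-alone sketch, not Spark's actual implementation; the object name and the reduced key set are illustrative assumptions.

```scala
// Hypothetical sketch of case-insensitive option filtering.
// `sparkOnlyKeys` is a reduced, illustrative subset of the keys
// Spark keeps for itself rather than passing to the JDBC driver.
object ConnectionPropertiesSketch {
  val sparkOnlyKeys: Set[String] = Set("url", "dbtable", "numpartitions")

  // Lower-case each key before membership testing, so "numPartitions"
  // matches the Spark-only key "numpartitions" and gets filtered out.
  def asConnectionProperties(options: Map[String, String]): Map[String, String] =
    options.filter { case (k, _) => !sparkOnlyKeys.contains(k.toLowerCase) }

  def main(args: Array[String]): Unit = {
    val options = Map(
      "url" -> "jdbc:mysql://localhost:3306/temp",
      "dbtable" -> "t1",
      "numPartitions" -> "10")
    // All three keys are Spark-only, so nothing should remain.
    println(asConnectionProperties(options).isEmpty) // true
  }
}
```

With the buggy case-sensitive comparison, `numPartitions` would survive the filter and leak to the JDBC driver as a connection property, which is exactly the failing assertion above.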
> **B. Fix `DataSource` to use `CaseInsensitiveMap` consistently.**
> `DataSource` uses `CaseInsensitiveMap` in only part of its code path. For example, the following fails to find `url` because the option is supplied as `URL`.
> {code}
> val df = spark.createDataFrame(sparkContext.parallelize(arr2x2), schema2)
> df.write.format("jdbc")
>   .option("URL", url1)
>   .option("dbtable", "TEST.SAVETEST")
>   .options(properties.asScala)
>   .save()
> {code}
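The fix for this second problem amounts to routing all option lookups through a case-insensitive map. The sketch below is a simplified stand-in for Spark's `CaseInsensitiveMap`, not its real implementation; the class shape and method names here are assumptions for illustration.

```scala
// Hypothetical minimal stand-in for a case-insensitive option map:
// keys are normalized to lower case once, and every lookup lower-cases
// the requested key, so .get("url") finds an option stored as "URL".
class CaseInsensitiveOptions(base: Map[String, String]) {
  private val normalized: Map[String, String] =
    base.map { case (k, v) => (k.toLowerCase, v) }

  def get(key: String): Option[String] = normalized.get(key.toLowerCase)
}

object DataSourceLookupSketch {
  def main(args: Array[String]): Unit = {
    // The user supplied the key in mixed case, as in the report above.
    val userOptions = Map("URL" -> "jdbc:h2:mem:testdb", "dbtable" -> "TEST.SAVETEST")
    val ci = new CaseInsensitiveOptions(userOptions)
    // A plain Map lookup for "url" would return None; this succeeds.
    println(ci.get("url").isDefined) // true
  }
}
```

If `DataSource` consulted such a map consistently, the `save()` call above would resolve `url` regardless of the casing the caller used.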
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org