Posted to issues@spark.apache.org by "Rick Hillegas (JIRA)" <ji...@apache.org> on 2015/09/28 18:38:04 UTC
[jira] [Created] (SPARK-10855) Add a JDBC dialect for Apache Derby
Rick Hillegas created SPARK-10855:
-------------------------------------
Summary: Add a JDBC dialect for Apache Derby
Key: SPARK-10855
URL: https://issues.apache.org/jira/browse/SPARK-10855
Project: Spark
Issue Type: Improvement
Components: SQL
Affects Versions: 1.5.0
Reporter: Rick Hillegas
Priority: Minor
In particular, it would be good if the dialect could handle Derby's user-defined types. The following script fails:
{noformat}
import org.apache.spark.sql._
import org.apache.spark.sql.types._
// the following script was used to create a Derby table
// which has a column of user-defined type:
//
// create type properties external name 'java.util.Properties' language java;
//
// create function systemProperties() returns properties
// language java parameter style java no sql
// external name 'java.lang.System.getProperties';
//
// create table propertiesTable( props properties );
//
// insert into propertiesTable values ( null ), ( systemProperties() );
//
// select * from propertiesTable;
// Spark's JDBC data source cannot handle a table which has a column of type java.sql.Types.JAVA_OBJECT:
//
// java.sql.SQLException: Unsupported type 2000
//
val df = sqlContext.read.format("jdbc").options(
  Map("url" -> "jdbc:derby:/Users/rhillegas/derby/databases/derby1",
      "dbtable" -> "app.propertiesTable")).load()
// shut down the Derby engine
val shutdown = sqlContext.read.format("jdbc").options(
  Map("url" -> "jdbc:derby:;shutdown=true",
      "dbtable" -> "")).load()
exit()
{noformat}
The inability to handle user-defined types probably affects other databases besides Derby.
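A possible starting point: Spark exposes a pluggable {{JdbcDialect}} API ({{org.apache.spark.sql.jdbc}}), so a Derby dialect could be registered that maps JDBC type codes Spark does not recognize to a Catalyst type. The sketch below is only illustrative, not a proposed implementation; in particular, mapping JAVA_OBJECT columns to BinaryType is an assumption (real support for user-defined types would need a deserialization story, which is the open question in this issue).
{noformat}
import java.sql.Types
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types._

// Hypothetical dialect sketch: claim Derby URLs and map the
// otherwise-unsupported JAVA_OBJECT type code (2000) to BinaryType.
case object DerbyDialect extends JdbcDialect {

  // Derby connection URLs start with "jdbc:derby:"
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:derby")

  // Return Some(catalystType) to override Spark's default mapping,
  // or None to fall back to the generic JDBC type resolution.
  override def getCatalystType(
      sqlType: Int,
      typeName: String,
      size: Int,
      md: MetadataBuilder): Option[DataType] =
    if (sqlType == Types.JAVA_OBJECT) Some(BinaryType) else None
}

// Registration makes the dialect apply to all matching JDBC reads.
JdbcDialects.registerDialect(DerbyDialect)
{noformat}
With such a dialect registered, the read of app.propertiesTable above would at least no longer fail at schema resolution with "Unsupported type 2000".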
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)