Posted to user@ignite.apache.org by Tracyl <tl...@bloomberg.net> on 2016/10/06 17:01:17 UTC

Could IgniteCache be accessed through the Spark JDBC data source API?

Hey team,

I was able to access IgniteCache with a plain JDBC client tool. Is it possible to
connect to Ignite through the Spark JDBC data source API? Below are the code and
the exception I got. It seems the connection itself succeeds, but there are
data-type mapping issues. Do I need to define a schema on the server
side? Any suggestions would be really helpful. Thanks in advance!
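For reference, this is roughly how I query the cache with the plain JDBC driver, which works fine (the "name" column is just a simplified stand-in for my actual fields):

```scala
import java.sql.DriverManager

// Register the Ignite JDBC driver (same class as in the Spark config below)
Class.forName("org.apache.ignite.IgniteJdbcDriver")

// Connect to the cache and run a simple query over the Person table
val conn = DriverManager.getConnection("jdbc:ignite://ip:11211/TestCache")
val rs = conn.createStatement().executeQuery("SELECT name FROM Person")
while (rs.next()) println(rs.getString("name"))
conn.close()
```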

Sample Code:
val props = Map(
  "url" -> "jdbc:ignite://ip:11211/TestCache",
  "driver" -> "org.apache.ignite.IgniteJdbcDriver",
  "dbtable" -> "Person"
)
val personDF = ctx.read.format("jdbc").options(props).load()

Exception:
Exception in thread "main" java.sql.SQLException: Unsupported type 1111
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.org$apache$spark$sql$execution$datasources$jdbc$JDBCRDD$$getCatalystType(JDBCRDD.scala:103)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:140)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:140)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:139)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
        at org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
        at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
        at ignite.spark.DataFrameLoader$.personTest(DataFrameLoader.scala:131)
        at ignite.spark.DataFrameLoader.personTest(DataFrameLoader.scala)
        at jdbc.PersonLoader.executeTransaction(PersonLoader.java:175)
        at jdbc.PersonLoader.main(PersonLoader.java:115)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
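In case it matters: one workaround I was considering is registering a custom Spark JdbcDialect that explicitly maps the unsupported JDBC type 1111 (which is java.sql.Types.OTHER) to a Catalyst type. Here is a rough sketch — the OTHER-to-StringType mapping is just my guess at a safe default, not something I've confirmed is correct for Ignite:

```scala
import java.sql.Types

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types.{DataType, MetadataBuilder, StringType}

// Hypothetical dialect: tells Spark how to map Ignite's JDBC types to Catalyst types
object IgniteDialect extends JdbcDialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:ignite:")

  override def getCatalystType(sqlType: Int, typeName: String, size: Int,
      md: MetadataBuilder): Option[DataType] =
    // Type 1111 is java.sql.Types.OTHER, which Spark's default mapping rejects
    if (sqlType == Types.OTHER) Some(StringType) else None
}

// Must be registered before ctx.read.format("jdbc").options(props).load() runs
JdbcDialects.registerDialect(IgniteDialect)
```

No idea if that is the right long-term fix, but it should at least get past the resolveTable failure above.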



--
View this message in context: http://apache-ignite-users.70518.x6.nabble.com/Could-IgniteCache-be-accessed-thr-Spark-JDBC-data-source-api-tp8122.html
Sent from the Apache Ignite Users mailing list archive at Nabble.com.