Posted to issues@spark.apache.org by "Jarek Jarcec Cecho (Jira)" <ji...@apache.org> on 2020/01/02 03:36:00 UTC
[jira] [Commented] (SPARK-26494) [Spark SQL] Reading an Oracle TIMESTAMP(6) WITH LOCAL TIME ZONE column with Spark fails: type can't be found
[ https://issues.apache.org/jira/browse/SPARK-26494?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17006564#comment-17006564 ]
Jarek Jarcec Cecho commented on SPARK-26494:
--------------------------------------------
I believe I can explain this JIRA a bit further, as I've recently hit it myself. If one reads from Oracle over JDBC and the source table has a column of type {{TIMESTAMP WITH LOCAL TIME ZONE}}, Spark fails with an exception:
{code}
Unrecognized SQL type -102
java.sql.SQLException: Unrecognized SQL type -102
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$getCatalystType(JdbcUtils.scala:246)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:316)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:316)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getSchema(JdbcUtils.scala:315)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:63)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:210)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.apply(JDBCRelation.scala:225)
at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:312)
{code}
The use case is that I'm a user who does not own the source table (and thus has no control over its schema), yet I need to load it into the Spark environment. I wonder what your thoughts are on why this type makes no sense in Spark, [~srowen]?
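For anyone hitting the same error: -102 is Oracle's vendor-specific JDBC type code for {{TIMESTAMP WITH LOCAL TIME ZONE}} ({{oracle.jdbc.OracleTypes.TIMESTAMPLTZ}}), which is not part of {{java.sql.Types}}, so Spark's default mapping gives up on it. The usual workaround is to register a custom {{JdbcDialect}} whose {{getCatalystType}} intercepts the vendor code before the default mapping runs. The sketch below is self-contained and only mimics the dispatch logic; the object and method names are simplified stand-ins, not Spark's actual API:

```scala
// Sketch of the type-dispatch problem: Spark's JdbcUtils maps java.sql.Types
// codes to Catalyst types and throws for anything it does not recognize.
object TypeMappingSketch {
  // Vendor-specific codes from Oracle's JDBC driver (oracle.jdbc.OracleTypes).
  val TIMESTAMPLTZ = -102 // TIMESTAMP WITH LOCAL TIME ZONE
  val TIMESTAMPTZ  = -101 // TIMESTAMP WITH TIME ZONE

  // Simplified stand-in for JdbcUtils.getCatalystType: knows only the
  // standard java.sql.Types codes and throws for anything else.
  def defaultMapping(sqlType: Int): String = sqlType match {
    case java.sql.Types.TIMESTAMP => "TimestampType"
    case java.sql.Types.VARCHAR   => "StringType"
    case other => throw new java.sql.SQLException(s"Unrecognized SQL type $other")
  }

  // Dialect-style override consulted first, mirroring how a custom
  // JdbcDialect.getCatalystType can intercept vendor codes.
  def oracleOverride(sqlType: Int): Option[String] = sqlType match {
    case TIMESTAMPLTZ | TIMESTAMPTZ => Some("TimestampType")
    case _ => None
  }

  // The override wins when it has an answer; otherwise fall through.
  def resolve(sqlType: Int): String =
    oracleOverride(sqlType).getOrElse(defaultMapping(sqlType))
}
```

In a real Spark job the same idea is expressed by subclassing {{org.apache.spark.sql.jdbc.JdbcDialect}}, overriding {{getCatalystType}} to return {{Some(TimestampType)}} when {{sqlType == -102}}, and calling {{JdbcDialects.registerDialect}} before the JDBC read. Alternatively, casting the column inside the {{dbtable}} query (e.g. {{CAST(col AS TIMESTAMP)}}) keeps the unknown type out of the resolved schema entirely.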
> [Spark SQL] Reading an Oracle TIMESTAMP(6) WITH LOCAL TIME ZONE column with Spark fails: type can't be found
> -------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-26494
> URL: https://issues.apache.org/jira/browse/SPARK-26494
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: kun'qin
> Priority: Minor
>
> Reading an Oracle TIMESTAMP(6) WITH LOCAL TIME ZONE column with Spark fails because the type can't be found.
> When the data type is TIMESTAMP(6) WITH LOCAL TIME ZONE,
> the sqlType value seen by the getCatalystType function in the JdbcUtils class is -102.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org