Posted to user@spark.apache.org by rishmanisation <ri...@gmail.com> on 2018/10/16 23:15:27 UTC

Application crashes when encountering oracle timestamp

I am writing a Spark application to profile an Oracle database. The
application works fine on tables without timestamp columns, but as soon as I
try to profile a table that contains a timestamp column, I run into the
following error:

Exception in thread "main" java.sql.SQLException: Unrecognized SQL type -102
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.org$apache$spark$sql$execution$datasources$jdbc$JdbcUtils$$getCatalystType(JdbcUtils.scala:238)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:307)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$8.apply(JdbcUtils.scala:307)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getSchema(JdbcUtils.scala:306)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:62)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:115)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:52)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:340)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
	at DatabaseProfiling.main(DatabaseProfiling.java:209)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:906)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

It is a Java application. I have tried using both the ojdbc7 and ojdbc14
jars, but neither has helped.

Code:

Dataset<Row> df = spark
                .read()
                .format("jdbc")
                .option("url", dbPath)
                .option("oracle.jdbc.mapDateToTimestamp", "false")
                .option("dbtable", tableName)
                .option("user", username)
                .option("password", password)
                .option("driver", driverClass)
                .load();

df.show(); // crashes right here and doesn't get any further
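For reference, SQL type -102 appears to be Oracle's TIMESTAMP WITH LOCAL TIME ZONE (oracle.jdbc.OracleTypes.TIMESTAMPLTZ), which Spark's JDBC reader does not map to a Catalyst type. One workaround I am considering (an untested sketch; MY_TABLE, CREATED_AT, and the other column names below are placeholders) is to hand Spark a subquery as "dbtable" that casts the offending column to a plain TIMESTAMP:

```java
// Hypothetical workaround sketch: instead of the raw table name, pass Spark a
// parenthesized, aliased subquery that casts the TIMESTAMP WITH LOCAL TIME
// ZONE column to a plain TIMESTAMP. All names here are placeholders.
public class TimestampWorkaround {

    // Build a subquery string that Spark's JDBC source accepts wherever a
    // table name is expected in the "dbtable" option.
    static String castSubquery(String table, String tsColumn, String otherColumns) {
        return "(SELECT " + otherColumns + ", CAST(" + tsColumn
                + " AS TIMESTAMP) AS " + tsColumn + " FROM " + table + ") q";
    }

    public static void main(String[] args) {
        // This string would replace tableName in .option("dbtable", ...)
        System.out.println(castSubquery("MY_TABLE", "CREATED_AT", "ID, NAME"));
        // prints: (SELECT ID, NAME, CAST(CREATED_AT AS TIMESTAMP) AS CREATED_AT FROM MY_TABLE) q
    }
}
```

Whether the cast actually avoids type -102 would depend on the driver; the heavier alternative would be registering a custom JdbcDialect that maps the type itself.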

Has anyone encountered this issue and been able to fix it? Please advise.
Thanks!



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org