Posted to issues@spark.apache.org by "Xiansen Chen (Jira)" <ji...@apache.org> on 2021/08/14 14:48:00 UTC
[jira] [Created] (SPARK-36513) Using openGauss (like postgres) as Spark Datasource, Spark throws cast errors
Xiansen Chen created SPARK-36513:
------------------------------------
Summary: Using openGauss (like postgres) as Spark Datasource, Spark throws cast errors
Key: SPARK-36513
URL: https://issues.apache.org/jira/browse/SPARK-36513
Project: Spark
Issue Type: Bug
Components: Java API, SQL
Affects Versions: 3.3.0
Environment: Spark 3.3.0
Scala 2.12
Reporter: Xiansen Chen
Attachments: image-2021-08-14-22-47-56-474.png
Hi, I'm trying to use openGauss (which is PostgreSQL-compatible) as a Spark data source and am implementing some of the data source interfaces. But when I try to read a column of VARCHAR type, I get an error like this:
!image-2021-08-14-22-33-32-787.png!
When I use getInt() or getDouble() for other column types, the code works fine. I would like to know if there is any solution. Thank you!
Here is some of the code:
{code:java}
import java.sql.DriverManager

import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.connector.read.PartitionReader
import org.apache.spark.sql.types.{StringType, StructType}

object OpenGaussTable {
  /* Table products */
  val schema: StructType = new StructType().add("name", StringType)
}
...
class OpenGaussPartitionReader(connectionProperties: ConnectionProperties) extends PartitionReader[InternalRow] {
  private val connection = DriverManager.getConnection(
    connectionProperties.url, connectionProperties.user, connectionProperties.password
  )
  private val statement = connection.createStatement()
  private val resultSet = statement.executeQuery(s"select * from ${connectionProperties.tableName}")

  override def next(): Boolean = resultSet.next()
  override def get(): InternalRow = InternalRow(resultSet.getString(1))
  override def close(): Unit = connection.close()
}{code}
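I suspect the cause may be that, for a StringType column, Catalyst's InternalRow is expected to hold org.apache.spark.unsafe.types.UTF8String rather than java.lang.String, while JDBC's getString() returns the latter, so Spark later fails when it casts the field. A sketch of the adjusted get(), assuming that is the mismatch:
{code:java}
import org.apache.spark.unsafe.types.UTF8String

// StringType fields inside an InternalRow must be UTF8String,
// so the java.lang.String from JDBC is converted before being stored.
override def get(): InternalRow =
  InternalRow(UTF8String.fromString(resultSet.getString(1)))
{code}
Primitive getters like getInt() and getDouble() would not hit this, since Int and Double are stored in InternalRow as-is, which would explain why only the VARCHAR column fails.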
The complete code:
[https://pastebin.ubuntu.com/p/vCBssjhc2r/]
[https://pastebin.ubuntu.com/p/jCcDbb79NG/]
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org