Posted to user@spark.apache.org by "herman.yu@teeupdata.com" <he...@teeupdata.com> on 2016/11/15 17:29:30 UTC

Spark SQL and JDBC compatibility

Hi Everyone,

While reading data into Spark 2.0.0 DataFrames through the Calcite JDBC driver, depending on how the Calcite JDBC connection properties are set up (the lex property), the DataFrame query sometimes returns an empty result set and sometimes errors out with an exception: java.sql.SQLException: Error while preparing statement…
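
For reference, the read looks roughly like this (the model path, table name, and lex value below are placeholders, not my actual setup):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("calcite-jdbc").getOrCreate()

    // Read through the Calcite JDBC driver; connection details are illustrative.
    val df = spark.read
      .format("jdbc")
      .option("driver", "org.apache.calcite.jdbc.Driver")
      .option("url", "jdbc:calcite:model=/path/to/model.json;lex=MYSQL")
      .option("dbtable", "my_table")
      .load()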

One scenario I tracked down: with df.filter($"col" === "value"), the SQL that Spark generates contains WHERE (col IS NOT NULL) AND (col = 'value'), which fails the Calcite SQL parser. If (col IS NOT NULL) is removed, the query goes through fine.
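
The pushed-down predicates can be inspected from the physical plan like this (column name and value are illustrative; df is the DataFrame read above):

    import spark.implicits._  // for the $"col" syntax

    val filtered = df.filter($"col" === "value")

    // The plan lists the filters pushed to the JDBC source; the equality
    // filter comes with an implicit null check, roughly:
    //   PushedFilters: [IsNotNull(col), EqualTo(col,value)]
    filtered.explain(true)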

So, has anybody encountered similar SQL compatibility issues, especially with Calcite? Is it possible to make some configuration changes, on the Spark and/or Calcite side, to get them working together?
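
One workaround I am considering (untested on my side) is to bypass Spark's filter pushdown entirely by passing the whole query as a subquery in the dbtable option, so Calcite only sees SQL written by hand:

    // Whether Calcite accepts a parenthesized subquery here depends on its
    // dialect/lex settings; table and column names are placeholders.
    val preFiltered = spark.read
      .format("jdbc")
      .option("driver", "org.apache.calcite.jdbc.Driver")
      .option("url", "jdbc:calcite:model=/path/to/model.json;lex=MYSQL")
      .option("dbtable", "(SELECT * FROM my_table WHERE col = 'value') AS t")
      .load()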

Thanks
Herman.