Posted to reviews@spark.apache.org by "fbiville (via GitHub)" <gi...@apache.org> on 2023/07/07 13:00:38 UTC

[GitHub] [spark] fbiville commented on pull request #41855: [SPARK-44262][SQL] Add `dropTable` and `getInsertStatement` to JdbcDialect

fbiville commented on PR #41855:
URL: https://github.com/apache/spark/pull/41855#issuecomment-1625382506

   > @fbiville Hi, can you tell us more about how Neo4j uses `drop table` or `insert statement` in JDBC? Thanks.
   
   Hello, I experimented with https://github.com/neo4j-contrib/neo4j-jdbc and a third-party, closed-source Spark-based SDK.
   
   The Neo4j JDBC connector supports only Cypher queries, not SQL queries.
   
   After a conversation with the third-party maintainer, I learnt that they were calling `org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils#getSchemaOption`, which fails right away since there is no JdbcDialect registered for Neo4j.
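
   For context, registering a dialect would look roughly like the sketch below. `Neo4jDialect` and the Cypher it returns are hypothetical (not the SDK's actual code); `canHandle` and `getSchemaQuery` are existing `JdbcDialect` extension points:

   ```scala
   // Minimal sketch, not a tested implementation. Neo4jDialect and the Cypher
   // query are hypothetical; canHandle and getSchemaQuery are the existing
   // JdbcDialect hooks.
   import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}

   object Neo4jDialect extends JdbcDialect {
     // Match the neo4j-jdbc connection URL prefix.
     override def canHandle(url: String): Boolean =
       url.startsWith("jdbc:neo4j:")

     // getSchemaOption runs the query returned here to infer the schema; the
     // default "SELECT * FROM <table> WHERE 1=0" is not valid Cypher, so a
     // Neo4j dialect would have to return something the driver can parse.
     override def getSchemaQuery(table: String): String =
       s"MATCH (n:`$table`) RETURN n LIMIT 0"
   }

   // Make the dialect visible before going through the JDBC data source.
   JdbcDialects.registerDialect(Neo4jDialect)
   ```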
   
   However, I quickly realized that supplying a Neo4j dialect would not be enough: some SQL statements are hardcoded directly in `JdbcUtils`, and those would not work against Neo4j.
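
   If `JdbcUtils` instead delegated those statements to the dialect, as this PR proposes with `dropTable` and `getInsertStatement`, a Neo4j dialect could return Cypher equivalents. A rough sketch of what such overrides might return (the method names come from the PR title, the signatures here are my assumption, and the Cypher is illustrative rather than a tested Neo4j implementation):

   ```scala
   // Rough sketch only: signatures are assumed from the PR title and the
   // Cypher strings are illustrative.
   import org.apache.spark.sql.types.StructType

   object Neo4jStatements {
     // Replacement for the hardcoded "DROP TABLE $table" in JdbcUtils.dropTable.
     def dropTable(table: String): String =
       s"MATCH (n:`$table`) DETACH DELETE n"

     // Replacement for the hardcoded "INSERT INTO $table (...) VALUES (?, ...)"
     // built by JdbcUtils.getInsertStatement; '?' placeholders are kept so the
     // statement can still be bound through a PreparedStatement.
     def getInsertStatement(table: String, schema: StructType): String = {
       val props = schema.fieldNames.map(f => s"`$f`: ?").mkString(", ")
       s"CREATE (n:`$table` {$props})"
     }
   }
   ```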
   
   Does that help?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

