Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:13:56 UTC

[jira] [Resolved] (SPARK-22740) [SQL][JDBC] Reserved SQL words are not escaped for table names

     [ https://issues.apache.org/jira/browse/SPARK-22740?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-22740.
----------------------------------
    Resolution: Incomplete

> [SQL][JDBC] Reserved SQL words are not escaped for table names
> --------------------------------------------------------------
>
>                 Key: SPARK-22740
>                 URL: https://issues.apache.org/jira/browse/SPARK-22740
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Nicolas Lalevée
>            Priority: Major
>              Labels: bulk-closed
>
> I have a MySQL database that contains a table named 'group'.
> This Java line then fails:
> {code:java}
> sparkSession.read().jdbc("jdbc:mysql://localhost/scoop", "group", new Properties()).createTempView("mysql_group");
> {code}
> with this exception:
> {noformat}
> com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'group WHERE 1=0' at line 1
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
> 	at com.mysql.jdbc.Util.getInstance(Util.java:386)
> 	at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1053)
> 	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4074)
> 	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4006)
> 	at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2468)
> 	at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2629)
> 	at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2719)
> 	at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2155)
> 	at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2318)
> 	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:62)
> 	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:113)
> 	at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:47)
> 	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:306)
> 	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
> 	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
> 	at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:193)
> {noformat}
> In MySQL, the table name should be escaped with backticks: {{SELECT * FROM `group` WHERE 1=0}}
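> Until this is fixed, a possible workaround (assuming Spark inlines the given table string verbatim into the generated SQL, which the error above suggests) is to pass an already-quoted name:
> {code:java}
> // Workaround sketch: quote the reserved name ourselves, so the probe
> // statement becomes SELECT * FROM `group` WHERE 1=0, which MySQL accepts.
> sparkSession.read()
>     .jdbc("jdbc:mysql://localhost/scoop", "`group`", new Properties())
>     .createTempView("mysql_group");
> {code}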
> I tried to work on a fix, but the "table name" in the Spark code may be the qualified name of the table, including the database name, so the fix is not trivial.
> Instead of generating {{SELECT * FROM `mydb.group` WHERE 1=0}}, Spark should generate {{SELECT * FROM `mydb`.`group` WHERE 1=0}}.
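> A minimal sketch of such per-part quoting (the helper name is hypothetical, not existing Spark API, and it assumes the individual name parts contain no dots):
> {code:java}
> // Hypothetical helper: backtick-quote each dot-separated part of a
> // qualified MySQL name, doubling any backtick embedded in a part.
> static String quoteQualified(String qualifiedName) {
>     StringBuilder quoted = new StringBuilder();
>     for (String part : qualifiedName.split("\\.")) {
>         if (quoted.length() > 0) {
>             quoted.append('.');
>         }
>         quoted.append('`').append(part.replace("`", "``")).append('`');
>     }
>     return quoted.toString();
> }
> // quoteQualified("mydb.group") returns "`mydb`.`group`"
> {code}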
> Here are the functions I have found that might be affected by a reserved word used as a table name (a sketch of a dialect-aware variant follows the list):
> - JdbcDialect#getTableExistsQuery
> - JdbcDialect#getSchemaQuery
> - MySQLDialect#getTableExistsQuery
> - PostgresDialect#getTableExistsQuery
> - JdbcUtils#dropTable
> - JdbcUtils#truncateTable
> - JdbcUtils#getInsertStatement
> - JdbcUtils#createTable
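> A fix in these helpers could route every name through dialect-aware quoting; Spark's JdbcDialect already defines quoteIdentifier (MySQLDialect overrides it to use backticks), so applying it to each part of the qualified name may be enough. For illustration only, not the actual Spark code:
> {code:java}
> // Hypothetical dialect-aware variant of getTableExistsQuery, reusing the
> // quoteQualified sketch above instead of concatenating the raw table name.
> static String getTableExistsQuery(String qualifiedTable) {
>     return "SELECT * FROM " + quoteQualified(qualifiedTable) + " WHERE 1=0";
> }
> {code}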



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org