Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/08/15 06:44:33 UTC

[GitHub] [spark] dilipbiswal edited a comment on issue #25448: [SPARK-28697][SQL] Invalidate Database/Table names starting with underscore

dilipbiswal edited a comment on issue #25448: [SPARK-28697][SQL] Invalidate Database/Table names starting with underscore
URL: https://github.com/apache/spark/pull/25448#issuecomment-521533557
 
 
   @cloud-fan @dongjoon-hyun @HyukjinKwon 
   Was just checking the DB2 definition of an identifier in [link](https://www.ibm.com/support/knowledgecenter/en/SSEPGG_9.7.0/com.ibm.db2.luw.sql.ref.doc/doc/r0000720.html)
   
   It's defined as follows:
   ```
   An ordinary identifier is an uppercase letter followed by zero or more characters, each of which is an uppercase letter, a digit, or the underscore character. Note that lower case letters can be used when specifying an ordinary identifier, but they are converted to uppercase when processed. An ordinary identifier should not be a reserved word.
   ```
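   The DB2 rule quoted above can be sketched as a small check (an illustrative sketch, not DB2 or Spark code; the reserved-word restriction is omitted):
   
   ```python
   import re
   
   # DB2 "ordinary identifier": an uppercase letter followed by zero or more
   # uppercase letters, digits, or underscores. Lowercase input is folded to
   # uppercase first, mirroring DB2's conversion during processing.
   DB2_ORDINARY = re.compile(r"^[A-Z][A-Z0-9_]*$")
   
   def is_db2_ordinary_identifier(name: str) -> bool:
       return bool(DB2_ORDINARY.match(name.upper()))
   
   print(is_db2_ordinary_identifier("my_table"))   # True: letter first
   print(is_db2_ordinary_identifier("_my_table"))  # False: underscore first
   print(is_db2_ordinary_identifier("1table"))     # False: digit first
   ```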
   
   Hive seems to have allowed a digit as the first character as well.
   
   ```
   Identifier
       :
       (Letter | Digit) (Letter | Digit | '_')*
       | {allowQuotedId()}? QuotedIdentifier  /* though at the language level we allow all Identifiers to be QuotedIdentifiers;
                                                 at the API level only columns are allowed to be of this form */
       | '`' RegexComponent+ '`'
       ;
   ```
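   The unquoted branch of the Hive production above amounts to `(Letter | Digit)(Letter | Digit | '_')*`, which can be sketched like this (an illustrative sketch only; the quoted-identifier and regex branches are ignored):
   
   ```python
   import re
   
   # Hive unquoted identifier: a letter or digit may lead, but an underscore
   # may not; underscores are allowed from the second character on.
   HIVE_IDENTIFIER = re.compile(r"^[A-Za-z0-9][A-Za-z0-9_]*$")
   
   def is_hive_identifier(name: str) -> bool:
       return bool(HIVE_IDENTIFIER.match(name))
   
   print(is_hive_identifier("1table"))   # True: digit first is allowed
   print(is_hive_identifier("my_1"))     # True: underscore after first char
   print(is_hive_identifier("_table"))   # False: underscore first is not
   ```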
   
   Not sure why in Spark we allowed "_" as a starting character to begin with? Is it to match some other system?

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org