Posted to reviews@spark.apache.org by "yaooqinn (via GitHub)" <gi...@apache.org> on 2023/04/13 04:15:20 UTC

[GitHub] [spark] yaooqinn opened a new pull request, #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

yaooqinn opened a new pull request, #40768:
URL: https://github.com/apache/spark/pull/40768

   
   ### What changes were proposed in this pull request?
   This PR introduces an auxiliary function and supports the standard JDBC API so that SQL keywords can be retrieved dynamically.
   
   ### Why are the changes needed?
   
   1. JDBC API compliance
   2. SQL keywords are helpful for AI-powered BI tools when prompting models to generate queries
   
   ### Does this PR introduce _any_ user-facing change?
   
   Yes, a new `sql_keywords` function is added.
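
   For reference, in the table-valued form the function ends up taking (see the review discussion below), it can be queried like this. This is only a usage sketch, assuming a running `SparkSession`:

   ```scala
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().master("local[1]").appName("sql-keywords-demo").getOrCreate()
   // List a few of the SQL keywords recognized by this Spark build.
   spark.sql("SELECT * FROM sql_keywords() LIMIT 5").show()
   spark.stop()
   ```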
   
   ### How was this patch tested?
   
   new tests




[GitHub] [spark] cloud-fan commented on pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on PR #40768:
URL: https://github.com/apache/spark/pull/40768#issuecomment-1512503237

   @srielau what do you think of this new TVF `sql_keywords`?
   ```
   @ExpressionDescription(
     usage = """_FUNC_() - Get Spark SQL keywords""",
     examples = """
       Examples:
         > SELECT * FROM _FUNC_() LIMIT 2;
          ADD  false
          AFTER  false
     """,
     since = "3.5.0",
     group = "generator_funcs")
   ```




[GitHub] [spark] yaooqinn commented on pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on PR #40768:
URL: https://github.com/apache/spark/pull/40768#issuecomment-1506503716

   cc @cloud-fan @dongjoon-hyun @HyukjinKwon thanks




[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169608107


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   It is not static, but relies on ANSI mode and `spark.sql.ansi.enforceReservedKeywords`
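
   For illustration, the kind of configuration dependence being described; this is only a sketch, and the config lookup below is an assumption rather than the PR's exact code:

   ```scala
   import org.apache.spark.sql.internal.SQLConf

   // Reserved-ness depends on the active session configuration, so the
   // keyword/reserved pairs cannot be computed once at class-loading time
   // and reused everywhere.
   def reservedKeywordsEnforced: Boolean = {
     val conf = SQLConf.get
     conf.ansiEnabled &&
       conf.getConfString("spark.sql.ansi.enforceReservedKeywords", "false").toBoolean
   }
   ```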





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169835053


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   addressed, please take another look





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1166448221


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala:
##########
@@ -742,6 +742,7 @@ object FunctionRegistry {
     expression[SparkVersion]("version"),
     expression[TypeOf]("typeof"),
     expression[EqualNull]("equal_null"),
+    expression[SQLKeywords]("sql_keywords"),

Review Comment:
   I'll add a little, after asking ChatGPT for `the easiest way to list sql keywords from _ dynamically?`:
   
   - Postgres: SELECT word FROM pg_get_keywords() ;
   - MySQL: SHOW Syntax or information_schema;
   - Oracle: V$RESERVED_WORDS system view, SELECT * FROM KEYWORDS;
   





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169548481


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   How about making `SQLKeywordSuite` follow what we do here?
   
   `SqlBaseLexer.VOCABULARY` is a stable public API and is widely used.
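
   For reference, a minimal sketch of enumerating keyword candidates from the generated lexer's vocabulary; the word-like filter below is an illustration, not necessarily the PR's exact logic:

   ```scala
   import org.apache.spark.sql.catalyst.parser.SqlBaseLexer

   val vocab = SqlBaseLexer.VOCABULARY
   // ANTLR literal names look like "'SELECT'"; keep only word-like literals,
   // which correspond to keyword tokens rather than operators or punctuation.
   val keywordCandidates: Seq[String] = (0 to vocab.getMaxTokenType).flatMap { tokenType =>
     Option(vocab.getLiteralName(tokenType))
       .map(_.stripPrefix("'").stripSuffix("'"))
       .filter(_.forall(ch => ch.isLetter || ch == '_'))
   }
   ```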





[GitHub] [spark] srielau commented on pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "srielau (via GitHub)" <gi...@apache.org>.
srielau commented on PR #40768:
URL: https://github.com/apache/spark/pull/40768#issuecomment-1513550564

   > @srielau what do you think of this new TVF `sql_keywords`?
   > 
   > ```
   > @ExpressionDescription(
   >   usage = """_FUNC_() - Get Spark SQL keywords""",
   >   examples = """
   >     Examples:
   >       > SELECT * FROM _FUNC_() LIMIT 2;
   >        ADD  false
   >        AFTER  false
   >   """,
   >   since = "3.5.0",
   >   group = "generator_funcs")
   > ```
   
   I'm not opposed to it. What is the reference to JDBC compliance? 




[GitHub] [spark] yaooqinn commented on pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on PR #40768:
URL: https://github.com/apache/spark/pull/40768#issuecomment-1515674976

   The test failures seem unrelated. Can you take another look? @wangyum @cloud-fan




[GitHub] [spark] cloud-fan commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1165421086


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala:
##########
@@ -742,6 +742,7 @@ object FunctionRegistry {
     expression[SparkVersion]("version"),
     expression[TypeOf]("typeof"),
     expression[EqualNull]("equal_null"),
+    expression[SQLKeywords]("sql_keywords"),

Review Comment:
   do other databases have such a function?





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1166663877


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala:
##########
@@ -742,6 +742,7 @@ object FunctionRegistry {
     expression[SparkVersion]("version"),
     expression[TypeOf]("typeof"),
     expression[EqualNull]("equal_null"),
+    expression[SQLKeywords]("sql_keywords"),

Review Comment:
   Make sense





[GitHub] [spark] yaooqinn commented on pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on PR #40768:
URL: https://github.com/apache/spark/pull/40768#issuecomment-1514185495

   Thanks, @wangyum, for the notification. I didn't notice there was a transient failure.




[GitHub] [spark] yaooqinn commented on pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on PR #40768:
URL: https://github.com/apache/spark/pull/40768#issuecomment-1514014308

   > What is the reference to JDBC compliance?
   
   To implement `java.sql.DatabaseMetaData#getSQLKeywords` on the thriftserver side.
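
   In other words, the client-side call being targeted looks like this. A sketch only: it assumes a JDBC driver whose `DatabaseMetaData` actually implements `getSQLKeywords` (elsewhere in this thread it is noted that the bundled hive-jdbc 2.3.9 client does not), and the connection URL is a placeholder:

   ```scala
   import java.sql.DriverManager

   val conn = DriverManager.getConnection("jdbc:hive2://localhost:10000/default")
   try {
     // Per the JDBC contract, this returns a comma-separated list of the database's
     // SQL keywords that are not also SQL:2003 keywords.
     println(conn.getMetaData.getSQLKeywords)
   } finally {
     conn.close()
   }
   ```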




[GitHub] [spark] yaooqinn commented on pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on PR #40768:
URL: https://github.com/apache/spark/pull/40768#issuecomment-1517166975

   thanks @cloud-fan @wangyum @srielau for the help, merged to master




[GitHub] [spark] cloud-fan commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1166621064


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala:
##########
@@ -742,6 +742,7 @@ object FunctionRegistry {
     expression[SparkVersion]("version"),
     expression[TypeOf]("typeof"),
     expression[EqualNull]("equal_null"),
+    expression[SQLKeywords]("sql_keywords"),

Review Comment:
   Then shall we add a TVF to return keywords? We can also include a boolean column `is_reserved`.
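
   For example, it could then be queried roughly like this; the column names below just follow the suggestion in this comment and are not final:

   ```scala
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().master("local[1]").getOrCreate()
   // Hypothetical shape of a query against the proposed TVF.
   spark.sql("SELECT keyword FROM sql_keywords() WHERE is_reserved").show()
   ```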





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169615489


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   And I don't think it's really necessary to keep the map in memory. For JDBC tools, it may only be used once, when creating the connection, e.g. https://github.com/apache/hive/blob/ba0217ff17501fb849d8999e808d37579db7b4f1/beeline/src/java/org/apache/hive/beeline/SQLCompleter.java#L56





[GitHub] [spark] wangyum commented on pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "wangyum (via GitHub)" <gi...@apache.org>.
wangyum commented on PR #40768:
URL: https://github.com/apache/spark/pull/40768#issuecomment-1514178105

   @yaooqinn Could you re-run the test?




[GitHub] [spark] cloud-fan commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169544670


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   can we follow `SQLKeywordSuite` and read the keywords list from a file?





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169558336


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   I know. Before implementing it as a TVF, it looked like overkill to me too. But now that it does not stay in the optimizer, it seems OK to me. FYI, I have found that [trino](https://github.com/trinodb/trino/blob/master/core/trino-parser/src/main/java/io/trino/sql/ReservedIdentifiers.java#L144) also implements it this way.
   
   
   
   







[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1166299495


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala:
##########
@@ -742,6 +742,7 @@ object FunctionRegistry {
     expression[SparkVersion]("version"),
     expression[TypeOf]("typeof"),
     expression[EqualNull]("equal_null"),
+    expression[SQLKeywords]("sql_keywords"),

Review Comment:
   It's Spark-specific; other systems have other interfaces for end users, for example the table `information_schema.keywords`.





[GitHub] [spark] cloud-fan commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1165421646


##########
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIService.scala:
##########
@@ -103,7 +103,11 @@ private[hive] class SparkSQLCLIService(hiveServer: HiveServer2, sqlContext: SQLC
       case GetInfoType.CLI_SERVER_NAME => new GetInfoValue("Spark SQL")
       case GetInfoType.CLI_DBMS_NAME => new GetInfoValue("Spark SQL")
       case GetInfoType.CLI_DBMS_VER => new GetInfoValue(sqlContext.sparkContext.version)
-      case GetInfoType.CLI_ODBC_KEYWORDS => new GetInfoValue("Unimplemented")
+      case GetInfoType.CLI_ODBC_KEYWORDS =>
+        val keywords = sqlContext.sql("SELECT SQL_KEYWORDS()")

Review Comment:
   does it have to be implemented via a sql function?





[GitHub] [spark] cloud-fan commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1166531868


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala:
##########
@@ -742,6 +742,7 @@ object FunctionRegistry {
     expression[SparkVersion]("version"),
     expression[TypeOf]("typeof"),
     expression[EqualNull]("equal_null"),
+    expression[SQLKeywords]("sql_keywords"),

Review Comment:
   It seems there is no standard way except for the information schema. Let's not add this function.





[GitHub] [spark] yaooqinn commented on pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on PR #40768:
URL: https://github.com/apache/spark/pull/40768#issuecomment-1512343373

   Please take another look, @cloud-fan. Thanks.




[GitHub] [spark] cloud-fan commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169551607


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   My worry is how to determine reserved keywords. Invoking the parser for each keyword looks a bit overkill to me.
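
   For reference, such a per-keyword probe might look roughly like this (an illustration of the idea, not the PR's actual code):

   ```scala
   import scala.util.Try

   import org.apache.spark.sql.catalyst.parser.CatalystSqlParser

   // Treat a keyword as reserved if the parser refuses it as a bare identifier;
   // this costs one parser invocation per keyword.
   def isReserved(keyword: String): Boolean =
     Try(CatalystSqlParser.parseTableIdentifier(keyword)).isFailure
   ```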





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1165022684


##########
sql/hive-thriftserver/src/test/scala/org/apache/spark/sql/hive/thriftserver/ThriftServerWithSparkContextSuite.scala:
##########
@@ -207,6 +207,16 @@ trait ThriftServerWithSparkContextSuite extends SharedThriftServer {
       // scalastyle:on line.size.limit
     }
   }
+
+  test("SPARK-43119: Get SQL Keywords") {
+    withCLIServiceClient() { client =>

Review Comment:
   As we ship hive-jdbc v2.3.9 and hive-service-rpc v3.1.3, we cannot test with JDBC directly due to the lack of an implementation on the client side.





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1166231926


##########
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIService.scala:
##########
@@ -103,7 +103,11 @@ private[hive] class SparkSQLCLIService(hiveServer: HiveServer2, sqlContext: SQLC
       case GetInfoType.CLI_SERVER_NAME => new GetInfoValue("Spark SQL")
       case GetInfoType.CLI_DBMS_NAME => new GetInfoValue("Spark SQL")
       case GetInfoType.CLI_DBMS_VER => new GetInfoValue(sqlContext.sparkContext.version)
-      case GetInfoType.CLI_ODBC_KEYWORDS => new GetInfoValue("Unimplemented")
+      case GetInfoType.CLI_ODBC_KEYWORDS =>
+        val keywords = sqlContext.sql("SELECT SQL_KEYWORDS()")

Review Comment:
   We can also generate it directly.
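
   A rough sketch of what that could look like; `SQLKeywordUtils.keywords` below is assumed to be a `Seq[(String, Boolean)]` of (keyword, isReserved) pairs, as proposed elsewhere in this review:

   ```scala
   import org.apache.spark.sql.catalyst.util.SQLKeywordUtils

   // Build the comma-separated keyword list in-process instead of running a SQL query.
   val odbcKeywordList: String =
     SQLKeywordUtils.keywords.map { case (keyword, _) => keyword }.mkString(",")
   ```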





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and An Auxiliary Function

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1166618008


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala:
##########
@@ -742,6 +742,7 @@ object FunctionRegistry {
     expression[SparkVersion]("version"),
     expression[TypeOf]("typeof"),
     expression[EqualNull]("equal_null"),
+    expression[SQLKeywords]("sql_keywords"),

Review Comment:
   Although the function itself is not ANSI-standard, extending the standard with useful functions complies with ANSI. Let's discuss whether to add it based on its usability.
   
   - As we can see, most systems expose such functionality through user-friendly SQL APIs, either commands or functions, despite already having it in information_schema, which is not always meant for end users.
   - Comparing a new command with a new function, the latter is much more lightweight.
   - The standard JDBC API only supports (by contract) listing keywords, but cannot tell whether a keyword is reserved or non-reserved.
   - The return type of sql_keywords is compliant with information_schema; it just acts as a `view` of information_schema.keywords.
   - Is `sql_keywords` a suitable name or not?





[GitHub] [spark] yaooqinn commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169551472


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   FYI, https://github.com/search?q=Lexer.VOCABULARY&type=code





[GitHub] [spark] cloud-fan commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169655501


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   if it's only used once by the thriftserver, why do we bother to add a TVF? We can't assume that a public SQL function may only be called once.





[GitHub] [spark] cloud-fan commented on a diff in pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "cloud-fan (via GitHub)" <gi...@apache.org>.
cloud-fan commented on code in PR #40768:
URL: https://github.com/apache/spark/pull/40768#discussion_r1169599039


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/SQLKeywordUtils.scala:
##########
@@ -0,0 +1,44 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.util
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.parser.{CatalystSqlParser, SqlBaseLexer}
+
+private[sql] object SQLKeywordUtils {

Review Comment:
   Since this is static, can we load all keywords and figure out whether each one is reserved only once? Then we cache the result as a `Seq[(String, Boolean)]` in a lazy val.
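
   A minimal sketch of that suggestion; `listKeywords()` and `isReserved()` below are hypothetical stand-ins for however the keywords are enumerated and classified:

   ```scala
   package org.apache.spark.sql.catalyst.util

   private[sql] object SQLKeywordUtils {
     // Hypothetical helpers, e.g. backed by SqlBaseLexer.VOCABULARY and a
     // per-keyword parser probe as discussed elsewhere in this thread.
     private def listKeywords(): Seq[String] = ???
     private def isReserved(keyword: String): Boolean = ???

     // Computed once on first access and cached for the lifetime of the JVM.
     lazy val keywords: Seq[(String, Boolean)] =
       listKeywords().map(kw => (kw, isReserved(kw)))
   }
   ```

   (Note that, per the reply in this thread, reserved-ness depends on ANSI-mode configuration, so a JVM-wide cache alone would not be enough.)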





[GitHub] [spark] yaooqinn closed pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF

Posted by "yaooqinn (via GitHub)" <gi...@apache.org>.
yaooqinn closed pull request #40768: [SPARK-43119][SQL] Support Get SQL Keywords Dynamically Thru JDBC API and TVF
URL: https://github.com/apache/spark/pull/40768


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org