Posted to reviews@spark.apache.org by "cloud-fan (via GitHub)" <gi...@apache.org> on 2023/09/05 13:59:15 UTC

[GitHub] [spark] cloud-fan commented on pull request #41683: [SPARK-36680][SQL] Supports Dynamic Table Options for Spark SQL

cloud-fan commented on PR #41683:
URL: https://github.com/apache/spark/pull/41683#issuecomment-1706680230

   Let's spend more time on the API design first, as different people may have different opinions and we should collect as much feedback as possible.
   
   Taking a step back, I think what we need is a SQL API to specify per-scan options, like `spark.read.options(...)`. The SQL API should be general, as it's very likely that people will ask for something similar for `df.write.options` and `spark.readStream.options`.
   
   A TVF can only be used in the FROM clause, so a new SQL syntax may be better here. Inspired by the [pgsql syntax](https://www.postgresql.org/docs/current/sql-createtable.html), we can add a WITH clause to Spark SQL:
   ```
   ... FROM tbl_name WITH (optionA = v1, optionB = v2, ...)
   INSERT INTO tbl_name WITH (optionA = v1, optionB = v2, ...) SELECT ...
   ```
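
   To make the proposal more concrete, here is a sketch of how such queries might look. The table name, column, and option keys below are purely hypothetical for illustration; in practice the valid option keys would depend on the underlying data source, just as with `spark.read.options(...)`:
   ```sql
   -- Hypothetical per-scan options on a read, analogous to
   -- spark.read.options(...) in the DataFrame API.
   SELECT * FROM events WITH (mergeSchema = 'true')
   WHERE event_date > '2023-01-01';

   -- Hypothetical per-write options on an insert, analogous to
   -- df.write.options(...).
   INSERT INTO events WITH (compression = 'zstd')
   SELECT * FROM staging_events;
   ```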
   
   Streaming is orthogonal to this, and this new WITH clause won't conflict with it. For example, we could probably do `... FROM STREAM tbl_name WITH (...)`. That's out of scope for this PR, though, as streaming SQL is a big topic.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org