Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/09/22 12:34:00 UTC

[jira] [Commented] (SPARK-36680) Supports Dynamic Table Options for Spark SQL

    [ https://issues.apache.org/jira/browse/SPARK-36680?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17418569#comment-17418569 ] 

Apache Spark commented on SPARK-36680:
--------------------------------------

User 'wang-zhun' has created a pull request for this issue:
https://github.com/apache/spark/pull/34072

> Supports Dynamic Table Options for Spark SQL
> --------------------------------------------
>
>                 Key: SPARK-36680
>                 URL: https://issues.apache.org/jira/browse/SPARK-36680
>             Project: Spark
>          Issue Type: Wish
>          Components: SQL
>    Affects Versions: 3.1.2
>            Reporter: wang-zhun
>            Priority: Major
>
> Currently, a DataFrame API user can set dynamic options through the _DataFrameReader#option_ method, but Spark SQL users have no equivalent. The options flow through the read path as follows:
> {code:java}
> DataFrameReader / AstBuilder -> UnresolvedRelation#options -> DataSourceV2Relation#options -> SupportsRead#newScanBuilder(options)
> {code}
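>  To illustrate the last step of that chain, here is a minimal sketch of a DataSource V2 table receiving the per-scan options in _SupportsRead#newScanBuilder_. _SupportsRead_ and _CaseInsensitiveStringMap_ are real Spark 3.x connector APIs, but the class name, the "fetchsize" key, and the default value below are made up for illustration:
> {code:scala}
> import java.util
>
> import org.apache.spark.sql.connector.catalog.{SupportsRead, TableCapability}
> import org.apache.spark.sql.connector.read.ScanBuilder
> import org.apache.spark.sql.types.StructType
> import org.apache.spark.sql.util.CaseInsensitiveStringMap
>
> // Hypothetical V2 table, not taken from Spark: options set via
> // DataFrameReader#option arrive as the `options` argument of newScanBuilder.
> class ExampleTable extends SupportsRead {
>   override def name(): String = "example"
>   override def schema(): StructType = new StructType().add("id", "long")
>   override def capabilities(): util.Set[TableCapability] =
>     util.EnumSet.of(TableCapability.BATCH_READ)
>
>   override def newScanBuilder(options: CaseInsensitiveStringMap): ScanBuilder = {
>     // A per-scan knob read with a default; "fetchsize" is just an example key.
>     val fetchSize = options.getInt("fetchsize", 1000)
>     makeScanBuilder(fetchSize) // constructing the actual ScanBuilder is omitted
>   }
>
>   private def makeScanBuilder(fetchSize: Int): ScanBuilder = ???
> }
> {code}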
>  
>  Table options are persisted in the catalog, and modifying them requires a separate DDL statement such as "_ALTER TABLE ..._". But there are cases where a user wants to override table options dynamically, just for a single query:
>  * a JDBC table, where _fetchsize_ should be tuned to the actual size of the table (see the sketch after the Iceberg example below)
>  * an Iceberg table, which supports time travel via per-read options:
> {code:scala}
> spark.read
>     .option("snapshot-id", 10963874102873L)
>     .format("iceberg")
>     .load("path/to/table")
> {code}
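>  The JDBC case looks similar with today's DataFrame API (a sketch: _url_, _dbtable_, and _fetchsize_ are standard JDBC source options, but the connection URL and table name here are made up):
> {code:scala}
> spark.read
>     .format("jdbc")
>     .option("url", "jdbc:postgresql://host:5432/db")  // hypothetical connection URL
>     .option("dbtable", "public.events")               // hypothetical table name
>     .option("fetchsize", 10000L)                      // tuned per query, never persisted
>     .load()
> {code}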
> Setting such parameters is very common and ad hoc; allowing them to be set flexibly would improve the Spark SQL user experience, especially now that multiple catalogs (catalog plugins) are supported.
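>  Purely as an illustration of what such a per-query override could look like, here is a hypothetical shape borrowed from Flink's dynamic table options hint. This is not existing Spark syntax, and the concrete proposal lives in the linked pull request:
> {code:scala}
> // Hypothetical syntax, for illustration only; not accepted by Spark today.
> spark.sql("SELECT * FROM db.tbl /*+ OPTIONS('snapshot-id'='10963874102873') */")
> {code}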
>   



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org