Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/03/17 09:14:00 UTC

[jira] [Updated] (SPARK-28050) DataFrameWriter support insertInto a specific table partition

     [ https://issues.apache.org/jira/browse/SPARK-28050?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-28050:
----------------------------------
    Affects Version/s:     (was: 3.0.0)
                       3.1.0

> DataFrameWriter support insertInto a specific table partition
> -------------------------------------------------------------
>
>                 Key: SPARK-28050
>                 URL: https://issues.apache.org/jira/browse/SPARK-28050
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Leanken.Lin
>            Priority: Minor
>
> {code:java}
> // Create a partitioned table and a DataFrame to insert into it.
> val ptTableName = "mc_test_pt_table"
> sql(s"CREATE TABLE ${ptTableName} (name STRING, num BIGINT) PARTITIONED BY (pt1 STRING, pt2 STRING)")
> import spark.implicits._  // needed for toDF on an RDD
> val df = spark.sparkContext.parallelize(0 to 99, 2)
>   .map(f => (s"name-$f", f.toLong))  // toLong to match the BIGINT column
>   .toDF("name", "num")
> // To insert df into a specific partition, say pt1='2018', pt2='0601',
> // the current API offers no direct support; the only workaround is:
> df.createOrReplaceTempView(s"${ptTableName}_tmp_view")
> sql(s"INSERT INTO TABLE ${ptTableName} PARTITION (pt1='2018', pt2='0601') SELECT * FROM ${ptTableName}_tmp_view")
> {code}
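> For reference, a workaround that avoids the temp view is to append the static partition values as literal columns and let positional insertInto route the rows; this is only a sketch, and for Hive tables it may additionally require dynamic-partition settings (hive.exec.dynamic.partition=true, hive.exec.dynamic.partition.mode=nonstrict):
> {code:java}
> import org.apache.spark.sql.functions.lit
> // insertInto matches columns by position, with partition columns last,
> // so the literal pt1/pt2 values land in the intended partition.
> df.withColumn("pt1", lit("2018"))
>   .withColumn("pt2", lit("0601"))
>   .write
>   .insertInto(ptTableName)
> {code}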
> Propose to add an API to DataFrameWriter that can do something like:
> {code:java}
> df.write.insertInto(ptTableName, "pt1='2018',pt2='0601'")
> {code}
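> Until such an API exists, the workaround can be wrapped in a small helper; a minimal sketch (the insertIntoPartition name and its partitionSpec parameter are illustrative, not an existing Spark API):
> {code:java}
> import org.apache.spark.sql.DataFrame
> // Illustrative helper: hides the temp-view workaround behind a single call.
> def insertIntoPartition(df: DataFrame, table: String, partitionSpec: String): Unit = {
>   val view = s"${table}_tmp_view"
>   df.createOrReplaceTempView(view)
>   df.sparkSession.sql(
>     s"INSERT INTO TABLE $table PARTITION ($partitionSpec) SELECT * FROM $view")
> }
> // usage:
> insertIntoPartition(df, ptTableName, "pt1='2018', pt2='0601'")
> {code}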
> We have many scenarios like this in our production environment; providing an API like this would make things much less painful.


