Posted to issues@spark.apache.org by "Burak Yavuz (Jira)" <ji...@apache.org> on 2020/03/19 01:13:00 UTC

[jira] [Resolved] (SPARK-31178) sql("INSERT INTO v2DataSource ...").collect() double inserts

     [ https://issues.apache.org/jira/browse/SPARK-31178?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Burak Yavuz resolved SPARK-31178.
---------------------------------
    Fix Version/s: 3.0.0
         Assignee: Burak Yavuz
       Resolution: Fixed

Resolved by [https://github.com/apache/spark/pull/27941]

> sql("INSERT INTO v2DataSource ...").collect() double inserts
> ------------------------------------------------------------
>
>                 Key: SPARK-31178
>                 URL: https://issues.apache.org/jira/browse/SPARK-31178
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Burak Yavuz
>            Assignee: Burak Yavuz
>            Priority: Blocker
>             Fix For: 3.0.0
>
>
> The following unit test fails in DataSourceV2SQLSuite:
> {code:scala}
> test("do not double insert on INSERT INTO collect()") {
>   import testImplicits._
>   val t1 = s"${catalogAndNamespace}tbl"
>   sql(s"CREATE TABLE $t1 (id bigint, data string) USING $v2Format")
>   val tmpView = "test_data"
>   val df = Seq((1L, "a"), (2L, "b"), (3L, "c")).toDF("id", "data")
>   df.createOrReplaceTempView(tmpView)
>   sql(s"INSERT INTO TABLE $t1 SELECT * FROM $tmpView").collect()
>   verifyTable(t1, df)
> }
> {code}
> The INSERT INTO double-inserts the data when ".collect()" is called. I think this is because the V2 write SparkPlans are not commands, and doExecute on a SparkPlan can be called multiple times, causing the data to be inserted multiple times.
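> The sketch below is a minimal, self-contained illustration of that failure mode, not the actual Spark code path or the change in the linked PR: the class names (Sink, WriteOnExecute, WriteCommand) are hypothetical. It contrasts a node whose side effect lives in its execute() method, which repeats the write on every call, with a command-style node that runs the write once and caches the result.
> {code:scala}
> // Hypothetical sketch, not Spark internals: why re-running an "execute"-style
> // node repeats a side-effecting write, while a command-style node does not.
> object DoubleInsertSketch {
>
>   // Stand-in for a v2 table that rows are appended to.
>   final class Sink {
>     private var rows = Vector.empty[Long]
>     def append(batch: Seq[Long]): Unit = rows ++= batch
>     def contents: Vector[Long] = rows
>   }
>
>   // The write happens inside execute(): every call inserts the batch again.
>   final class WriteOnExecute(sink: Sink, batch: Seq[Long]) {
>     def execute(): Seq[Long] = { sink.append(batch); Seq.empty }
>   }
>
>   // Command-style node: the write runs exactly once; later calls reuse the result.
>   final class WriteCommand(sink: Sink, batch: Seq[Long]) {
>     private lazy val result: Seq[Long] = { sink.append(batch); Seq.empty }
>     def execute(): Seq[Long] = result
>   }
>
>   def main(args: Array[String]): Unit = {
>     val batch = Seq(1L, 2L, 3L)
>
>     val s1 = new Sink
>     val naive = new WriteOnExecute(s1, batch)
>     naive.execute(); naive.execute()  // e.g. the eager INSERT plus a later .collect()
>     println(s"execute-based write: ${s1.contents}")  // Vector(1, 2, 3, 1, 2, 3)
>
>     val s2 = new Sink
>     val cmd = new WriteCommand(s2, batch)
>     cmd.execute(); cmd.execute()
>     println(s"command-style write: ${s2.contents}")  // Vector(1, 2, 3)
>   }
> }
> {code}
> If the fix takes the command-style route, a later .collect() on the DataFrame returned by sql() would reuse the already-computed result instead of re-running the write; see the linked PR for the actual change.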



