Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:24:12 UTC

[jira] [Updated] (SPARK-15376) DataFrame write.jdbc() inserts more rows than actual

     [ https://issues.apache.org/jira/browse/SPARK-15376?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-15376:
---------------------------------
    Labels: DataFrame bulk-closed  (was: DataFrame)

> DataFrame write.jdbc() inserts more rows than actual
> ----------------------------------------------------
>
>                 Key: SPARK-15376
>                 URL: https://issues.apache.org/jira/browse/SPARK-15376
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.4.1, 1.5.0, 1.6.1
>         Environment: CentOS 6 cluster mode
> Cores: 300 (300 granted, 0 left)
> Executor Memory: 45.0 GB
> Submit Date: Wed May 18 10:26:40 CST 2016
>            Reporter: xiaoyu chen
>            Priority: Major
>              Labels: DataFrame, bulk-closed
>
> It's an odd bug that occurs in the following situation:
>     
> {code:title=Bar.scala}
>     val rddRaw = sc.textFile("xxx").map(xxx).sample(false, 0.15)
>     // The actual number of rows inserted into MySQL is larger than the RDD's
>     // record count. In my case: 239994 (rdd) vs. ~241300 (inserted into the database).
>     println(rddRaw.count())
>     // Iterate the rows another way; if the Range for loop is dropped, the bug does not occur.
>     for (some_id <- Range(some_ids_all_range)) {
>       rddRaw.filter(_._2 == some_id).randomSplit(Array(x, x, x), 1)
>         .foreach { rd =>
>           // val curCnt = rd.count()  // if count() is invoked on rd before the write, the counts come out right
>           rd.map(x => new TestRow(null, xxx)).toDF().write.mode(SaveMode.Append).jdbc(xxx)
>         }
>     }
> {code}
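
A note on a commonly suggested mitigation (not part of the original report): sample() and randomSplit() are lazy transformations, so every job launched inside the loop recomputes the sampled lineage from scratch, and JDBC SaveMode.Append writes are not idempotent under task retries. Persisting the sampled RDD before the loop at least pins one materialized sample that all subsequent writes reuse. Below is a minimal sketch under those assumptions; the reporter's placeholders (xxx, some_ids_all_range, x, TestRow) are kept as-is, and the usual toDF() implicits are assumed to be in scope.

{code:title=PersistSketch.scala}
import org.apache.spark.storage.StorageLevel

// Persist before any action so later jobs reuse the materialized rows
// instead of re-running the lazy sample() on each recomputation.
val rddRaw = sc.textFile("xxx").map(xxx).sample(false, 0.15)
  .persist(StorageLevel.MEMORY_AND_DISK)

println(rddRaw.count())  // materializes the sample; the writes below see these exact rows

for (some_id <- Range(some_ids_all_range)) {
  // randomSplit is explicitly seeded (seed = 1), so the splits are
  // reproducible once the input partitions are stable.
  rddRaw.filter(_._2 == some_id).randomSplit(Array(x, x, x), 1)
    .foreach { rd =>
      rd.map(x => new TestRow(null, xxx)).toDF().write.mode(SaveMode.Append).jdbc(xxx)
    }
}

rddRaw.unpersist()
{code}

This would also explain the reporter's observation that calling count() on each split before the write made the numbers come out right: the extra action forces a consistent evaluation before anything is appended to the database.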



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org