Posted to issues@spark.apache.org by "Vishal Bagga (JIRA)" <ji...@apache.org> on 2015/09/01 06:05:45 UTC

[jira] [Comment Edited] (SPARK-8386) DataFrame and JDBC regression

    [ https://issues.apache.org/jira/browse/SPARK-8386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14724756#comment-14724756 ] 

Vishal Bagga edited comment on SPARK-8386 at 9/1/15 4:05 AM:
-------------------------------------------------------------

I just ran into the same issue today. This should be addressed; it looks like a severe limitation for JDBC writers.


was (Author: vishal b):
I just ran into the same issue today. This should be addressed; it looks like a severe limitation.

> DataFrame and JDBC regression
> -----------------------------
>
>                 Key: SPARK-8386
>                 URL: https://issues.apache.org/jira/browse/SPARK-8386
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>         Environment: RHEL 7.1
>            Reporter: Peter Haumer
>            Priority: Critical
>
> I have an ETL app that appends the new results found at each run to a JDBC table. In 1.3.1 I did this:
> testResultsDF.insertIntoJDBC(CONNECTION_URL, TABLE_NAME, false);
> When I do this now in 1.4, it complains that the "object" 'TABLE_NAME' already exists, even if I switch overwrite to true. I also tried this:
> testResultsDF.write().mode(SaveMode.Append).jdbc(CONNECTION_URL, TABLE_NAME, connectionProperties);
> but got the same error. The first run works: it creates the new table and adds the data successfully. On the second run, however, the JDBC driver tells me that the table already exists. Even SaveMode.Overwrite gives me the same error.
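
For reference, here is a minimal standalone sketch of the reproduction described above, written in Java against the Spark 1.4 DataFrameWriter API. Only the two write calls come from the report; the connection URL, table name, credentials, and the class/method names around them are placeholders invented for illustration.

    import java.util.Properties;

    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.SQLContext;
    import org.apache.spark.sql.SaveMode;

    public class JdbcAppendRepro {
        public static void run(SQLContext sqlContext, DataFrame testResultsDF) {
            // Placeholder connection details -- not from the original report.
            String CONNECTION_URL = "jdbc:h2:mem:testdb";
            String TABLE_NAME = "TEST_RESULTS";
            Properties connectionProperties = new Properties();
            connectionProperties.setProperty("user", "sa");
            connectionProperties.setProperty("password", "");

            // Spark 1.3.1 style: append to an existing table (overwrite = false).
            // Deprecated in 1.4 in favour of the DataFrameWriter API below.
            // testResultsDF.insertIntoJDBC(CONNECTION_URL, TABLE_NAME, false);

            // Spark 1.4 style. Per the report, the first run creates the table and
            // writes the rows, but a second run fails with "table already exists",
            // even with SaveMode.Append or SaveMode.Overwrite.
            testResultsDF.write()
                         .mode(SaveMode.Append)
                         .jdbc(CONNECTION_URL, TABLE_NAME, connectionProperties);
        }
    }

As described in the report, the failure only appears once the target table exists, which is why the sketch above needs to be run twice against the same database to reproduce it.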


