Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/02/27 12:57:45 UTC

[jira] [Commented] (SPARK-19713) saveAsTable

    [ https://issues.apache.org/jira/browse/SPARK-19713?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15885717#comment-15885717 ] 

Hyukjin Kwon commented on SPARK-19713:
--------------------------------------

Could you update the JIRA title to be more meaningful and provide both versions being compared?

> saveAsTable
> -----------
>
>                 Key: SPARK-19713
>                 URL: https://issues.apache.org/jira/browse/SPARK-19713
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.1
>            Reporter: Balaram R Gadiraju
>
> Hi,
> I just observed an issue when we use dataframe.saveAsTable("table") (in old versions)
> and dataframe.write.saveAsTable("table") (in newer versions).
> When calling the method “df3.saveAsTable("brokentable")” in Scala code, Spark first creates a folder in HDFS but does not tell the Hive metastore that it is about to create the table. If anything goes wrong in between, the folder still exists while Hive is not aware of it. This then blocks users from creating the table “brokentable”, because the folder already exists; the leftover folder can be removed with “hadoop fs -rmr /data/hive/databases/testdb.db/brokentable”. Below is a workaround that will let you continue development work.
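> If the job dies at that point, the cleanup can also be done programmatically. This is only a sketch, assuming the leftover path from this report and the spark-shell's built-in sc; it uses the Hadoop FileSystem API instead of the shell command above:
>
> import org.apache.hadoop.fs.{FileSystem, Path}
>
> val fs = FileSystem.get(sc.hadoopConfiguration)
> val leftover = new Path("/data/hive/databases/testdb.db/brokentable")
> // Recursive delete, the programmatic equivalent of "hadoop fs -rmr".
> if (fs.exists(leftover)) fs.delete(leftover, true)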
> Current Code:
> val df3 = sqlContext.sql("select * from testtable")
> df3.saveAsTable("brokentable")
> THE WORKAROUND:
> Registering the DataFrame as a temporary table and then loading the data with a SQL command resolves the issue. Example:
> // registerTempTable returns Unit, so there is nothing to assign to a val.
> sqlContext.sql("select * from testtable").registerTempTable("df3")
> sqlContext.sql("CREATE TABLE brokentable AS SELECT * FROM df3")


