Posted to issues@spark.apache.org by "Shivaram Venkataraman (JIRA)" <ji...@apache.org> on 2016/01/22 19:37:39 UTC

[jira] [Resolved] (SPARK-12629) SparkR: DataFrame's saveAsTable method has issues with the signature and HiveContext

     [ https://issues.apache.org/jira/browse/SPARK-12629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shivaram Venkataraman resolved SPARK-12629.
-------------------------------------------
       Resolution: Fixed
    Fix Version/s: 2.0.0

Issue resolved by https://github.com/apache/spark/pull/10580
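
For reference, a minimal SparkR sketch of the call the fix is meant to allow (an assumption based on the issue description; it presumes the patched signature gives "source" and "mode" defaults, and the DataFrame df and table name "people" are illustrative):

    # After the fix, the documented short form should dispatch:
    saveAsTable(df, "people")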

> SparkR: DataFrame's saveAsTable method has issues with the signature and HiveContext 
> -------------------------------------------------------------------------------------
>
>                 Key: SPARK-12629
>                 URL: https://issues.apache.org/jira/browse/SPARK-12629
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>            Reporter: Narine Kokhlikyan
>            Assignee: Narine Kokhlikyan
>             Fix For: 2.0.0
>
>
> There are several issues with SparkR's DataFrame saveAsTable method. Here is a summary of some of them; hopefully this helps in fixing them.
> 1. According to SparkR's saveAsTable(...) documentation, we can call saveAsTable(df, "myfile") to store the DataFrame.
> However, this signature does not work: "source" and "mode" appear to be required by the method signature (see the sketch after this list).
> 2. Inside saveAsTable(...), the method retrieves the SQL context and tries to initialize the source as parquet, but this also fails: based on the error messages I see, the context has to be a HiveContext.
> 3. In general, the method fails when called with a plain SQLContext.
> 4. Also, it seems that Spark SQL's DataFrame.saveAsTable is deprecated; we could use df.write.saveAsTable(...) instead.
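>
> A minimal SparkR sketch of the behavior above (assuming the Spark 1.6 API; the table name "people" and the use of R's built-in faithful dataset are illustrative):
>
>     sc <- sparkR.init()
>     hiveContext <- sparkRHive.init(sc)   # saveAsTable currently needs a HiveContext
>     df <- createDataFrame(hiveContext, faithful)
>
>     # As documented -- fails, since "source" and "mode" have no defaults in the signature:
>     saveAsTable(df, "people")
>
>     # Only the fully explicit form dispatches:
>     saveAsTable(df, "people", source = "parquet", mode = "overwrite")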
> [~shivaram] [~sunrui] [~felixcheung]



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org