Posted to issues@spark.apache.org by "Shivaram Venkataraman (JIRA)" <ji...@apache.org> on 2015/08/27 07:28:45 UTC

[jira] [Resolved] (SPARK-10219) Error when additional options provided as variable in write.df

     [ https://issues.apache.org/jira/browse/SPARK-10219?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shivaram Venkataraman resolved SPARK-10219.
-------------------------------------------
       Resolution: Fixed
    Fix Version/s: 1.5.1
                   1.6.0

Issue resolved by pull request 8475
[https://github.com/apache/spark/pull/8475]
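
For context, the "name" in the reported error message is R's class for an unevaluated symbol, which suggests the extra options were being captured without forcing their evaluation. The plain-R sketch below illustrates that failure mode; the helper names are hypothetical and are not the actual SparkR internals.

    # Plain-R sketch of the suspected failure mode. The helpers below are
    # hypothetical; they only contrast unevaluated capture with forced capture.
    captureUnevaluated <- function(...) as.list(substitute(list(...)))[-1L]
    captureForced <- function(...) list(...)

    mode <- "append"
    class(captureUnevaluated(option = mode)$option)
    # "name" -- an unevaluated symbol; a serializer limited to basic types
    # (character, numeric, logical, ...) would reject it
    class(captureForced(option = mode)$option)
    # "character" -- the forced value, which serializes normally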

> Error when additional options provided as variable in write.df
> --------------------------------------------------------------
>
>                 Key: SPARK-10219
>                 URL: https://issues.apache.org/jira/browse/SPARK-10219
>             Project: Spark
>          Issue Type: Bug
>          Components: R
>    Affects Versions: 1.4.0
>         Environment: SparkR shell
>            Reporter: Samuel Alexander
>              Labels: spark-shell, sparkR
>             Fix For: 1.6.0, 1.5.1
>
>
> Opened a SparkR shell.
> Created a DataFrame using:
> > df <- jsonFile(sqlContext, "examples/src/main/resources/people.json")
> Assigned a variable as below:
> > mode <- "append"
> When write.df was called with the statement below, it produced the error shown:
> > write.df(df, source="org.apache.spark.sql.parquet", path=par_path, option=mode)
> Error in writeType(con, type) : Unsupported type for serialization name
> Whereas when "append" is passed directly as a literal, i.e. not through the mode variable, everything works fine, as below:
> > write.df(df, source="org.apache.spark.sql.parquet", path=par_path, option="append")
> Note: Parquet itself does not require any options. But we are using the Spark Salesforce package (http://spark-packages.org/package/springml/spark-salesforce), which requires additional options to be passed.
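
For reference, below is a consolidated reproduction sketch. It assumes a SparkR 1.4-era shell where sqlContext is predefined; the output path is a hypothetical stand-in for the reporter's par_path.

    # Reproduction sketch, assuming a SparkR 1.4 shell with sqlContext predefined.
    df <- jsonFile(sqlContext, "examples/src/main/resources/people.json")
    par_path <- "/tmp/people.parquet"  # hypothetical stand-in for the reporter's path

    # Extra option passed as a literal string: works.
    write.df(df, source = "org.apache.spark.sql.parquet", path = par_path,
             option = "append")

    # Same value passed through a variable: fails before the fix with
    # "Error in writeType(con, type) : Unsupported type for serialization name".
    mode <- "append"
    write.df(df, source = "org.apache.spark.sql.parquet", path = par_path,
             option = mode)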


