Posted to issues@spark.apache.org by "Shivaram Venkataraman (JIRA)" <ji...@apache.org> on 2015/08/25 19:12:46 UTC
[jira] [Commented] (SPARK-10219) Error when additional options provided as variable in write.df
[ https://issues.apache.org/jira/browse/SPARK-10219?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14711642#comment-14711642 ]
Shivaram Venkataraman commented on SPARK-10219:
-----------------------------------------------
I think that's happening because `mode` is actually an argument name that is taken by the write.df method -- so I am not sure you need option=mode; just mode=mode or mode="append" should work?
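A minimal sketch of the suggested fix, assuming the SparkR 1.4 shell API as in the report (`par_path` is the reporter's own path variable, assumed to be set already):

```r
# Set up SparkR and recreate the DataFrame from the report
library(SparkR)
sc <- sparkR.init()
sqlContext <- sparkRSQL.init(sc)
df <- jsonFile(sqlContext, "examples/src/main/resources/people.json")

mode <- "append"

# Fails: `option` is not a named argument, so the variable is serialized
# as an extra option and triggers the writeType error
# write.df(df, source="org.apache.spark.sql.parquet", path=par_path, option=mode)

# Works: `mode` is a declared argument of write.df, so pass it by that name
write.df(df, source="org.apache.spark.sql.parquet", path=par_path, mode=mode)
```

For packages that genuinely need extra options (such as spark-salesforce), those would be passed as additional named arguments to write.df rather than under a single `option=` name.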
> Error when additional options provided as variable in write.df
> --------------------------------------------------------------
>
> Key: SPARK-10219
> URL: https://issues.apache.org/jira/browse/SPARK-10219
> Project: Spark
> Issue Type: Bug
> Components: R
> Affects Versions: 1.4.0
> Environment: SparkR shell
> Reporter: Samuel Alexander
> Labels: spark-shell, sparkR
>
> Opened a SparkR shell
> Created a df using
> > df <- jsonFile(sqlContext, "examples/src/main/resources/people.json")
> Assigned a variable like below
> > mode <- "append"
> When write.df was called using the below statement, I got the mentioned error
> > write.df(df, source="org.apache.spark.sql.parquet", path=par_path, option=mode)
> Error in writeType(con, type) : Unsupported type for serialization name
> Whereas when "append" is passed directly, i.e. not via the mode variable, as below, everything works fine
> > write.df(df, source="org.apache.spark.sql.parquet", path=par_path, option="append")
> Note: For Parquet it is not necessary to have options. But we are using the Spark Salesforce package (http://spark-packages.org/package/springml/spark-salesforce), which requires additional options to be passed.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org