Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2016/04/24 08:40:12 UTC
[jira] [Created] (SPARK-14883) Fix wrong R examples and make them up-to-date
Dongjoon Hyun created SPARK-14883:
-------------------------------------
Summary: Fix wrong R examples and make them up-to-date
Key: SPARK-14883
URL: https://issues.apache.org/jira/browse/SPARK-14883
Project: Spark
Issue Type: Bug
Components: Documentation, Examples
Reporter: Dongjoon Hyun
This issue aims to fix some errors in R examples and make them up-to-date in docs and example modules.
- Fix the wrong usage of `map`; SparkR does not export `map`, so `lapply` should be used instead. However, the use of `lapply` itself also needs review, since it is a private (non-exported) API.
{code}
-teenNames <- map(teenagers, function(p) { paste("Name:", p$name)})
+teenNames <- SparkR:::lapply(teenagers, function(p) { paste("Name:", p$name) })
{code}
- Fix the wrong example in the `Generic Load/Save Functions` section of `docs/sql-programming-guide.md` for consistency with the other language examples.
{code}
-df <- loadDF(sqlContext, "people.parquet")
-saveDF(select(df, "name", "age"), "namesAndAges.parquet")
+df <- read.df(sqlContext, "examples/src/main/resources/users.parquet")
+write.df(select(df, "name", "favorite_color"), "namesAndFavColors.parquet")
{code}
- Replace deprecated functions: jsonFile -> read.json, parquetFile -> read.parquet
{code}
df <- jsonFile(sqlContext, "examples/src/main/resources/people.json")
Warning message:
'jsonFile' is deprecated.
Use 'read.json' instead.
See help("Deprecated")
{code}
- Use up-to-date R-like functions: loadDF -> read.df, saveDF -> write.df, saveAsParquetFile -> write.parquet
- Replace `SparkR DataFrame` with `SparkDataFrame` in `dataframe.R` and `data-manipulation.R`.
- Other minor syntax fixes and typos.
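For reference, the renamed functions above can be sketched together as follows. This is a hedged sketch of the Spark 1.6-era `sqlContext`-style SparkR API; it assumes a local Spark installation with the bundled example data files, and the `appName` value is illustrative.
{code}
library(SparkR)

# Initialize a Spark context and SQL context (required before any DataFrame call)
sc <- sparkR.init(appName = "SPARK-14883-example")
sqlContext <- sparkRSQL.init(sc)

# read.json replaces the deprecated jsonFile
people <- read.json(sqlContext, "examples/src/main/resources/people.json")

# read.df / write.df replace loadDF / saveDF
users <- read.df(sqlContext, "examples/src/main/resources/users.parquet")
write.df(select(users, "name", "favorite_color"), "namesAndFavColors.parquet")

# write.parquet replaces saveAsParquetFile; read.parquet replaces parquetFile
write.parquet(people, "people.parquet")
people2 <- read.parquet(sqlContext, "people.parquet")

sparkR.stop()
{code}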
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org