Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/12/21 03:15:46 UTC

[jira] [Commented] (SPARK-12232) Create new R API for read.table to avoid conflict

    [ https://issues.apache.org/jira/browse/SPARK-12232?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15065996#comment-15065996 ] 

Apache Spark commented on SPARK-12232:
--------------------------------------

User 'felixcheung' has created a pull request for this issue:
https://github.com/apache/spark/pull/10406

> Create new R API for read.table to avoid conflict
> -------------------------------------------------
>
>                 Key: SPARK-12232
>                 URL: https://issues.apache.org/jira/browse/SPARK-12232
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 1.5.2
>            Reporter: Felix Cheung
>            Priority: Minor
>
> Since we have read.df, read.json, and read.parquet (some in pending PRs), and we already have table(), we should consider adding read.table() for consistency and R-likeness.
> However, this conflicts with utils::read.table, which returns a plain R data.frame.
> It seems neither table() nor read.table() is desirable in this case (a sketch of the name clash follows below).
> table: https://stat.ethz.ch/R-manual/R-devel/library/base/html/table.html
> read.table: https://stat.ethz.ch/R-manual/R-devel/library/utils/html/read.table.html
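>
> For illustration only, a minimal R sketch of the clash described above. The read.table() variant is hypothetical (not an actual SparkR API), and table() is shown with its SparkR 1.5-era sqlContext signature:
>
>   # utils::read.table reads local text into a plain R data.frame:
>   local_df <- utils::read.table(text = "x y\n1 2", header = TRUE)
>   class(local_df)                     # "data.frame"
>
>   # SparkR's table() loads a registered Spark SQL table as a SparkR DataFrame
>   # (1.5.x signature); note that attaching SparkR already masks base::table():
>   # spark_df <- table(sqlContext, "people")
>
>   # A hypothetical SparkR read.table(sqlContext, "people") would likewise mask
>   # utils::read.table, so existing code such as
>   #   read.table("data.txt", header = TRUE)
>   # would need the utils:: prefix to keep reading local files.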



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org