Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/11/28 14:27:48 UTC

[GitHub] [spark] MaxGekk commented on a change in pull request #17483: [SPARK-20159][SPARKR][SQL] Support all catalog API in R

MaxGekk commented on a change in pull request #17483:
URL: https://github.com/apache/spark/pull/17483#discussion_r532045969



##########
File path: R/pkg/R/SQLContext.R
##########
@@ -569,200 +569,6 @@ tableToDF <- function(tableName) {
   dataFrame(sdf)
 }
 
-#' Tables
-#'
-#' Returns a SparkDataFrame containing names of tables in the given database.
-#'
-#' @param databaseName name of the database
-#' @return a SparkDataFrame
-#' @rdname tables
-#' @export
-#' @examples
-#'\dontrun{
-#' sparkR.session()
-#' tables("hive")
-#' }
-#' @name tables
-#' @method tables default
-#' @note tables since 1.4.0
-tables.default <- function(databaseName = NULL) {
-  sparkSession <- getSparkSession()
-  jdf <- callJStatic("org.apache.spark.sql.api.r.SQLUtils", "getTables", sparkSession, databaseName)

Review comment:
    If I am not mistaken, the method `getTables()` is no longer used anywhere in R. Can we remove it from `r.SQLUtils`?
    https://github.com/apache/spark/blob/e8982ca7ad94e98d907babf2d6f1068b7cd064c6/sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala#L219-L226
   
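    For context, a minimal sketch of how the same table listing can be done from R once `tables.default()`/`getTables()` are removed, assuming the `listTables()` catalog wrapper introduced by this PR is available in the SparkR namespace (a sketch only, not verified against the final merged API):

        library(SparkR)

        sparkR.session()

        # Previously tables("default") went through
        # callJStatic("org.apache.spark.sql.api.r.SQLUtils", "getTables", ...).
        # With the catalog API the listing goes through the Catalog object
        # instead and still returns a SparkDataFrame of table metadata.
        tbls <- listTables("default")
        head(tbls)

        sparkR.session.stop()

    If that wrapper covers all existing callers, nothing in R should need the JVM-side `getTables()` helper any more.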




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org