Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/09/23 14:05:26 UTC

[GitHub] [spark] wangyum opened a new pull request, #37983: [MINOR][DOCS] Fix dead links in sparkr-vignettes.Rmd

wangyum opened a new pull request, #37983:
URL: https://github.com/apache/spark/pull/37983

   ### What changes were proposed in this pull request?
   
   This PR fixes all dead links in sparkr-vignettes.Rmd.
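   
   The per-function R API pages moved under a `reference/` subdirectory on the website, so the old URLs now return 404. A representative change (illustrative; the full set of updated links is in the diff):
   ```
   Before: https://spark.apache.org/docs/latest/api/R/read.df.html
   After:  https://spark.apache.org/docs/latest/api/R/reference/read.df.html
   ```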
   
   ### Why are the changes needed?
   
   It seems this issue makes the spark-3.3.1-bin-hadoop3.tgz build fail.
   do-release-docker.sh logs:
   ```
   cp: cannot stat 'spark-3.3.1-bin-hadoop3/spark-3.3.1-bin-hadoop3.tgz': No such file or directory
   gpg: can't open 'spark-3.3.1-bin-hadoop3.tgz': No such file or directory
   gpg: signing failed: No such file or directory
   shasum: spark-3.3.1-bin-hadoop3.tgz: No such file or directory
   ```
   binary-release-hadoop3.log output:
   ```
   yumwang@LM-SHC-16508156 output % tail -n 30 binary-release-hadoop3.log
   * this is package ‘SparkR’ version ‘3.3.1’
   * package encoding: UTF-8
   * checking CRAN incoming feasibility ... NOTE
   Maintainer: ‘The Apache Software Foundation <de...@spark.apache.org>’
   
   New submission
   
   Package was archived on CRAN
   
   CRAN repository db overrides:
     X-CRAN-Comment: Archived on 2021-06-28 as issues were not corrected
       in time.
   
     Should use tools::R_user_dir().
   
   Found the following (possibly) invalid URLs:
     URL: https://spark.apache.org/docs/latest/api/R/column_aggregate_functions.html
       From: inst/doc/sparkr-vignettes.html
       Status: 404
       Message: Not Found
     URL: https://spark.apache.org/docs/latest/api/R/read.df.html
       From: inst/doc/sparkr-vignettes.html
       Status: 404
       Message: Not Found
     URL: https://spark.apache.org/docs/latest/api/R/sparkR.session.html
       From: inst/doc/sparkr-vignettes.html
       Status: 404
       Message: Not Found
   * checking package namespace information ... OK
   * checking package dependencies ...%
   ```
   
   ### Does this PR introduce _any_ user-facing change?
   
   No.
   
   ### How was this patch tested?
   
   Manual test.
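   
   For example, the updated links can be spot-checked from R (a minimal sketch; assumes the `httr` package is installed):
   ```
   library(httr)
   # Hypothetical spot check: each fixed reference/ URL should return HTTP 200
   urls <- c(
     "https://spark.apache.org/docs/latest/api/R/reference/sparkR.session.html",
     "https://spark.apache.org/docs/latest/api/R/reference/read.df.html",
     "https://spark.apache.org/docs/latest/api/R/reference/column_aggregate_functions.html"
   )
   for (u in urls) {
     cat(u, "->", status_code(HEAD(u)), "\n")  # expect 200
   }
   ```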



[GitHub] [spark] dongjoon-hyun closed pull request #37983: [SPARK-40547][DOCS] Fix dead links in sparkr-vignettes.Rmd

Posted by GitBox <gi...@apache.org>.
dongjoon-hyun closed pull request #37983: [SPARK-40547][DOCS] Fix dead links in sparkr-vignettes.Rmd
URL: https://github.com/apache/spark/pull/37983



[GitHub] [spark] dongjoon-hyun commented on pull request #37983: [SPARK-40547][DOCS] Fix dead links in sparkr-vignettes.Rmd

Posted by GitBox <gi...@apache.org>.
dongjoon-hyun commented on PR #37983:
URL: https://github.com/apache/spark/pull/37983#issuecomment-1256389956

   Got it. Anyway, it's good to fix this.
   Merged to master/3.3.



[GitHub] [spark] wangyum commented on pull request #37983: [MINOR][DOCS] Fix dead links in sparkr-vignettes.Rmd

Posted by GitBox <gi...@apache.org>.
wangyum commented on PR #37983:
URL: https://github.com/apache/spark/pull/37983#issuecomment-1256261112

   cc @HyukjinKwon @dongjoon-hyun 



[GitHub] [spark] dongjoon-hyun commented on pull request #37983: [MINOR][DOCS] Fix dead links in sparkr-vignettes.Rmd

Posted by GitBox <gi...@apache.org>.
dongjoon-hyun commented on PR #37983:
URL: https://github.com/apache/spark/pull/37983#issuecomment-1256309389

   BTW, are the dead links new? How did we succeed with RC1 before?



[GitHub] [spark] wangyum commented on pull request #37983: [SPARK-40547][DOCS] Fix dead links in sparkr-vignettes.Rmd

Posted by GitBox <gi...@apache.org>.
wangyum commented on PR #37983:
URL: https://github.com/apache/spark/pull/37983#issuecomment-1256332206

   It seems the failure was caused by `checking package dependencies`, not the invalid URLs.



[GitHub] [spark] wangyum commented on pull request #37983: [MINOR][DOCS] Fix dead links in sparkr-vignettes.Rmd

Posted by GitBox <gi...@apache.org>.
wangyum commented on PR #37983:
URL: https://github.com/apache/spark/pull/37983#issuecomment-1256260836

   This fix needs to be backported to branch-3.3.



[GitHub] [spark] wangyum commented on pull request #37983: [MINOR][DOCS] Fix dead links in sparkr-vignettes.Rmd

Posted by GitBox <gi...@apache.org>.
wangyum commented on PR #37983:
URL: https://github.com/apache/spark/pull/37983#issuecomment-1256327547

   These are the RC1 logs:
   ```
   tail -n 500 binary-release-hadoop3.log
   
   Running CRAN check with --as-cran --no-tests options
   * using log directory ‘/opt/spark-rm/output/spark-3.3.1-bin-hadoop3/R/SparkR.Rcheck’
   * using R version 4.2.1 (2022-06-23)
   * using platform: x86_64-pc-linux-gnu (64-bit)
   * using session charset: UTF-8
   * using options ‘--no-tests --as-cran’
   * checking for file ‘SparkR/DESCRIPTION’ ... OK
   * checking extension type ... Package
   * this is package ‘SparkR’ version ‘3.3.1’
   * package encoding: UTF-8
   * checking CRAN incoming feasibility ... NOTE
   Maintainer: ‘The Apache Software Foundation <de...@spark.apache.org>’
   
   New submission
   
   Package was archived on CRAN
   
   Found the following (possibly) invalid URLs:
     URL: https://spark.apache.org/docs/latest/api/R/column_aggregate_functions.html
       From: inst/doc/sparkr-vignettes.html
       Status: 404
       Message: Not Found
     URL: https://spark.apache.org/docs/latest/api/R/read.df.html
       From: inst/doc/sparkr-vignettes.html
       Status: 404
       Message: Not Found
     URL: https://spark.apache.org/docs/latest/api/R/sparkR.session.html
       From: inst/doc/sparkr-vignettes.html
       Status: 404
       Message: Not Found
   * checking package namespace information ... OK
   * checking package dependencies ... NOTE
   Package suggested but not available for checking: ‘arrow’
   * checking if this is a source package ... OK
   * checking if there is a namespace ... OK
   * checking for .dll and .exe files ... OK
   * checking for hidden files and directories ... OK
   * checking for portable file names ... OK
   * checking for sufficient/correct file permissions ... OK
   * checking whether package ‘SparkR’ can be installed ... OK
   * checking installed package size ... OK
   * checking package directory ... OK
   * checking for future file timestamps ... NOTE
   unable to verify current time
   * checking ‘build’ directory ... OK
   * checking DESCRIPTION meta-information ... OK
   * checking top-level files ... OK
   * checking for left-over files ... OK
   * checking index information ... OK
   * checking package subdirectories ... OK
   * checking R files for non-ASCII characters ... OK
   * checking R files for syntax errors ... OK
   * checking whether the package can be loaded ... OK
   * checking whether the package can be loaded with stated dependencies ... OK
   * checking whether the package can be unloaded cleanly ... OK
   * checking whether the namespace can be loaded with stated dependencies ... OK
   * checking whether the namespace can be unloaded cleanly ... OK
   * checking loading without being on the library search path ... OK
   * checking use of S3 registration ... OK
   * checking dependencies in R code ... OK
   * checking S3 generic/method consistency ... OK
   * checking replacement functions ... OK
   * checking foreign function calls ... OK
   * checking R code for possible problems ... NOTE
   Found if() conditions comparing class() to string:
   File ‘SparkR/R/catalog.R’: if (class(schema) == "structType") ...
   File ‘SparkR/R/catalog.R’: if (class(tableName) != "character") ...
   File ‘SparkR/R/catalog.R’: if (class(viewName) != "character") ...
   File ‘SparkR/R/catalog.R’: if (class(databaseName) != "character") ...
   File ‘SparkR/R/catalog.R’: if (!is.null(databaseName) && class(databaseName) != "character") ...
   File ‘SparkR/R/catalog.R’: if (!is.null(databaseName) && class(databaseName) != "character") ...
   File ‘SparkR/R/catalog.R’: if (!is.null(databaseName) && class(databaseName) != "character") ...
   File ‘SparkR/R/column.R’: if (class(e2) == "Column") ...
   File ‘SparkR/R/column.R’: if (class(data) == "Column") ...
   File ‘SparkR/R/column.R’: if (class(value) == "Column") ...
   File ‘SparkR/R/column.R’: if (class(value) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(value) != "character") ...
   File ‘SparkR/R/DataFrame.R’: if (!is.null(col) && class(col) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (!is.null(col) && class(col) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (is.numeric(numPartitions) && class(col) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(col) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (length(cols) >= 1 && class(cols[[1]]) == "character") ...
   File ‘SparkR/R/DataFrame.R’: if (class(value) != "Column" && !is.null(value)) ...
   File ‘SparkR/R/DataFrame.R’: if (class(i) != "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(c) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(col) != "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(condition) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(joinExpr) != "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(value) == "list") ...
   File ‘SparkR/R/DataFrame.R’: if (class(col) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(col) == "character") ...
   File ‘SparkR/R/DataFrame.R’: if (class(col) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/DataFrame.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(col1) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(col1) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(percentage) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(accuracy) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(schema) == "structType") ...
   File ‘SparkR/R/functions.R’: if (class(schema) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "character") ...
   File ‘SparkR/R/functions.R’: if (class(schema) == "structType") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "character") ...
   File ‘SparkR/R/functions.R’: if (class(value) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(yes) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(no) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "Column") ...
   File ‘SparkR/R/functions.R’: if (class(x) == "character") ...
   File ‘SparkR/R/functions.R’: if (class(count) == "Column") ...
   File ‘SparkR/R/group.R’: if (class(cols[[1]]) == "Column") ...
   File ‘SparkR/R/group.R’: if (class(schema) == "structType") ...
   File ‘SparkR/R/mllib_classification.R’: if (class(lowerBoundsOnCoefficients) != "matrix") ...
   File ‘SparkR/R/mllib_classification.R’: if (class(upperBoundsOnCoefficients) != "matrix") ...
   File ‘SparkR/R/schema.R’: if (class(x) != "character") ...
   File ‘SparkR/R/schema.R’: if (class(type) != "character") ...
   File ‘SparkR/R/schema.R’: if (class(nullable) != "logical") ...
   File ‘SparkR/R/SQLContext.R’: if (class(schema) == "structType") ...
   File ‘SparkR/R/SQLContext.R’: if (class(schema) == "structType") ...
   File ‘SparkR/R/utils.R’: if (class(key) == "integer") ...
   File ‘SparkR/R/utils.R’: if (class(key) == "numeric") ...
   File ‘SparkR/R/utils.R’: if (class(key) == "character") ...
   File ‘SparkR/R/WindowSpec.R’: if (class(col) == "character") ...
   Use inherits() (or maybe is()) instead.
   * checking Rd files ... OK
   * checking Rd metadata ... OK
   * checking Rd line widths ... OK
   * checking Rd cross-references ... OK
   * checking for missing documentation entries ... OK
   * checking for code/documentation mismatches ... OK
   * checking Rd \usage sections ... OK
   * checking Rd contents ... OK
   * checking for unstated dependencies in examples ... OK
   * checking installed files from ‘inst/doc’ ... OK
   * checking files in ‘vignettes’ ... OK
   * checking examples ... OK
   * checking for unstated dependencies in ‘tests’ ... OK
   * checking tests ... SKIPPED
   * checking for unstated dependencies in vignettes ... OK
   * checking package vignettes in ‘inst/doc’ ... OK
   * checking re-building of vignette outputs ... OK
   * checking PDF version of manual ... OK
   * skipping checking HTML version of manual: no command ‘tidy’ found
   * checking for non-standard things in the check directory ... OK
   * checking for detritus in the temp directory ... OK
   * DONE
   Status: 4 NOTEs
   Using R_SCRIPT_PATH = /usr/bin
   Removing lib path and installing from source package
   Creating a new generic function for ‘as.data.frame’ in package ‘SparkR’
   Creating a new generic function for ‘colnames’ in package ‘SparkR’
   Creating a new generic function for ‘colnames<-’ in package ‘SparkR’
   Creating a new generic function for ‘cov’ in package ‘SparkR’
   Creating a new generic function for ‘drop’ in package ‘SparkR’
   Creating a new generic function for ‘na.omit’ in package ‘SparkR’
   Creating a new generic function for ‘filter’ in package ‘SparkR’
   Creating a new generic function for ‘intersect’ in package ‘SparkR’
   Creating a new generic function for ‘sample’ in package ‘SparkR’
   Creating a new generic function for ‘transform’ in package ‘SparkR’
   Creating a new generic function for ‘subset’ in package ‘SparkR’
   Creating a new generic function for ‘summary’ in package ‘SparkR’
   Creating a new generic function for ‘union’ in package ‘SparkR’
   Creating a new generic function for ‘endsWith’ in package ‘SparkR’
   Creating a new generic function for ‘startsWith’ in package ‘SparkR’
   Creating a new generic function for ‘lag’ in package ‘SparkR’
   Creating a new generic function for ‘rank’ in package ‘SparkR’
   Creating a new generic function for ‘sd’ in package ‘SparkR’
   Creating a new generic function for ‘var’ in package ‘SparkR’
   Creating a new generic function for ‘window’ in package ‘SparkR’
   Creating a new generic function for ‘predict’ in package ‘SparkR’
   Creating a new generic function for ‘rbind’ in package ‘SparkR’
   Creating a generic function for ‘substr’ from package ‘base’ in package ‘SparkR’
   Creating a generic function for ‘%in%’ from package ‘base’ in package ‘SparkR’
   Creating a generic function for ‘lapply’ from package ‘base’ in package ‘SparkR’
   Creating a generic function for ‘Filter’ from package ‘base’ in package ‘SparkR’
   Creating a generic function for ‘nrow’ from package ‘base’ in package ‘SparkR’
   Creating a generic function for ‘ncol’ from package ‘base’ in package ‘SparkR’
   Creating a generic function for ‘factorial’ from package ‘base’ in package ‘SparkR’
   Creating a generic function for ‘atan2’ from package ‘base’ in package ‘SparkR’
   Creating a generic function for ‘ifelse’ from package ‘base’ in package ‘SparkR’
   /opt/spark-rm/output/spark-3.3.1-bin-hadoop3/R
   ```
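   
   As a side note on the `checking R code for possible problems` NOTE above: recent R releases flag `class(x) == "..."` comparisons because `class()` can return a vector of length greater than one. A minimal sketch of the rewrite the NOTE recommends:
   ```
   col <- 1:3                      # stand-in object for illustration
   # Flagged pattern: class() may return several classes, so `==` can mis-fire
   class(col) == "Column"
   # Recommended replacement: inherits() handles multi-class objects correctly
   inherits(col, "Column")
   ```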


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org