Posted to dev@spark.apache.org by Reynold Xin <rx...@databricks.com> on 2016/09/22 06:47:37 UTC
R docs no longer building for branch-2.0
I'm working on packaging the 2.0.1 RC but ran into a problem: the R docs fail
to build. Can somebody take a look at the issue ASAP?
** knitting documentation of write.parquet
** knitting documentation of write.text
** knitting documentation of year
~/workspace/spark-release-docs/spark/R
~/workspace/spark-release-docs/spark/R
processing file: sparkr-vignettes.Rmd
inline R code fragments
label: unnamed-chunk-1 (with options)
List of 1
$ message: logi FALSE
Loading required package: methods
Attaching package: 'SparkR'
The following objects are masked from 'package:stats':
cov, filter, lag, na.omit, predict, sd, var, window
The following objects are masked from 'package:base':
as.data.frame, colnames, colnames<-, drop, intersect, rank,
rbind, sample, subset, summary, transform, union
ordinary text without R code
label: unnamed-chunk-2 (with options)
List of 1
$ message: logi FALSE
Spark package found in SPARK_HOME:
/home/jenkins/workspace/spark-release-docs/spark
Error: Could not find or load main class org.apache.spark.launcher.Main
Quitting from lines 30-31 (sparkr-vignettes.Rmd)
Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap, :
JVM is not ready after 10 seconds
Calls: render ... eval -> eval -> sparkR.session -> sparkR.sparkContext
Execution halted
jekyll 2.5.3 | Error: R doc generation failed
Deleting credential directory
/home/jenkins/workspace/spark-release-docs/spark-utils/new-release-scripts/jenkins/jenkins-credentials-IXCkuX6w
Build step 'Execute shell' marked build as failure
[WS-CLEANUP] Deleting project workspace...[WS-CLEANUP] done
Finished: FAILURE
Re: R docs no longer building for branch-2.0
Posted by Shivaram Venkataraman <sh...@eecs.berkeley.edu>.
I looked into this and found the problem; I'll send a PR now to fix it.
If you are curious about what is happening here: when we build the docs
separately, we don't have the JAR files from the Spark build in the same
tree. We recently added a new set of docs to SparkR, an R vignette, which
actually runs Spark and generates docs from the outputs of that run. So
the vignette build fails when the JARs are not available.
Thanks
Shivaram
On Thu, Sep 22, 2016 at 5:06 AM, Sean Owen <so...@cloudera.com> wrote:
> FWIW it worked for me, but I may not be executing the same thing; I was
> running the commands given in R/DOCUMENTATION.md.
>
> It succeeded for me in creating the vignette, on branch-2.0.
>
> Maybe it's a version or library issue? What R do you have installed, and
> are you up to date with packages like devtools and roxygen2?
>
> On Thu, Sep 22, 2016 at 7:47 AM, Reynold Xin <rx...@databricks.com> wrote:
>> I'm working on packaging the 2.0.1 RC but ran into a problem: the R docs
>> fail to build. Can somebody take a look at the issue ASAP?
---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
Re: R docs no longer building for branch-2.0
Posted by Sean Owen <so...@cloudera.com>.
FWIW it worked for me, but I may not be executing the same thing; I was
running the commands given in R/DOCUMENTATION.md.
It succeeded for me in creating the vignette, on branch-2.0.
Maybe it's a version or library issue? What R do you have installed, and
are you up to date with packages like devtools and roxygen2?
On Thu, Sep 22, 2016 at 7:47 AM, Reynold Xin <rx...@databricks.com> wrote:
> I'm working on packaging the 2.0.1 RC but ran into a problem: the R docs
> fail to build. Can somebody take a look at the issue ASAP?