Posted to user@spark.apache.org by Anwar AliKhan <an...@gmail.com> on 2020/06/24 09:49:09 UTC
Error: Vignette re-building failed. Execution halted
./dev/make-distribution.sh --name custom-spark --pip --r --tgz -Psparkr -Phive -Phive-thriftserver -Pmesos -Pyarn -Pkubernetes
Minor error: the SparkR test failed. I don't use R, so it doesn't affect me.
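(If R support genuinely isn't needed, the failure can presumably be avoided
altogether by leaving the R options out of the command above, i.e. dropping
--r and -Psparkr so the SparkR vignette build never runs:
./dev/make-distribution.sh --name custom-spark --pip --tgz -Phive -Phive-thriftserver -Pmesos -Pyarn -Pkubernetes)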
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation
path
* DONE (SparkR)
++ cd /opt/spark/R/lib
++ jar cfM /opt/spark/R/lib/sparkr.zip SparkR
++ popd
++ cd /opt/spark/R/..
++ pwd
+ SPARK_HOME=/opt/spark
+ . /opt/spark/bin/load-spark-env.sh
++ '[' -z /opt/spark ']'
++ SPARK_ENV_SH=spark-env.sh
++ '[' -z '' ']'
++ export SPARK_ENV_LOADED=1
++ SPARK_ENV_LOADED=1
++ export SPARK_CONF_DIR=/opt/spark/conf
++ SPARK_CONF_DIR=/opt/spark/conf
++ SPARK_ENV_SH=/opt/spark/conf/spark-env.sh
++ [[ -f /opt/spark/conf/spark-env.sh ]]
++ set -a
++ . /opt/spark/conf/spark-env.sh
+++ export SPARK_LOCAL_IP=192.168.0.786
+++ SPARK_LOCAL_IP=192.168.0.786
++ set +a
++ export SPARK_SCALA_VERSION=2.12
++ SPARK_SCALA_VERSION=2.12
+ '[' -f /opt/spark/RELEASE ']'
+ SPARK_JARS_DIR=/opt/spark/assembly/target/scala-2.12/jars
+ '[' -d /opt/spark/assembly/target/scala-2.12/jars ']'
+ SPARK_HOME=/opt/spark
+ /usr/bin/R CMD build /opt/spark/R/pkg
* checking for file ‘/opt/spark/R/pkg/DESCRIPTION’ ... OK
* preparing ‘SparkR’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... ERROR
--- re-building ‘sparkr-vignettes.Rmd’ using rmarkdown
Attaching package: 'SparkR'
The following objects are masked from 'package:stats':
cov, filter, lag, na.omit, predict, sd, var, window
The following objects are masked from 'package:base':
as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
rank, rbind, sample, startsWith, subset, summary, transform, union
Picked up _JAVA_OPTIONS: -XX:-UsePerfData
Picked up _JAVA_OPTIONS: -XX:-UsePerfData
20/06/24 10:23:54 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
setLogLevel(newLevel).
[Stage 0:> (0 + 1) / 1]
[Stage 9:=================================================> (88 + 1) / 100]
[Stage 13:=======================================> (147 + 1) / 200]
20/06/24 10:24:04 WARN Instrumentation: [79237008] regParam is zero, which
might cause numerical instability and overfitting.
20/06/24 10:24:04 WARN BLAS: Failed to load implementation from:
com.github.fommil.netlib.NativeSystemBLAS
20/06/24 10:24:04 WARN BLAS: Failed to load implementation from:
com.github.fommil.netlib.NativeRefBLAS
20/06/24 10:24:04 WARN LAPACK: Failed to load implementation from:
com.github.fommil.netlib.NativeSystemLAPACK
20/06/24 10:24:04 WARN LAPACK: Failed to load implementation from:
com.github.fommil.netlib.NativeRefLAPACK
20/06/24 10:24:09 WARN package: Truncated the string representation of a
plan since it was too large. This behavior can be adjusted by setting
'spark.sql.debug.maxToStringFields'.
[Stage 67:============> (45 + 1) / 200]
[Stage 67:=================> (62 + 1) / 200]
[Stage 67:======================> (80 + 1) / 200]
[Stage 67:==========================> (98 + 1) / 200]
[Stage 67:==============================> (114 + 1) / 200]
[Stage 67:===================================> (132 + 1) / 200]
[Stage 67:=======================================> (148 + 1) / 200]
[Stage 67:============================================> (166 + 1) / 200]
[Stage 67:=================================================> (184 + 1) / 200]
[Stage 69:============> (44 + 1) / 200]
[Stage 69:================> (61 + 1) / 200]
[Stage 69:=====================> (79 + 1) / 200]
[Stage 69:==========================> (97 + 1) / 200]
[Stage 69:===============================> (116 + 1) / 200]
[Stage 69:====================================> (134 + 1) / 200]
[Stage 69:=========================================> (152 + 1) / 200]
[Stage 69:=============================================> (169 + 1) / 200]
[Stage 69:==================================================> (187 + 1) / 200]
[Stage 70:> (0 + 1) / 5]
20/06/24 10:24:14 ERROR Executor: Exception in task 0.0 in stage 70.0 (TID
1148)
org.apache.spark.SparkException: R unexpectedly exited.
R worker produced errors: Error in FUN(X[[i]], ...) :
requireNamespace("e1071", quietly = TRUE) is not TRUE
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
at
scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
at scala.collection.Iterator.foreach(Iterator.scala:941)
at scala.collection.Iterator.foreach$(Iterator.scala:941)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
at
scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
at
scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
at
scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
at
scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
at
org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
at
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:127)
at
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:392)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
... 27 more
20/06/24 10:24:14 WARN TaskSetManager: Lost task 0.0 in stage 70.0 (TID
1148, 192.168.0.38, executor driver): org.apache.spark.SparkException: R
unexpectedly exited.
R worker produced errors: Error in FUN(X[[i]], ...) :
requireNamespace("e1071", quietly = TRUE) is not TRUE
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
at
scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
at scala.collection.Iterator.foreach(Iterator.scala:941)
at scala.collection.Iterator.foreach$(Iterator.scala:941)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
at
scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
at
scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
at
scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
at
scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
at
org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
at
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:127)
at
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:392)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
... 27 more
20/06/24 10:24:14 ERROR TaskSetManager: Task 0 in stage 70.0 failed 1
times; aborting job
20/06/24 10:24:14 ERROR RBackendHandler: collect on 358 failed
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:164)
at
org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:105)
at
org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:39)
at
io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at
io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at
io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at
io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:321)
at
io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:295)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at
io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at
io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at
io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
at
io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
at
io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
at
io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
at
io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at
io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Job aborted due to stage
failure: Task 0 in stage 70.0 failed 1 times, most recent failure: Lost
task 0.0 in stage 70.0 (TID 1148, 192.168.0.38, executor driver):
org.apache.spark.SparkException: R unexpectedly exited.
R worker produced errors: Error in FUN(X[[i]], ...) :
requireNamespace("e1071", quietly = TRUE) is not TRUE
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
at
scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
at scala.collection.Iterator.foreach(Iterator.scala:941)
at scala.collection.Iterator.foreach$(Iterator.scala:941)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
at
scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
at
scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
at
scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
at
scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
at
org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
at
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:127)
at
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:392)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
... 27 more
Driver stacktrace:
at
org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2117)
at
org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2066)
at
org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2065)
at
scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at
scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2065)
at
org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1021)
at
org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1021)
at scala.Option.foreach(Option.scala:407)
at
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1021)
at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2297)
at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2246)
at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2235)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at
org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:823)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2108)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2129)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2148)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2173)
at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
at
org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:361)
at
org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:360)
at
org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
... 37 more
Caused by: org.apache.spark.SparkException: R unexpectedly exited.
R worker produced errors: Error in FUN(X[[i]], ...) :
requireNamespace("e1071", quietly = TRUE) is not TRUE
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
at
scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
at scala.collection.Iterator.foreach(Iterator.scala:941)
at scala.collection.Iterator.foreach$(Iterator.scala:941)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
at
scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
at
scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
at
scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
at
scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
at
org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
at
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:127)
at
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
... 1 more
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:392)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
... 27 more
20/06/24 10:24:14 ERROR RRunner: R Writer thread got an exception
org.apache.spark.TaskKilledException
at
org.apache.spark.TaskContextImpl.killTaskIfInterrupted(TaskContextImpl.scala:156)
at
org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:36)
at
org.apache.spark.api.r.BaseRRunner$WriterThread.run(BaseRRunner.scala:197)
Quitting from lines 470-471 (sparkr-vignettes.Rmd)
Error: processing vignette 'sparkr-vignettes.Rmd' failed with diagnostics:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
in stage 70.0 failed 1 times, most recent failure: Lost task 0.0 in stage
70.0 (TID 1148, 192.168.0.38, executor driver):
org.apache.spark.SparkException: R unexpectedly exited.
R worker produced errors: Error in FUN(X[[i]], ...) :
requireNamespace("e1071", quietly = TRUE) is not TRUE
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
at
scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
at scala.collection.Iterator.foreach(Iterator.scala:941)
at scala.collection.Iterator.foreach$(Iterator.scala:941)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
at
scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
at
scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
at
scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
at
scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
at
scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
at
scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
at
org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
at
org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
at
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:127)
at
org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
at
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:392)
at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
... 27 more
Driver stacktrace:
at
org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2117)
at
org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2066)
at
org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2065)
at
scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at
scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at
org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2065)
at
org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1021)
at
org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1021)
at scala.Option.foreach(Option.scala:407)
at
org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1021)
at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2297)
at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2246)
at
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2235)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at
org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:823)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2108)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2129)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2148)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2173)
at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
at
org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:361)
at
org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:360)
at
org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:164)
at
org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:105)
at
org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:39)
at
io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at
io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at
io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at
io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:321)
at
io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:295)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
at
io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
at
io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at
io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at
io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
at
io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
at
io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
at io.netty.channel.nio.NioEventLoop.run(NioEventLo
--- failed re-building ‘sparkr-vignettes.Rmd’
SUMMARY: processing the following file failed:
‘sparkr-vignettes.Rmd’
Error: Vignette re-building failed.
Execution halted
20/06/24 10:24:15 WARN TaskSetManager: Lost task 1.0 in stage 70.0 (TID
1149, 192.168.0.38, executor driver): TaskKilled (Stage cancelled)
Suggested Amendment to ./dev/make-distribution.sh
Posted by Anwar AliKhan <an...@gmail.com>.
May I suggest amending ./dev/make-distribution.sh to check whether the two
previously mentioned packages are installed, and to install them as part of
the build process if they are not? A sketch follows below. The build time
will increase when the packages have to be installed, but a long build is a
normal expectation, especially for a project that has been going for 10
years. A message saying the packages are needed but not installed, and to
please wait while they are being installed, would also help the user
experience.
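Something along these lines could do it. This is only a minimal sketch, not
the script's existing code; it assumes Rscript is on the PATH (the R build
already requires it) and names just the two packages mentioned above:

# Hypothetical pre-flight check for ./dev/make-distribution.sh:
# install any R packages the SparkR vignettes need before R CMD build runs.
for pkg in knitr e1071; do
  if ! Rscript -e "if (!requireNamespace('$pkg', quietly = TRUE)) quit(status = 1)"; then
    echo "R package '$pkg' is needed but not installed. Please wait while it is installed..."
    Rscript -e "install.packages('$pkg', repos = 'https://cloud.r-project.org')"
  fi
done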
On Wed, 24 Jun 2020, 16:21 Anwar AliKhan, <an...@gmail.com> wrote:
> THANKS!
>
> It appears that was the last dependency for the build:
> sudo apt-get install -y r-cran-e1071
>
> Shout out again to ZOOM
> (https://zoomadmin.com/HowToInstall/UbuntuPackage/r-cran-e1071);
> like they say, "It's Super Easy!"
>
> The knitr package was the previous missing dependency, which I was able
> to work out from the build error message:
> sudo apt install r-cran-knitr
>
> 'e1071' doesn't look like a package name or namespace at first glance,
> but it turns out to be a formidable package of machine learning
> algorithms.
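>
> A quick sanity check from R that both packages now resolve (a
> hypothetical one-liner, not something the build itself runs; it should
> print TRUE):
> Rscript -e "requireNamespace('knitr', quietly = TRUE) && requireNamespace('e1071', quietly = TRUE)"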
>
>
> *** installing help indices
> ** building package indices
> ** installing vignettes
> ** testing if installed package can be loaded from temporary location
> ** testing if installed package can be loaded from final location
> ** testing if installed package keeps a record of temporary installation
> path
> * DONE (SparkR)
> /opt/spark/R
> + popd
> + mkdir /opt/spark/dist/conf
> + cp /opt/spark/conf/fairscheduler.xml.template
> /opt/spark/conf/log4j.properties.template
> /opt/spark/conf/metrics.properties.template /opt/spark/conf/slaves.template
> /opt/spark/conf/spark-defaults.conf.template
> /opt/spark/conf/spark-env.sh.template /opt/spark/dist/conf
> + cp /opt/spark/README.md /opt/spark/dist
> + cp -r /opt/spark/bin /opt/spark/dist
> + cp -r /opt/spark/python /opt/spark/dist
> + '[' true == true ']'
> + rm -f /opt/spark/dist/python/dist/pyspark-3.1.0.dev0.tar.gz
> + cp -r /opt/spark/sbin /opt/spark/dist
> + '[' -d /opt/spark/R/lib/SparkR ']'
> + mkdir -p /opt/spark/dist/R/lib
> + cp -r /opt/spark/R/lib/SparkR /opt/spark/dist/R/lib
> + cp /opt/spark/R/lib/sparkr.zip /opt/spark/dist/R/lib
> + '[' true == true ']'
> + TARDIR_NAME=spark-3.1.0-SNAPSHOT-bin-custom-spark
> + TARDIR=/opt/spark/spark-3.1.0-SNAPSHOT-bin-custom-spark
> + rm -rf /opt/spark/spark-3.1.0-SNAPSHOT-bin-custom-spark
> + cp -r /opt/spark/dist /opt/spark/spark-3.1.0-SNAPSHOT-bin-custom-spark
> + tar czf spark-3.1.0-SNAPSHOT-bin-custom-spark.tgz -C /opt/spark
> spark-3.1.0-SNAPSHOT-bin-custom-spark
> + rm -rf /opt/spark/spark-3.1.0-SNAPSHOT-bin-custom-spark
>
>
> On Wed, 24 Jun 2020, 11:07 Hyukjin Kwon, <gu...@gmail.com> wrote:
>
>> Looks like you haven't installed the 'e1071' package.
>>
>> On Wed, 24 Jun 2020 at 18:49, Anwar AliKhan <an...@gmail.com> wrote:
>>
>>> ./dev/make-distribution.sh --name custom-spark --pip --r --tgz -Psparkr
>>> -Phive -Phive-thriftserver -Pmesos -Pyarn -Pkubernetes
>>> <http://www.backbutton.co.uk/>
>>>
>>>
>>> minor error Spark r test failed , I don't use r so it doesn't effect me.
>>>
>>> ***installing help indices
>>> ** building package indices
>>> ** installing vignettes
>>> ** testing if installed package can be loaded from temporary location
>>> ** testing if installed package can be loaded from final location
>>> ** testing if installed package keeps a record of temporary installation
>>> path
>>> * DONE (SparkR)
>>> ++ cd /opt/spark/R/lib
>>> ++ jar cfM /opt/spark/R/lib/sparkr.zip SparkR
>>> ++ popd
>>> ++ cd /opt/spark/R/..
>>> ++ pwd
>>> + SPARK_HOME=/opt/spark
>>> + . /opt/spark/bin/load-spark-env.sh
>>> ++ '[' -z /opt/spark ']'
>>> ++ SPARK_ENV_SH=spark-env.sh
>>> ++ '[' -z '' ']'
>>> ++ export SPARK_ENV_LOADED=1
>>> ++ SPARK_ENV_LOADED=1
>>> ++ export SPARK_CONF_DIR=/opt/spark/conf
>>> ++ SPARK_CONF_DIR=/opt/spark/conf
>>> ++ SPARK_ENV_SH=/opt/spark/conf/spark-env.sh
>>> ++ [[ -f /opt/spark/conf/spark-env.sh ]]
>>> ++ set -a
>>> ++ . /opt/spark/conf/spark-env.sh
>>> +++ export SPARK_LOCAL_IP=192.168.0.786
>>> +++ SPARK_LOCAL_IP=192.168.0.786
>>> ++ set +a
>>> ++ export SPARK_SCALA_VERSION=2.12
>>> ++ SPARK_SCALA_VERSION=2.12
>>> + '[' -f /opt/spark/RELEASE ']'
>>> + SPARK_JARS_DIR=/opt/spark/assembly/target/scala-2.12/jars
>>> + '[' -d /opt/spark/assembly/target/scala-2.12/jars ']'
>>> + SPARK_HOME=/opt/spark
>>> + /usr/bin/R CMD build /opt/spark/R/pkg
>>> * checking for file ‘/opt/spark/R/pkg/DESCRIPTION’ ... OK
>>> * preparing ‘SparkR’:
>>> * checking DESCRIPTION meta-information ... OK
>>> * installing the package to build vignettes
>>> * creating vignettes ... ERROR
>>> --- re-building ‘sparkr-vignettes.Rmd’ using rmarkdown
>>>
>>> Attaching package: 'SparkR'
>>>
>>> The following objects are masked from 'package:stats':
>>>
>>> cov, filter, lag, na.omit, predict, sd, var, window
>>>
>>> The following objects are masked from 'package:base':
>>>
>>> as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
>>> rank, rbind, sample, startsWith, subset, summary, transform, union
>>>
>>> Picked up _JAVA_OPTIONS: -XX:-UsePerfData
>>> Picked up _JAVA_OPTIONS: -XX:-UsePerfData
>>> 20/06/24 10:23:54 WARN NativeCodeLoader: Unable to load native-hadoop
>>> library for your platform... using builtin-java classes where applicable
>>> Setting default log level to "WARN".
>>> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
>>> setLogLevel(newLevel).
>>>
>>> [Stage 0:> (0 +
>>> 1) / 1]
>>>
>>>
>>>
>>>
>>> [Stage 9:=================================================> (88 +
>>> 1) / 100]
>>>
>>>
>>>
>>>
>>> [Stage 13:=======================================> (147 +
>>> 1) / 200]
>>>
>>>
>>>
>>> 20/06/24 10:24:04 WARN Instrumentation: [79237008] regParam is zero,
>>> which might cause numerical instability and overfitting.
>>> 20/06/24 10:24:04 WARN BLAS: Failed to load implementation from:
>>> com.github.fommil.netlib.NativeSystemBLAS
>>> 20/06/24 10:24:04 WARN BLAS: Failed to load implementation from:
>>> com.github.fommil.netlib.NativeRefBLAS
>>> 20/06/24 10:24:04 WARN LAPACK: Failed to load implementation from:
>>> com.github.fommil.netlib.NativeSystemLAPACK
>>> 20/06/24 10:24:04 WARN LAPACK: Failed to load implementation from:
>>> com.github.fommil.netlib.NativeRefLAPACK
>>> 20/06/24 10:24:09 WARN package: Truncated the string representation of a
>>> plan since it was too large. This behavior can be adjusted by setting
>>> 'spark.sql.debug.maxToStringFields'.
>>>
>>> [Stage 67:============> (45 +
>>> 1) / 200]
>>>
>>> [Stage 67:=================> (62 +
>>> 1) / 200]
>>>
>>> [Stage 67:======================> (80 +
>>> 1) / 200]
>>>
>>> [Stage 67:==========================> (98 +
>>> 1) / 200]
>>>
>>> [Stage 67:==============================> (114 +
>>> 1) / 200]
>>>
>>> [Stage 67:===================================> (132 +
>>> 1) / 200]
>>>
>>> [Stage 67:=======================================> (148 +
>>> 1) / 200]
>>>
>>> [Stage 67:============================================> (166 +
>>> 1) / 200]
>>>
>>> [Stage 67:=================================================> (184 +
>>> 1) / 200]
>>>
>>>
>>>
>>>
>>> [Stage 69:============> (44 +
>>> 1) / 200]
>>>
>>> [Stage 69:================> (61 +
>>> 1) / 200]
>>>
>>> [Stage 69:=====================> (79 +
>>> 1) / 200]
>>>
>>> [Stage 69:==========================> (97 +
>>> 1) / 200]
>>>
>>> [Stage 69:===============================> (116 +
>>> 1) / 200]
>>>
>>> [Stage 69:====================================> (134 +
>>> 1) / 200]
>>>
>>> [Stage 69:=========================================> (152 +
>>> 1) / 200]
>>>
>>> [Stage 69:=============================================> (169 +
>>> 1) / 200]
>>>
>>> [Stage 69:==================================================> (187 +
>>> 1) / 200]
>>>
>>>
>>>
>>>
>>> [Stage 70:> (0 +
>>> 1) / 5]
>>> 20/06/24 10:24:14 ERROR Executor: Exception in task 0.0 in stage 70.0
>>> (TID 1148)
>>> org.apache.spark.SparkException: R unexpectedly exited.
>>> R worker produced errors: Error in FUN(X[[i]], ...) :
>>> requireNamespace("e1071", quietly = TRUE) is not TRUE
>>>
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
>>> at
>>> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
>>> at scala.collection.Iterator.foreach(Iterator.scala:941)
>>> at scala.collection.Iterator.foreach$(Iterator.scala:941)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
>>> at scala.collection.TraversableOnce.to
>>> <http://scala.collection.traversableonce.to/>(TraversableOnce.scala:315)
>>> at scala.collection.TraversableOnce.to
>>> <http://scala.collection.traversableonce.to/>
>>> $(TraversableOnce.scala:313)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
>>> at
>>> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
>>> at
>>> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
>>> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
>>> at
>>> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
>>> at
>>> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>>> at org.apache.spark.scheduler.Task.run(Task.scala:127)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
>>> at
>>> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>> at java.lang.Thread.run(Thread.java:748)
>>> Caused by: java.io.EOFException
>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
>>> ... 27 more
>>> 20/06/24 10:24:14 WARN TaskSetManager: Lost task 0.0 in stage 70.0 (TID
>>> 1148, 192.168.0.38, executor driver): org.apache.spark.SparkException: R
>>> unexpectedly exited.
>>> R worker produced errors: Error in FUN(X[[i]], ...) :
>>> requireNamespace("e1071", quietly = TRUE) is not TRUE
>>>
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
>>> at
>>> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
>>> at scala.collection.Iterator.foreach(Iterator.scala:941)
>>> at scala.collection.Iterator.foreach$(Iterator.scala:941)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
>>> at scala.collection.TraversableOnce.to
>>> <http://scala.collection.traversableonce.to/>(TraversableOnce.scala:315)
>>> at scala.collection.TraversableOnce.to
>>> <http://scala.collection.traversableonce.to/>
>>> $(TraversableOnce.scala:313)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
>>> at
>>> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
>>> at
>>> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
>>> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
>>> at
>>> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
>>> at
>>> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>>> at org.apache.spark.scheduler.Task.run(Task.scala:127)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
>>> at
>>> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>> at java.lang.Thread.run(Thread.java:748)
>>> Caused by: java.io.EOFException
>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
>>> ... 27 more
>>>
>>> 20/06/24 10:24:14 ERROR TaskSetManager: Task 0 in stage 70.0 failed 1
>>> times; aborting job
>>> 20/06/24 10:24:14 ERROR RBackendHandler: collect on 358 failed
>>> java.lang.reflect.InvocationTargetException
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>> at
>>> org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:164)
>>> at
>>> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:105)
>>> at
>>> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:39)
>>> at
>>> io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>>> at
>>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>>> at
>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>>> at
>>> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:321)
>>> at
>>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:295)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>>> at
>>> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
>>> at
>>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
>>> at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
>>> at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
>>> at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
>>> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
>>> at
>>> io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
>>> at
>>> io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
>>> at
>>> io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
>>> at java.lang.Thread.run(Thread.java:748)
>>> Caused by: org.apache.spark.SparkException: Job aborted due to stage
>>> failure: Task 0 in stage 70.0 failed 1 times, most recent failure: Lost
>>> task 0.0 in stage 70.0 (TID 1148, 192.168.0.38, executor driver):
>>> org.apache.spark.SparkException: R unexpectedly exited.
>>> R worker produced errors: Error in FUN(X[[i]], ...) :
>>> requireNamespace("e1071", quietly = TRUE) is not TRUE
>>>
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
>>> at
>>> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
>>> at scala.collection.Iterator.foreach(Iterator.scala:941)
>>> at scala.collection.Iterator.foreach$(Iterator.scala:941)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
>>> at scala.collection.TraversableOnce.to
>>> <http://scala.collection.traversableonce.to/>(TraversableOnce.scala:315)
>>> at scala.collection.TraversableOnce.to
>>> <http://scala.collection.traversableonce.to/>
>>> $(TraversableOnce.scala:313)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
>>> at
>>> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
>>> at
>>> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
>>> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
>>> at
>>> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
>>> at
>>> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>>> at org.apache.spark.scheduler.Task.run(Task.scala:127)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
>>> at
>>> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>> at java.lang.Thread.run(Thread.java:748)
>>> Caused by: java.io.EOFException
>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
>>> ... 27 more
>>>
>>> Driver stacktrace:
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2117)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2066)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2065)
>>> at
>>> scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
>>> at
>>> scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
>>> at
>>> scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2065)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1021)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1021)
>>> at scala.Option.foreach(Option.scala:407)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1021)
>>> at
>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2297)
>>> at
>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2246)
>>> at
>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2235)
>>> at
>>> org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:823)
>>> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2108)
>>> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2129)
>>> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2148)
>>> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2173)
>>> at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
>>> at
>>> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>>> at
>>> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>>> at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
>>> at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
>>> at
>>> org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:361)
>>> at
>>> org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:360)
>>> at
>>> org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
>>> ... 37 more
>>> Caused by: org.apache.spark.SparkException: R unexpectedly exited.
>>> R worker produced errors: Error in FUN(X[[i]], ...) :
>>> requireNamespace("e1071", quietly = TRUE) is not TRUE
>>>
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
>>> at
>>> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
>>> at scala.collection.Iterator.foreach(Iterator.scala:941)
>>> at scala.collection.Iterator.foreach$(Iterator.scala:941)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
>>> at scala.collection.TraversableOnce.to
>>> <http://scala.collection.traversableonce.to/>(TraversableOnce.scala:315)
>>> at scala.collection.TraversableOnce.to
>>> <http://scala.collection.traversableonce.to/>
>>> $(TraversableOnce.scala:313)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
>>> at
>>> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
>>> at
>>> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
>>> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
>>> at
>>> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
>>> at
>>> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>>> at org.apache.spark.scheduler.Task.run(Task.scala:127)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
>>> at
>>> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>> ... 1 more
>>> Caused by: java.io.EOFException
>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
>>> ... 27 more
>>> 20/06/24 10:24:14 ERROR RRunner: R Writer thread got an exception
>>> org.apache.spark.TaskKilledException
>>> at
>>> org.apache.spark.TaskContextImpl.killTaskIfInterrupted(TaskContextImpl.scala:156)
>>> at
>>> org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:36)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$WriterThread.run(BaseRRunner.scala:197)
>>> Quitting from lines 470-471 (sparkr-vignettes.Rmd)
>>> Error: processing vignette 'sparkr-vignettes.Rmd' failed with
>>> diagnostics:
>>> org.apache.spark.SparkException: Job aborted due to stage failure: Task
>>> 0 in stage 70.0 failed 1 times, most recent failure: Lost task 0.0 in stage
>>> 70.0 (TID 1148, 192.168.0.38, executor driver):
>>> org.apache.spark.SparkException: R unexpectedly exited.
>>> R worker produced errors: Error in FUN(X[[i]], ...) :
>>> requireNamespace("e1071", quietly = TRUE) is not TRUE
>>>
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
>>> at
>>> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
>>> at scala.collection.Iterator.foreach(Iterator.scala:941)
>>> at scala.collection.Iterator.foreach$(Iterator.scala:941)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
>>> at
>>> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
>>> at
>>> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
>>> at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
>>> at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
>>> at
>>> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
>>> at
>>> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
>>> at
>>> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
>>> at
>>> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
>>> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
>>> at
>>> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
>>> at
>>> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
>>> at org.apache.spark.scheduler.Task.run(Task.scala:127)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
>>> at
>>> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
>>> at
>>> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>> at
>>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>> at java.lang.Thread.run(Thread.java:748)
>>> Caused by: java.io.EOFException
>>> at java.io.DataInputStream.readInt(DataInputStream.java:392)
>>> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
>>> ... 27 more
>>>
>>> Driver stacktrace:
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2117)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2066)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2065)
>>> at
>>> scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
>>> at
>>> scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
>>> at
>>> scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2065)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1021)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1021)
>>> at scala.Option.foreach(Option.scala:407)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1021)
>>> at
>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2297)
>>> at
>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2246)
>>> at
>>> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2235)
>>> at
>>> org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
>>> at
>>> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:823)
>>> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2108)
>>> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2129)
>>> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2148)
>>> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2173)
>>> at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
>>> at
>>> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>>> at
>>> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>>> at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
>>> at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
>>> at
>>> org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:361)
>>> at
>>> org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:360)
>>> at
>>> org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>> at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:498)
>>> at
>>> org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:164)
>>> at
>>> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:105)
>>> at
>>> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:39)
>>> at
>>> io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>>> at
>>> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>>> at
>>> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>>> at
>>> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:321)
>>> at
>>> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:295)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>>> at
>>> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>>> at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>>> at
>>> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
>>> at
>>> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
>>> at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
>>> at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
>>> at
>>> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
>>> at io.netty.channel.nio.NioEventLoop.run(NioEventLo
>>> --- failed re-building ‘sparkr-vignettes.Rmd’
>>>
>>> SUMMARY: processing the following file failed:
>>> ‘sparkr-vignettes.Rmd’
>>>
>>> Error: Vignette re-building failed.
>>> Execution halted
>>> 20/06/24 10:24:15 WARN TaskSetManager: Lost task 1.0 in stage 70.0 (TID
>>> 1149, 192.168.0.38, executor driver): TaskKilled (Stage cancelled)
>>>
>>> <http://www.backbutton.co.uk/>
>>>
>>>
Re: Error: Vignette re-building failed. Execution halted
Posted by Anwar AliKhan <an...@gmail.com>.
THANKS !
It appears that was the last missing dependency for the build:
sudo apt-get install -y r-cran-e1071
Shout out to ZOOM
https://zoomadmin.com/HowToInstall/UbuntuPackage/r-cran-e1071 again,
like they say, "It's Super Easy!"
The knitr package was the previous missing dependency, which I was able to
work out from the build error message:
sudo apt install r-cran-knitr
At first glance 'e1071' doesn't look like an R package name or namespace,
but 'e1071' turns out to be a formidable package of machine learning
algorithms.
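For anyone who hits the same wall, a quick way to check up front whether the
vignette build will find what it needs (a minimal sketch, assuming Rscript is
on the PATH; the package list is just the ones this thread tripped over, plus
rmarkdown from the build log):

Rscript -e 'for (p in c("knitr", "rmarkdown", "e1071")) cat(p, requireNamespace(p, quietly = TRUE), "\n")'

Any FALSE in the output means the matching r-cran-* apt package (or an
install.packages() call) is still missing. After that the build ran clean: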
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
** testing if installed package can be loaded from final location
** testing if installed package keeps a record of temporary installation
path
* DONE (SparkR)
/opt/spark/R
+ popd
+ mkdir /opt/spark/dist/conf
+ cp /opt/spark/conf/fairscheduler.xml.template
/opt/spark/conf/log4j.properties.template
/opt/spark/conf/metrics.properties.template /opt/spark/conf/slaves.template
/opt/spark/conf/spark-defaults.conf.template
/opt/spark/conf/spark-env.sh.template /opt/spark/dist/conf
+ cp /opt/spark/README.md /opt/spark/dist
+ cp -r /opt/spark/bin /opt/spark/dist
+ cp -r /opt/spark/python /opt/spark/dist
+ '[' true == true ']'
+ rm -f /opt/spark/dist/python/dist/pyspark-3.1.0.dev0.tar.gz
+ cp -r /opt/spark/sbin /opt/spark/dist
+ '[' -d /opt/spark/R/lib/SparkR ']'
+ mkdir -p /opt/spark/dist/R/lib
+ cp -r /opt/spark/R/lib/SparkR /opt/spark/dist/R/lib
+ cp /opt/spark/R/lib/sparkr.zip /opt/spark/dist/R/lib
+ '[' true == true ']'
+ TARDIR_NAME=spark-3.1.0-SNAPSHOT-bin-custom-spark
+ TARDIR=/opt/spark/spark-3.1.0-SNAPSHOT-bin-custom-spark
+ rm -rf /opt/spark/spark-3.1.0-SNAPSHOT-bin-custom-spark
+ cp -r /opt/spark/dist /opt/spark/spark-3.1.0-SNAPSHOT-bin-custom-spark
+ tar czf spark-3.1.0-SNAPSHOT-bin-custom-spark.tgz -C /opt/spark
spark-3.1.0-SNAPSHOT-bin-custom-spark
+ rm -rf /opt/spark/spark-3.1.0-SNAPSHOT-bin-custom-spark
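A minimal sketch of smoke-testing the result (assuming make-distribution.sh
was run from /opt/spark, so the tarball lands there; /tmp is just an example
target):

tar xzf /opt/spark/spark-3.1.0-SNAPSHOT-bin-custom-spark.tgz -C /tmp
/tmp/spark-3.1.0-SNAPSHOT-bin-custom-spark/bin/spark-submit --version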
<http://www.backbutton.co.uk/>
On Wed, 24 Jun 2020, 11:07 Hyukjin Kwon, <gu...@gmail.com> wrote:
> Looks like you haven't installed the 'e1071' package.
>
Re: Error: Vignette re-building failed. Execution halted
Posted by Hyukjin Kwon <gu...@gmail.com>.
Looks like you haven't installed the 'e1071' package.
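A minimal sketch of installing it from R (using the standard CRAN mirror; the
Ubuntu package r-cran-e1071 works as well):

Rscript -e 'install.packages("e1071", repos = "https://cloud.r-project.org")'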
On Wed, Jun 24, 2020 at 6:49 PM, Anwar AliKhan <an...@gmail.com> wrote:
> ./dev/make-distribution.sh --name custom-spark --pip --r --tgz -Psparkr
> -Phive -Phive-thriftserver -Pmesos -Pyarn -Pkubernetes
> <http://www.backbutton.co.uk/>
>
>
> minor error Spark r test failed , I don't use r so it doesn't effect me.
>
> ***installing help indices
> ** building package indices
> ** installing vignettes
> ** testing if installed package can be loaded from temporary location
> ** testing if installed package can be loaded from final location
> ** testing if installed package keeps a record of temporary installation
> path
> * DONE (SparkR)
> ++ cd /opt/spark/R/lib
> ++ jar cfM /opt/spark/R/lib/sparkr.zip SparkR
> ++ popd
> ++ cd /opt/spark/R/..
> ++ pwd
> + SPARK_HOME=/opt/spark
> + . /opt/spark/bin/load-spark-env.sh
> ++ '[' -z /opt/spark ']'
> ++ SPARK_ENV_SH=spark-env.sh
> ++ '[' -z '' ']'
> ++ export SPARK_ENV_LOADED=1
> ++ SPARK_ENV_LOADED=1
> ++ export SPARK_CONF_DIR=/opt/spark/conf
> ++ SPARK_CONF_DIR=/opt/spark/conf
> ++ SPARK_ENV_SH=/opt/spark/conf/spark-env.sh
> ++ [[ -f /opt/spark/conf/spark-env.sh ]]
> ++ set -a
> ++ . /opt/spark/conf/spark-env.sh
> +++ export SPARK_LOCAL_IP=192.168.0.786
> +++ SPARK_LOCAL_IP=192.168.0.786
> ++ set +a
> ++ export SPARK_SCALA_VERSION=2.12
> ++ SPARK_SCALA_VERSION=2.12
> + '[' -f /opt/spark/RELEASE ']'
> + SPARK_JARS_DIR=/opt/spark/assembly/target/scala-2.12/jars
> + '[' -d /opt/spark/assembly/target/scala-2.12/jars ']'
> + SPARK_HOME=/opt/spark
> + /usr/bin/R CMD build /opt/spark/R/pkg
> * checking for file ‘/opt/spark/R/pkg/DESCRIPTION’ ... OK
> * preparing ‘SparkR’:
> * checking DESCRIPTION meta-information ... OK
> * installing the package to build vignettes
> * creating vignettes ... ERROR
> --- re-building ‘sparkr-vignettes.Rmd’ using rmarkdown
>
> Attaching package: 'SparkR'
>
> The following objects are masked from 'package:stats':
>
> cov, filter, lag, na.omit, predict, sd, var, window
>
> The following objects are masked from 'package:base':
>
> as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
> rank, rbind, sample, startsWith, subset, summary, transform, union
>
> Picked up _JAVA_OPTIONS: -XX:-UsePerfData
> Picked up _JAVA_OPTIONS: -XX:-UsePerfData
> 20/06/24 10:23:54 WARN NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use
> setLogLevel(newLevel).
>
> [Stage 0:> (0 +
> 1) / 1]
>
>
>
>
> [Stage 9:=================================================> (88 + 1)
> / 100]
>
>
>
>
> [Stage 13:=======================================> (147 + 1)
> / 200]
>
>
>
> 20/06/24 10:24:04 WARN Instrumentation: [79237008] regParam is zero, which
> might cause numerical instability and overfitting.
> 20/06/24 10:24:04 WARN BLAS: Failed to load implementation from:
> com.github.fommil.netlib.NativeSystemBLAS
> 20/06/24 10:24:04 WARN BLAS: Failed to load implementation from:
> com.github.fommil.netlib.NativeRefBLAS
> 20/06/24 10:24:04 WARN LAPACK: Failed to load implementation from:
> com.github.fommil.netlib.NativeSystemLAPACK
> 20/06/24 10:24:04 WARN LAPACK: Failed to load implementation from:
> com.github.fommil.netlib.NativeRefLAPACK
> 20/06/24 10:24:09 WARN package: Truncated the string representation of a
> plan since it was too large. This behavior can be adjusted by setting
> 'spark.sql.debug.maxToStringFields'.
>
> [Stage 67:============> (45 + 1)
> / 200]
>
> [Stage 67:=================> (62 + 1)
> / 200]
>
> [Stage 67:======================> (80 + 1)
> / 200]
>
> [Stage 67:==========================> (98 + 1)
> / 200]
>
> [Stage 67:==============================> (114 + 1)
> / 200]
>
> [Stage 67:===================================> (132 + 1)
> / 200]
>
> [Stage 67:=======================================> (148 + 1)
> / 200]
>
> [Stage 67:============================================> (166 + 1)
> / 200]
>
> [Stage 67:=================================================> (184 + 1)
> / 200]
>
>
>
>
> [Stage 69:============> (44 + 1)
> / 200]
>
> [Stage 69:================> (61 + 1)
> / 200]
>
> [Stage 69:=====================> (79 + 1)
> / 200]
>
> [Stage 69:==========================> (97 + 1)
> / 200]
>
> [Stage 69:===============================> (116 + 1)
> / 200]
>
> [Stage 69:====================================> (134 + 1)
> / 200]
>
> [Stage 69:=========================================> (152 + 1)
> / 200]
>
> [Stage 69:=============================================> (169 + 1)
> / 200]
>
> [Stage 69:==================================================> (187 + 1)
> / 200]
>
>
>
>
> [Stage 70:> (0 +
> 1) / 5]
> 20/06/24 10:24:14 ERROR Executor: Exception in task 0.0 in stage 70.0 (TID
> 1148)
> org.apache.spark.SparkException: R unexpectedly exited.
> R worker produced errors: Error in FUN(X[[i]], ...) :
> requireNamespace("e1071", quietly = TRUE) is not TRUE
>
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
> at
> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
> at scala.collection.Iterator.foreach(Iterator.scala:941)
> at scala.collection.Iterator.foreach$(Iterator.scala:941)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
> at
> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
> at
> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
> at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
> at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
> at
> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
> at
> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
> at
> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
> at
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> at org.apache.spark.scheduler.Task.run(Task.scala:127)
> at
> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
> at
> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
> at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.io.EOFException
> at java.io.DataInputStream.readInt(DataInputStream.java:392)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
> ... 27 more
> 20/06/24 10:24:14 WARN TaskSetManager: Lost task 0.0 in stage 70.0 (TID
> 1148, 192.168.0.38, executor driver): org.apache.spark.SparkException: R
> unexpectedly exited.
> R worker produced errors: Error in FUN(X[[i]], ...) :
> requireNamespace("e1071", quietly = TRUE) is not TRUE
>
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
> at
> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
> at scala.collection.Iterator.foreach(Iterator.scala:941)
> at scala.collection.Iterator.foreach$(Iterator.scala:941)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
> at
> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
> at
> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
> at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
> at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
> at
> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
> at
> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
> at
> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
> at
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> at org.apache.spark.scheduler.Task.run(Task.scala:127)
> at
> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
> at
> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
> at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.io.EOFException
> at java.io.DataInputStream.readInt(DataInputStream.java:392)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
> ... 27 more
>
> 20/06/24 10:24:14 ERROR TaskSetManager: Task 0 in stage 70.0 failed 1
> times; aborting job
> 20/06/24 10:24:14 ERROR RBackendHandler: collect on 358 failed
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:164)
> at
> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:105)
> at
> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:39)
> at
> io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
> at
> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
> at
> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
> at
> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:321)
> at
> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:295)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
> at
> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
> at
> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
> at
> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
> at
> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
> at
> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
> at
> io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
> at
> io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
> at
> io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.spark.SparkException: Job aborted due to stage
> failure: Task 0 in stage 70.0 failed 1 times, most recent failure: Lost
> task 0.0 in stage 70.0 (TID 1148, 192.168.0.38, executor driver):
> org.apache.spark.SparkException: R unexpectedly exited.
> R worker produced errors: Error in FUN(X[[i]], ...) :
> requireNamespace("e1071", quietly = TRUE) is not TRUE
>
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
> at
> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
> at scala.collection.Iterator.foreach(Iterator.scala:941)
> at scala.collection.Iterator.foreach$(Iterator.scala:941)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
> at
> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
> at
> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
> at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
> at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
> at
> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
> at
> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
> at
> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
> at
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> at org.apache.spark.scheduler.Task.run(Task.scala:127)
> at
> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
> at
> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
> at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.io.EOFException
> at java.io.DataInputStream.readInt(DataInputStream.java:392)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
> ... 27 more
>
> Driver stacktrace:
> at
> org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2117)
> at
> org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2066)
> at
> org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2065)
> at
> scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
> at
> scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
> at
> scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
> at
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2065)
> at
> org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1021)
> at
> org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1021)
> at scala.Option.foreach(Option.scala:407)
> at
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1021)
> at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2297)
> at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2246)
> at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2235)
> at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
> at
> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:823)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2108)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2129)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2148)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2173)
> at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
> at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
> at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
> at
> org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:361)
> at
> org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:360)
> at
> org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
> ... 37 more
> Caused by: org.apache.spark.SparkException: R unexpectedly exited.
> R worker produced errors: Error in FUN(X[[i]], ...) :
> requireNamespace("e1071", quietly = TRUE) is not TRUE
>
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
> at
> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
> at scala.collection.Iterator.foreach(Iterator.scala:941)
> at scala.collection.Iterator.foreach$(Iterator.scala:941)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
> at
> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
> at
> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
> at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
> at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
> at
> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
> at
> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
> at
> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
> at
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> at org.apache.spark.scheduler.Task.run(Task.scala:127)
> at
> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
> at
> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
> at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> ... 1 more
> Caused by: java.io.EOFException
> at java.io.DataInputStream.readInt(DataInputStream.java:392)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
> ... 27 more
> 20/06/24 10:24:14 ERROR RRunner: R Writer thread got an exception
> org.apache.spark.TaskKilledException
> at
> org.apache.spark.TaskContextImpl.killTaskIfInterrupted(TaskContextImpl.scala:156)
> at
> org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:36)
> at
> org.apache.spark.api.r.BaseRRunner$WriterThread.run(BaseRRunner.scala:197)
> Quitting from lines 470-471 (sparkr-vignettes.Rmd)
> Error: processing vignette 'sparkr-vignettes.Rmd' failed with diagnostics:
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
> in stage 70.0 failed 1 times, most recent failure: Lost task 0.0 in stage
> 70.0 (TID 1148, 192.168.0.38, executor driver):
> org.apache.spark.SparkException: R unexpectedly exited.
> R worker produced errors: Error in FUN(X[[i]], ...) :
> requireNamespace("e1071", quietly = TRUE) is not TRUE
>
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:144)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator$$anonfun$1.applyOrElse(BaseRRunner.scala:137)
> at
> scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:128)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.hasNext(BaseRRunner.scala:113)
> at scala.collection.Iterator.foreach(Iterator.scala:941)
> at scala.collection.Iterator.foreach$(Iterator.scala:941)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.foreach(BaseRRunner.scala:102)
> at
> scala.collection.generic.Growable.$plus$plus$eq(Growable.scala:62)
> at
> scala.collection.generic.Growable.$plus$plus$eq$(Growable.scala:53)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:105)
> at
> scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:49)
> at scala.collection.TraversableOnce.to(TraversableOnce.scala:315)
> at scala.collection.TraversableOnce.to$(TraversableOnce.scala:313)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.to(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toBuffer(TraversableOnce.scala:307)
> at
> scala.collection.TraversableOnce.toBuffer$(TraversableOnce.scala:307)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toBuffer(BaseRRunner.scala:102)
> at
> scala.collection.TraversableOnce.toArray(TraversableOnce.scala:294)
> at
> scala.collection.TraversableOnce.toArray$(TraversableOnce.scala:288)
> at
> org.apache.spark.api.r.BaseRRunner$ReaderIterator.toArray(BaseRRunner.scala:102)
> at org.apache.spark.rdd.RDD.$anonfun$collect$2(RDD.scala:1030)
> at
> org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2148)
> at
> org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> at org.apache.spark.scheduler.Task.run(Task.scala:127)
> at
> org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:464)
> at
> org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1377)
> at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:467)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.io.EOFException
> at java.io.DataInputStream.readInt(DataInputStream.java:392)
> at org.apache.spark.api.r.RRunner$$anon$1.read(RRunner.scala:98)
> ... 27 more
>
> Driver stacktrace:
> at
> org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2117)
> at
> org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2066)
> at
> org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2065)
> at
> scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
> at
> scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
> at
> scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
> at
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2065)
> at
> org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1021)
> at
> org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1021)
> at scala.Option.foreach(Option.scala:407)
> at
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1021)
> at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2297)
> at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2246)
> at
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2235)
> at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
> at
> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:823)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2108)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2129)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2148)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:2173)
> at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
> at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> at
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
> at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
> at
> org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:361)
> at
> org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:360)
> at
> org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at
> org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:164)
> at
> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:105)
> at
> org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:39)
> at
> io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
> at
> io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
> at
> io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
> at
> io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:321)
> at
> io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:295)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
> at
> io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at
> io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at
> io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
> at
> io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
> at
> io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
> at
> io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
> at
> io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
> at io.netty.channel.nio.NioEventLoop.run(NioEventLo
> --- failed re-building ‘sparkr-vignettes.Rmd’
>
> SUMMARY: processing the following file failed:
> ‘sparkr-vignettes.Rmd’
>
> Error: Vignette re-building failed.
> Execution halted
> 20/06/24 10:24:15 WARN TaskSetManager: Lost task 1.0 in stage 70.0 (TID
> 1149, 192.168.0.38, executor driver): TaskKilled (Stage cancelled)
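For anyone searching the archive for the same failure: the actual root cause is near the top of the trace, where the R worker exits with

  requireNamespace("e1071", quietly = TRUE) is not TRUE

That is, the vignette chunk at lines 470-471 of sparkr-vignettes.Rmd needs the CRAN package e1071 (one of SparkR's suggested packages), and it is not installed in the R library the build uses. Below is a minimal sketch of a fix in R, assuming a reachable CRAN mirror; survival is included only on the assumption that it is another Suggests entry the vignettes may exercise, so drop it if you only want the package named in the error.

# Install the suggested packages the vignette build reports missing.
# The mirror URL and the inclusion of 'survival' are assumptions,
# not something the log above confirms; adjust to taste.
pkgs <- c("e1071", "survival")
missing <- pkgs[!vapply(pkgs, requireNamespace, logical(1), quietly = TRUE)]
if (length(missing) > 0) {
  install.packages(missing, repos = "https://cloud.r-project.org")
}

After that, re-running the step the log shows (/usr/bin/R CMD build /opt/spark/R/pkg) should rebuild sparkr-vignettes.Rmd without the e1071 error, and the full make-distribution.sh invocation should get past this point.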