Posted to users@zeppelin.apache.org by Ashish Dalal <da...@gmail.com> on 2015/08/12 13:17:17 UTC

Re: Adding R interpreter to Zeppelin

Hi All,

I am attaching the full procedure I adopted to run R from Apache Zeppelin
using Datalayer's zeppelin-R repo on GitHub, but I am hitting a voidEval
function error. I have been stuck on this problem for a full week now.
Any help would be appreciated.

Please see the attached Steps.txt for the exact steps I followed.

Here is the code I wrote:

%r
print(1:5)

and the paragraph output is just:

voidEval failed
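Since the log below opens with "Cannot connect: Connection refused", it may be worth confirming that Rserve is actually accepting connections before Zeppelin tries to talk to it. A minimal sketch (assuming Rserve's default port 6311; adjust host/port if your setup configures them differently):

```python
import socket

def rserve_listening(host="127.0.0.1", port=6311, timeout=2.0):
    """Return True if something accepts TCP connections on the given port.

    6311 is Rserve's default port. This only checks reachability, not
    that the listener actually speaks the Rserve protocol.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("Rserve reachable:", rserve_listening())
```

If this prints False, start Rserve first (e.g. `R CMD Rserve`, or `Rserve()` from an R session) and re-run the Zeppelin paragraph.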



Here is the error log:

ERROR StatusLogger No log4j2 configuration file found. Using default
configuration: logging only errors to the console.

StartRserve: first connect try failed with: Cannot connect: Connection
refused

StartRserve: waiting for Rserve to start ... (java.lang.UNIXProcess@1c9c5521
)

StartRserve: Rserve>

StartRserve: Rserve>Attaching package: ‘SparkR’

StartRserve: Rserve>

StartRserve: Rserve>The following objects are masked from ‘package:base’:

StartRserve: Rserve>

StartRserve: Rserve>    intersect, rbind, sample, summary, table

StartRserve: Rserve>

StartRserve: Rserve>Launching java with spark-submit command
/Users/ashish.dalal/downloads/spark/bin/spark-submit   sparkr-shell
/var/folders/62/5997pbmn0qjg8766pv4mjxwrr8n54y/T//RtmpDWgRDn/backend_portd74679a20ac

StartRserve: Rserve>SLF4J: Class path contains multiple SLF4J bindings.

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/interpreter/R/log4j-slf4j-impl-2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/interpreter/R/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/zeppelin-interpreter/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/zeppelin-server/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/incubator-zeppelin/zeppelin-zengine/target/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: Found binding in
[jar:file:/Users/ashish.dalal/Downloads/spark/assembly/target/scala-2.10/spark-assembly-1.5.0-SNAPSHOT-hadoop2.2.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

StartRserve: Rserve>SLF4J: See
http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

StartRserve: Rserve>SLF4J: Actual binding is of type
[org.apache.logging.slf4j.Log4jLoggerFactory]

StartRserve: Rserve>ERROR StatusLogger No log4j2 configuration file found.
Using default configuration: logging only errors to the console.

StartRserve: Rserve>log4j:ERROR setFile(null,true) call failed.

StartRserve: Rserve>java.io.FileNotFoundException:  (No such file or
directory)

StartRserve: Rserve>    at java.io.FileOutputStream.open(Native Method)

StartRserve: Rserve>    at
java.io.FileOutputStream.<init>(FileOutputStream.java:221)

StartRserve: Rserve>    at
java.io.FileOutputStream.<init>(FileOutputStream.java:142)

StartRserve: Rserve>    at
org.apache.log4j.FileAppender.setFile(FileAppender.java:294)

StartRserve: Rserve>    at
org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)

StartRserve: Rserve>    at
org.apache.log4j.DailyRollingFileAppender.activateOptions(DailyRollingFileAppender.java:223)

StartRserve: Rserve>    at
org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)

StartRserve: Rserve>    at
org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)

StartRserve: Rserve>    at
org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:809)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:615)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:502)

StartRserve: Rserve>    at
org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:547)

StartRserve: Rserve>    at
org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:483)

StartRserve: Rserve>    at
org.apache.log4j.LogManager.<clinit>(LogManager.java:127)

StartRserve: Rserve>    at
org.apache.log4j.Logger.getLogger(Logger.java:104)

StartRserve: Rserve>    at
org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:75)

StartRserve: Rserve>    at
org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)

StartRserve: Rserve>    at
org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:52)

StartRserve: Rserve>    at
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2034)

StartRserve: Rserve>    at
scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)

StartRserve: Rserve>    at
org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2025)

StartRserve: Rserve>    at
org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:55)

StartRserve: Rserve>    at
org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:253)

StartRserve: Rserve>    at
org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:53)

StartRserve: Rserve>    at
org.apache.spark.SparkEnv$.create(SparkEnv.scala:252)

StartRserve: Rserve>    at
org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)

StartRserve: Rserve>    at
org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)

StartRserve: Rserve>    at
org.apache.spark.SparkContext.<init>(SparkContext.scala:432)

StartRserve: Rserve>    at
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)

StartRserve: Rserve>    at
org.apache.spark.api.r.RRDD$.createSparkContext(RRDD.scala:375)

StartRserve: Rserve>    at
org.apache.spark.api.r.RRDD.createSparkContext(RRDD.scala)

StartRserve: Rserve>    at
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

StartRserve: Rserve>    at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

StartRserve: Rserve>    at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

StartRserve: Rserve>    at java.lang.reflect.Method.invoke(Method.java:606)

StartRserve: Rserve>    at
org.apache.spark.api.r.RBackendHandler.handleMethodCall(RBackendHandler.scala:132)

StartRserve: Rserve>    at
org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:79)

StartRserve: Rserve>    at
org.apache.spark.api.r.RBackendHandler.channelRead0(RBackendHandler.scala:38)

StartRserve: Rserve>    at
io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)

StartRserve: Rserve>    at
io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)

StartRserve: Rserve>    at
io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)

StartRserve: Rserve>    at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)

StartRserve: Rserve>    at
io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)



14:30:34.949 [pool-3-thread-2] ERROR io.datalayer.zeppelin.R.RInterpreter -
Exception while connecting to Rserve

org.rosuda.REngine.Rserve.RserveException: voidEval failed

        at
org.rosuda.REngine.Rserve.RConnection.voidEval(RConnection.java:209)
~[Rserve-1.8.2-SNAPSHOT.jar:?]

        at
io.datalayer.zeppelin.R.RInterpreter.interpret(RInterpreter.java:117)
[zeppelin-R-1.0.0-SNAPSHOT.jar:?]

        at
io.datalayer.zeppelin.R.RInterpreter.interpret(RInterpreter.java:98)
[zeppelin-R-1.0.0-SNAPSHOT.jar:?]

        at
org.apache.zeppelin.interpreter.ClassloaderInterpreter.interpret(ClassloaderInterpreter.java:57)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:93)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:276)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at org.apache.zeppelin.scheduler.Job.run(Job.java:170)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:118)
[zeppelin-interpreter-0.6.0-incubating-SNAPSHOT.jar:0.6.0-incubating-SNAPSHOT]

        at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
[?:1.7.0_79]

        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
[?:1.7.0_79]

        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
[?:1.7.0_79]

        at
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
[?:1.7.0_79]

        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[?:1.7.0_79]

        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[?:1.7.0_79]

        at java.lang.Thread.run(Thread.java:745) [?:1.7.0_79]



14:30:34.995 [pool-3-thread-2] ERROR io.datalayer.zeppelin.R.RInterpreter -
Exception while connecting to Rserve

org.rosuda.REngine.Rserve.RserveException: voidEval failed
        (stack trace essentially identical to the one above)

On Thu, Jul 30, 2015 at 9:25 PM, Ashish Dalal <da...@gmail.com>
wrote:

> I can see that the R interpreter is registered in the Interpreter Group,
> but when I write R code and evaluate it, it throws an error saying
> "voidEval failed".
>
> Thanks
>
> On Thu, Jul 30, 2015 at 8:00 PM, <fe...@hotmail.com> wrote:
>
>> There is another effort (not GPLv3-licensed) that I am part of, for a
>> SparkR interpreter.
>>
>> What is the issue you are seeing?
>>
>> From: Ashish Dalal
>> Sent: Wednesday, July 29, 2:02 PM
>> Subject: Adding R interpreter to Zeppelin
>> To: users@zeppelin.incubator.apache.org
>>
>> Hi all,
>>
>> I am working on adding R interpreter to Zeppelin.
>>
>> There is a GitHub repo by datalayer.io (
>> https://github.com/datalayer/zeppelin-R); I have added its
>> RInterpreter.java and RInterpreterTest.java and modified pom.xml, but
>> I am still not able to run R code in Apache Zeppelin.
>>
>> I can see the R interpreter registered in the list of interpreters
>> when I start Zeppelin, but the R code I write is never evaluated.
>>
>> Please help if you can.
>>
>> Thanks!
>>
>> Regards,
>>
>> Ashish Dalal
>>
>>
>