Posted to dev@zeppelin.apache.org by Andrew Musselman <an...@gmail.com> on 2022/01/04 19:33:56 UTC

spark.r.backendConnectionTimeout

Is there a reliable way to change spark.r.backendConnectionTimeout in
Zeppelin 0.10.0? I've tried setting it to infinity or to 6000000 in the Spark
interpreter config, but after restarting the interpreter I still get this
message about a day later when I run an R cell:
`sparkR backend is dead, please try to increase
spark.r.backendConnectionTimeout`
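
For context, a rough sketch of what I mean by "setting it in the config" (the
interpreter-settings property is what I actually changed; the sparkR.session()
call below is only to illustrate how the same property is normally passed when
a SparkR session is created by hand, since inside Zeppelin the interpreter
creates the session for me, and the app name/value here are just placeholders):

# Property added in the Spark interpreter settings (Interpreter page -> spark -> edit):
#   spark.r.backendConnectionTimeout = 6000000
#
# Outside Zeppelin, the equivalent when starting SparkR yourself would be:
library(SparkR)
sparkR.session(
  appName = "timeout-test",                        # placeholder app name
  sparkConfig = list(
    spark.r.backendConnectionTimeout = "6000000"   # value I tried; Spark docs describe this as seconds
  )
)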

R version:

$ R --version
R version 3.4.4 (2018-03-15) -- "Someone to Lean On"
Copyright (C) 2018 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)


Log lines preceding the WARN:

zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:00,585] ({SchedulerFactory5}
VFSNotebookRepo.java[save]:144) - Saving note 2GPAQ28TJ to R
Test_2GPAQ28TJ.zpln
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:00,587] ({SchedulerFactory5}
AbstractScheduler.java[runJob]:154) - Job
paragraph_1639695005043_382078932 finished by scheduler
RemoteInterpreter-r-shared_process-shared_session with status ERROR
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,641] ({Thread-13}
ZeppelinServer.java[lambda$shutdown$0]:316) - Shutting down Zeppelin
Server ...
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,647] ({qtp1219402581-14}
NotebookServer.java[onClose]:474) - Closed connection to
127.0.0.1:36372 (1006) Disconnected
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,648] ({Thread-13}
AbstractConnector.java[doStop]:381) - Stopped
ServerConnector@3270d194{HTTP/1.1, (http/1.1)}{127.0.0.1:8080}
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,648] ({Thread-13}
HouseKeeper.java[stopScavenging]:158) - node0 Stopped scavenging
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,665] ({Thread-13}
ContextHandler.java[doStop]:1099) - Stopped
o.e.j.w.WebAppContext@7ae42ce3{zeppelin-web-angular,/next,null,UNAVAILABLE}{/home/ubuntu/zeppelin-0.10.0-bin-all/zeppelin-web-angular-0.10.0.war}
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,674] ({Thread-13}
ContextHandler.java[doStop]:1099) - Stopped
o.e.j.w.WebAppContext@9ebe38b{zeppelin-web,/,null,UNAVAILABLE}{/home/ubuntu/zeppelin-0.10.0-bin-all/zeppelin-web-0.10.0.war}
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,678] ({ignite-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: ignite
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,678] ({ignite-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: ignite
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,683] ({geode-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: geode
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,684] ({jdbc-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: jdbc
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,686] ({beam-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: beam
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,686] ({beam-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: beam
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,686] ({influxdb-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting:
influxdb
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,686] ({lens-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: lens
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,690] ({pig-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: pig
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,690] ({geode-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: geode
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,690] ({jdbc-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: jdbc
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log- INFO
[2022-01-04 17:56:05,690] ({file-close}
InterpreterSetting.java[close]:535) - Close InterpreterSetting: file
--
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log.2021-12-21-
INFO [2021-12-21 01:21:18,597] ({qtp1219402581-14}
NotebookService.java[runParagraph]:346) - Start to run paragraph:
paragraph_1639695005043_382078932 of note: 2GPAQ28TJ
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log.2021-12-21-
INFO [2021-12-21 01:21:18,598] ({qtp1219402581-14}
VFSNotebookRepo.java[save]:144) - Saving note 2GPAQ28TJ to R
Test_2GPAQ28TJ.zpln
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log.2021-12-21-
INFO [2021-12-21 01:21:18,600] ({SchedulerFactory4}
AbstractScheduler.java[runJob]:127) - Job
paragraph_1639695005043_382078932 started by scheduler
RemoteInterpreter-r-shared_process-shared_session
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log.2021-12-21-
INFO [2021-12-21 01:21:18,601] ({SchedulerFactory4}
Paragraph.java[jobRun]:416) - Run paragraph [paragraph_id:
paragraph_1639695005043_382078932, interpreter:
org.apache.zeppelin.r.RInterpreter, note_id: 2GPAQ28TJ, user:
anonymous]
zeppelin-0.10.0-bin-all/logs/zeppelin-ubuntu-ip-172-31-85-96.log.2021-12-21:
WARN [2021-12-21 01:21:18,608] ({SchedulerFactory4}
NotebookServer.java[onStatusChange]:1986) - Job
paragraph_1639695005043_382078932 is finished, status: ERROR,
exception: null, result: %text sparkR backend is dead, please try to
increase spark.r.backendConnectionTimeout