Posted to users@zeppelin.apache.org by Sourav Mazumder <so...@gmail.com> on 2015/09/17 15:18:08 UTC
Some Qs on the life cycle of a Spark interpreter
Hi,
In my understanding, when I run any paragraph in any notebook using the Spark
interpreter, it internally starts a Spark Context/Spark Driver, and that
keeps running.
Here are my questions on that:
1. Is the Zeppelin interpreter process the same as this Spark Context/Spark
Driver?
2. Is there a way I can stop such a running Spark Context/Spark Driver
gracefully without restarting the Zeppelin server?
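(For example, assuming the Spark interpreter pre-binds the SparkContext as the variable `sc` in a notebook paragraph, would something like the sketch below be a graceful way to do it, or does Zeppelin need the context to stay alive?)

```scala
// Sketch, not verified: run inside a %spark paragraph, where `sc` is the
// SparkContext the Spark interpreter is assumed to provide.
sc.stop()
```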
3. Many times I get an error saying 'Connection refused at
RemoteInterpreter'. I believe what is happening in this case is that the
Spark Context/Spark Driver is getting killed, so the Zeppelin process is
unable to communicate with it. Is this understanding right?
4. Many times I see an error that the Scheduler process is not running/got
killed. The only option in that case is to restart the Zeppelin server. Does
this Scheduler process spawn the interpreter process? How do we configure
the memory etc. for it?
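(Is it something along these lines in conf/zeppelin-env.sh? The variable names below are my guess from the docs, and some of them may not exist in every Zeppelin version.)

```shell
# conf/zeppelin-env.sh -- sketch only; variable names are assumptions
export ZEPPELIN_MEM="-Xmx2g"            # JVM memory options for the Zeppelin server
export ZEPPELIN_INTP_MEM="-Xmx4g"       # JVM memory options for interpreter processes
                                        # (may only exist in newer releases)
export SPARK_SUBMIT_OPTIONS="--driver-memory 4g --executor-memory 8g"
```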
Any pointers on these questions would be helpful.
Regards,
Sourav