Posted to users@zeppelin.apache.org by Schmirr Wurst <sc...@gmail.com> on 2015/07/27 12:15:10 UTC
reload spark context
Hi
- Is there a way to reload the Spark context from the GUI?
- I've also realised that sc.hadoopConfiguration.set() only works
once; after the first time, even if I modify the params with the same
function again, they don't seem to change...
- By the way, is there a way to print the parameters I've set? I
tried sc.hadoopConfiguration.get("myparam"), but it doesn't work...
Best,
Sw
Re: reload spark context
Posted by Alexander Bezzubov <ab...@nflabs.com>.
Hi,
I'm not sure if that is exactly 'reload spark context', but if you go
to 'Interpreters' and hit the 'restart' button for the spark interpreter,
that will restart Spark and create a new context.
The other two questions look like Spark-specific behaviour, so it might be
better to ask the Spark community directly.
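That said, here is a minimal sketch (assuming a live SparkContext named `sc`, as in Zeppelin's spark interpreter, and a hypothetical key "myparam") of how reading the configuration back usually looks:

```scala
import scala.collection.JavaConverters._

// Set a custom Hadoop parameter on the context's configuration:
sc.hadoopConfiguration.set("myparam", "myvalue")

// Read it back -- note get() returns null (not an Option) for unset keys,
// so a null here means the key was never set under that exact name:
val value = sc.hadoopConfiguration.get("myparam")
println(s"myparam = $value")

// Print every entry currently in the Hadoop configuration; Configuration
// is Iterable over java.util.Map.Entry[String, String]:
sc.hadoopConfiguration.iterator().asScala.foreach { e =>
  println(s"${e.getKey} = ${e.getValue}")
}
```

One guess about the "only works once" symptom: if the parameter is a filesystem setting (fs.* keys), Hadoop caches FileSystem instances after first use, which can make later set() calls appear to have no effect. That is speculation on my part, though, and worth confirming with the Spark community.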
Hope this helps.
--
BR,
Alexander
On Mon, Jul 27, 2015 at 7:15 PM, Schmirr Wurst <sc...@gmail.com> wrote:
> Hi
>
> - Is there a way to reload the Spark context from the GUI?
> - I've also realised that sc.hadoopConfiguration.set() only works
> once; after the first time, even if I modify the params with the same
> function again, they don't seem to change...
> - By the way, is there a way to print the parameters I've set? I
> tried sc.hadoopConfiguration.get("myparam"), but it doesn't work...
>
> Best,
> Sw
--
Kind regards,
Alexander.