Posted to user@spark.apache.org by Robert James <sr...@gmail.com> on 2014/06/24 20:58:07 UTC

Centralized Spark Logging solution

We need a centralized spark logging solution.  Ideally, it should:

* Allow any Spark process to log at multiple levels (info, warn,
debug) using a single line, similar to log4j
* All logs should go to a central location - so, to read the logs, we
don't need to check each worker individually
* Ideally, it should be configurable so that when the code is run
standalone (not on Spark) it uses a different (local) log (this last
point is optional, because we could add it with a wrapper)
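One way to cover the first two points (a sketch only, not an endorsed
setup - the hostname and facility below are placeholders): since Spark
already logs through log4j 1.2, each worker's log4j.properties can add a
SyslogAppender that forwards everything to a central syslog host,
alongside the usual local file/console appenders:

```properties
# Send all logs at INFO and above to a central syslog server
# (loghost.example.com is a placeholder - substitute your own host)
log4j.rootLogger=INFO, syslog

log4j.appender.syslog=org.apache.log4j.net.SyslogAppender
log4j.appender.syslog.SyslogHost=loghost.example.com
log4j.appender.syslog.Facility=LOCAL1
log4j.appender.syslog.layout=org.apache.log4j.PatternLayout
log4j.appender.syslog.layout.ConversionPattern=%d{ISO8601} %p %c - %m%n
```

Application code then logs through the ordinary log4j API (one line per
message, at info/warn/debug level), and the third point - a different
local log when running outside Spark - falls out of shipping a different
log4j.properties in the standalone deployment.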

Can you recommend something? How do you handle logging and debugging
Spark applications? Do you go through the logs on each machine?