Posted to common-user@hadoop.apache.org by shangan <sh...@corp.kaixin001.com> on 2010/10/08 09:56:40 UTC

hadoop log files increase without bounds

core-default.xml contains logging properties, but they don't seem to take effect.
When I checked all the nodes, each one had more than 10 files under the logs directory, totaling more than 300 MB per node. Does anybody know how to configure Hadoop so that old logs are removed automatically?

The following is the default logging configuration I found in core-default.xml:


<!--- logging properties -->
<property>
  <name>hadoop.logfile.size</name>
  <value>10000000</value>
  <description>The max size of each log file</description>
</property>
<property>
  <name>hadoop.logfile.count</name>
  <value>10</value>
  <description>The max number of log files</description>
</property>
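
For reference, a minimal sketch of overriding these defaults in conf/core-site.xml (property names are taken from the snippet above; whether the daemons actually honor them is exactly the open question here):

<!-- Hypothetical override in conf/core-site.xml: cap each log file
     at ~10 MB and keep at most 5 rotated files per daemon. -->
<property>
  <name>hadoop.logfile.size</name>
  <value>10000000</value>
</property>
<property>
  <name>hadoop.logfile.count</name>
  <value>5</value>
</property>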

2010-10-08 



shangan