Posted to commits@hbase.apache.org by mb...@apache.org on 2012/09/25 14:52:06 UTC

svn commit: r1389842 - /hbase/branches/0.89-fb/src/main/java/org/apache/hadoop/hbase/mapreduce/loadtest/LoadTest.java

Author: mbautin
Date: Tue Sep 25 12:52:06 2012
New Revision: 1389842

URL: http://svn.apache.org/viewvc?rev=1389842&view=rev
Log:
[master] Reduce number of retries for load tester to 1 million.

Author: pritam

Summary:
Earlier, in D519615, I set the maximum reduce task attempts for the
load tester's reducers to Integer.MAX_VALUE. That caused a problem:
all reducers were stuck in the pending state. I asked around in the
MR team but we couldn't pin down the exact reason; the working theory
is an integer overflow somewhere in the attempt bookkeeping. A run
with the limit set to 10^6 (1,000,000) worked, so I'm reducing the
constant accordingly.
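
As a side note (not part of the patch): a minimal, hypothetical Java
sketch of how an Integer.MAX_VALUE retry cap could wrap around and
confuse attempt bookkeeping. It only illustrates the overflow theory
above and does not reflect any specific MR scheduler code.

    // Illustration only: adding anything to the cap wraps negative,
    // which could leave tasks looking permanently pending.
    public class AttemptOverflowDemo {
      public static void main(String[] args) {
        int maxAttempts = Integer.MAX_VALUE;
        int nextAttempt = maxAttempts + 1;   // wraps to Integer.MIN_VALUE
        System.out.println(nextAttempt);     // prints -2147483648
        System.out.println(nextAttempt <= maxAttempts); // true, yet bogus
      }
    }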

Test Plan: Run it with the new number.

Reviewers: kranganathan, mbautin, liyintang

Reviewed By: liyintang

CC: hbase-eng@

Differential Revision: https://phabricator.fb.com/D582617

Modified:
    hbase/branches/0.89-fb/src/main/java/org/apache/hadoop/hbase/mapreduce/loadtest/LoadTest.java

Modified: hbase/branches/0.89-fb/src/main/java/org/apache/hadoop/hbase/mapreduce/loadtest/LoadTest.java
URL: http://svn.apache.org/viewvc/hbase/branches/0.89-fb/src/main/java/org/apache/hadoop/hbase/mapreduce/loadtest/LoadTest.java?rev=1389842&r1=1389841&r2=1389842&view=diff
==============================================================================
--- hbase/branches/0.89-fb/src/main/java/org/apache/hadoop/hbase/mapreduce/loadtest/LoadTest.java (original)
+++ hbase/branches/0.89-fb/src/main/java/org/apache/hadoop/hbase/mapreduce/loadtest/LoadTest.java Tue Sep 25 12:52:06 2012
@@ -62,7 +62,7 @@ public class LoadTest extends Configured
   // Since all tasks share the same jmx port, some tasks might fail since
   // they might run on the same machine and try to bind to the same jmx port.
   // Alleviating this situation by retrying tasks as long as we can.
-  public static final int MAX_REDUCE_TASK_ATTEMPTS = Integer.MAX_VALUE;
+  public static final int MAX_REDUCE_TASK_ATTEMPTS = 1000000;
 
   public static class Map
       extends Mapper<LongWritable, Text, LongWritable, Text> {
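
For context, a minimal sketch (an assumption, not shown in this diff)
of how the LoadTest constant is presumably fed to the job through the
classic "mapred.reduce.max.attempts" property:

    import org.apache.hadoop.conf.Configuration;

    // Hypothetical helper: cap reduce task retries at the LoadTest limit.
    public class LoadTestRetrySketch {
      public static final int MAX_REDUCE_TASK_ATTEMPTS = 1000000;

      public static Configuration configureRetries(Configuration conf) {
        // jmx-port collisions are transient, so a large (but finite)
        // retry budget is enough; avoid Integer.MAX_VALUE.
        conf.setInt("mapred.reduce.max.attempts", MAX_REDUCE_TASK_ATTEMPTS);
        return conf;
      }
    }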