Posted to common-user@hadoop.apache.org by Karl Anderson <kr...@monkey.org> on 2008/06/19 21:31:08 UTC

unit testing of Jython mappers/reducers

Does anyone have an example of a unit test setup for Jython jobs?
I'm unable to run my methods outside of the context of Hadoop. This
may be a general Jython issue.

Here is my attempt. As mentioned in the comment, I am able to resolve
"self.mapper.map", but I get an AttributeError when I attempt to call
it. Is this a Java polymorphism issue - maybe I'm not passing the
right types, so there is no method definition in the base class with a
matching signature? Or could it be related to the JobConf calls that
declare the input/output types during a normal Hadoop run?


# import style may matter
from org.apache import hadoop
from org.apache.hadoop.examples.kcluster import KMeansMapper

import unittest


class TestFoo(unittest.TestCase):
    def setUp(self):
        self.mapper = KMeansMapper()

    def testbar(self):
        # This resolves the method:
        #   print self.mapper.map
        # but the call below raises:
        #   AttributeError: abstract method "map" not implemented
        self.mapper.map(hadoop.io.LongWritable(0),
                        hadoop.io.Text("10 1 0"),
                        hadoop.mapred.OutputCollector(),
                        hadoop.mapred.Reporter.NULL)

if __name__ == "__main__":
    unittest.main()
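Not an answer, but a possible workaround sketch: since OutputCollector is a
Java interface, one option is to avoid instantiating it at all and instead
pass a small stand-in object that records what the mapper emits, then assert
on the recorded pairs. The names ListCollector and StubMapper below are
illustrative (StubMapper stands in for KMeansMapper); this runs under plain
CPython as well as Jython.

```python
class ListCollector(object):
    """Stand-in for Hadoop's OutputCollector: records (key, value) pairs."""
    def __init__(self):
        self.pairs = []

    def collect(self, key, value):
        self.pairs.append((key, value))


class StubMapper(object):
    """Illustrative mapper (not the real KMeansMapper): emits (token, 1)
    for each whitespace-separated token in the input value."""
    def map(self, key, value, output, reporter):
        for token in value.split():
            output.collect(token, 1)


collector = ListCollector()
StubMapper().map(0, "10 1 0", collector, None)
print(collector.pairs)  # [('10', 1), ('1', 1), ('0', 1)]
```

If the real mapper insists on Writable arguments, the same stand-in collector
can still be used; only the key/value objects passed to map() would need to
be the Hadoop types.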