Posted to dev@mrunit.apache.org by Dave Beech <da...@gmx.net> on 2012/04/21 10:51:37 UTC

Java 7 and failing unit tests

Guys - just wanted to make you aware that some unit tests are currently 
failing in trunk when building against Java 7. I know this isn't 
strictly a problem, or even wrong since Hadoop itself requires Java 6, 
but it just caught me out for a minute because of my Eclipse setup!

Test failures:

  testJavaSerialization(org.apache.hadoop.mrunit.mapreduce.TestMapReduceDriver):
    java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
  testJavaSerialization(org.apache.hadoop.mrunit.TestMapReduceDriver):
    java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
  testJavaSerialization(org.apache.hadoop.mrunit.TestPipelineMapReduceDriver):
    java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable

Maybe we should add a note in BUILD.txt to say Java 6 *must* be used? It 
sort of does at the moment but I think it could be more explicit.

Dave


Re: Java 7 and failing unit tests

Posted by Brock Noland <br...@cloudera.com>.
Excellent analysis! Thanks Dave and Jim!

On Sat, Apr 21, 2012 at 11:26 AM, Jim Donofrio <do...@gmail.com> wrote:
> java.lang.ClassCastException: java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
>     at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:119)
>     at java.util.TreeMap.compare(TreeMap.java:1188)
>     at java.util.TreeMap.put(TreeMap.java:531)
>     at org.apache.hadoop.mrunit.MapReduceDriverBase.shuffle(MapReduceDriverBase.java:172)
>     at org.apache.hadoop.mrunit.MapReduceDriver.run(MapReduceDriver.java:328)
>     at org.apache.hadoop.mrunit.MapReduceDriverBase.runTest(MapReduceDriverBase.java:137)
>     at org.apache.hadoop.mrunit.TestDriver.runTest(TestDriver.java:158)
>     at org.apache.hadoop.mrunit.TestMapReduceDriver.testJavaSerialization(TestMapReduceDriver.java:409)
>
>  public int compare(Object a, Object b) {
>    return compare((WritableComparable)a, (WritableComparable)b);
>  }
>
> Java 7:
>        Entry<K,V> t = root;
>        if (t == null) {
>            compare(key, key); // type (and possibly null) check
>
>            root = new Entry<>(key, value, null);
>            size = 1;
>            modCount++;
>            return null;
>        }
>
> Java 6:
>        Entry<K,V> t = root;
>        if (t == null) {
>        // TBD:
>        // 5045147: (coll) Adding null to an empty TreeSet should
>        // throw NullPointerException
>        //
>        // compare(key, key); // type check
>            root = new Entry<K,V>(key, value, null);
>            size = 1;
>            modCount++;
>            return null;
>        }
>
> Oops, this is a bug in the test case, not the actual code. Good catch. This
> problem is not unique to Java 7; it only shows up on Java 7 because my Java
> serialization tests use only one input value. Java 6's TreeMap.put does not
> call compare when adding to an empty map, while Java 7's does, so with a
> single entry only Java 7 ever reaches the failing cast.
>
> This can be fixed in the test case by setting an OutputValueGroupingComparator
> in the conf for Integer. Users who use Java or other non-Writable
> serializations would know they have to set the relevant comparators.
>
> I don't think we need to halt the current release, because the test will only
> fail under Java 7, which most users will not use with Hadoop. I will create a
> JIRA and fix those test cases, though.
>
>
>
> On 04/21/2012 09:46 AM, Brock Noland wrote:
>>
>> That is interesting...What line numbers are the errors being thrown?
>>
>> On Sat, Apr 21, 2012 at 3:51 AM, Dave Beech<da...@gmx.net>  wrote:
>>>
>>> Guys - just wanted to make you aware that some unit tests are currently
>>> failing in trunk when building against Java 7. I know this isn't strictly
>>> a
>>> problem, or even wrong since Hadoop itself requires Java 6, but it just
>>> caught me out for a minute because of my Eclipse setup!
>>>
>>> Test failures:
>>>
>>>  testJavaSerialization(org.apache.hadoop.mrunit.mapreduce.TestMapReduceDriver):
>>> java.lang.Integer cannot be cast to
>>> org.apache.hadoop.io.WritableComparable
>>>  testJavaSerialization(org.apache.hadoop.mrunit.TestMapReduceDriver):
>>> java.lang.Integer cannot be cast to
>>> org.apache.hadoop.io.WritableComparable
>>>
>>>  testJavaSerialization(org.apache.hadoop.mrunit.TestPipelineMapReduceDriver):
>>> java.lang.Integer cannot be cast to
>>> org.apache.hadoop.io.WritableComparable
>>>
>>> Maybe we should add a note in BUILD.txt to say Java 6 *must* be used? It
>>> sort of does at the moment but I think it could be more explicit.
>>>
>>> Dave
>>>
>>
>>
>



-- 
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/

Re: Java 7 and failing unit tests

Posted by Jim Donofrio <do...@gmail.com>.
java.lang.ClassCastException: java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
    at org.apache.hadoop.io.WritableComparator.compare(WritableComparator.java:119)
    at java.util.TreeMap.compare(TreeMap.java:1188)
    at java.util.TreeMap.put(TreeMap.java:531)
    at org.apache.hadoop.mrunit.MapReduceDriverBase.shuffle(MapReduceDriverBase.java:172)
    at org.apache.hadoop.mrunit.MapReduceDriver.run(MapReduceDriver.java:328)
    at org.apache.hadoop.mrunit.MapReduceDriverBase.runTest(MapReduceDriverBase.java:137)
    at org.apache.hadoop.mrunit.TestDriver.runTest(TestDriver.java:158)
    at org.apache.hadoop.mrunit.TestMapReduceDriver.testJavaSerialization(TestMapReduceDriver.java:409)

  // org.apache.hadoop.io.WritableComparator: the cast at WritableComparator.java:119 in the trace above
  public int compare(Object a, Object b) {
    return compare((WritableComparable)a, (WritableComparable)b);
  }

Java 7 (TreeMap.put):
         Entry<K,V> t = root;
         if (t == null) {
             compare(key, key); // type (and possibly null) check

             root = new Entry<>(key, value, null);
             size = 1;
             modCount++;
             return null;
         }

Java 6 (TreeMap.put):
         Entry<K,V> t = root;
         if (t == null) {
         // TBD:
         // 5045147: (coll) Adding null to an empty TreeSet should
         // throw NullPointerException
         //
         // compare(key, key); // type check
             root = new Entry<K,V>(key, value, null);
             size = 1;
             modCount++;
             return null;
         }

Oops, this is a bug in the test case, not the actual code. Good catch. This 
problem is not unique to Java 7; it only shows up on Java 7 because my Java 
serialization tests use only one input value. Java 6's TreeMap.put does not 
call compare when adding to an empty map, while Java 7's does, so with a 
single entry only Java 7 ever reaches the failing cast.
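
A minimal, self-contained illustration of that difference (hypothetical demo 
code, not from MRUnit; the String-only comparator stands in for 
WritableComparator's unchecked cast):

    import java.util.Comparator;
    import java.util.TreeMap;

    public class TreeMapFirstPutDemo {
        public static void main(String[] args) {
            // A comparator that blindly casts its arguments, just as
            // WritableComparator.compare(Object, Object) casts to WritableComparable.
            Comparator<Object> stringsOnly = new Comparator<Object>() {
                public int compare(Object a, Object b) {
                    return ((String) a).compareTo((String) b);
                }
            };
            TreeMap<Object, String> map = new TreeMap<Object, String>(stringsOnly);
            // Java 6: succeeds, because put() never calls compare() on an empty map.
            // Java 7: throws ClassCastException, because put() now calls
            // compare(key, key) even for the very first entry.
            map.put(Integer.valueOf(1), "only value");
            System.out.println("first put succeeded (Java 6 behaviour)");
        }
    }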

This can be fixed in the test case by setting an OutputValueGroupingComparator 
in the conf for Integer. Users who use Java or other non-Writable 
serializations would know they have to set the relevant comparators.
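
For illustration, a sketch of one way the test setup could do that (the helper 
class and method names are made up; JobConf.setOutputValueGroupingComparator 
and JavaSerializationComparator are real Hadoop APIs, but this is not 
necessarily the fix that will land in the JIRA):

    import org.apache.hadoop.io.serializer.JavaSerializationComparator;
    import org.apache.hadoop.mapred.JobConf;

    public class JavaSerializationTestConf {
        // Builds a conf whose shuffle can sort/group non-Writable Integer keys.
        public static JobConf build() {
            JobConf conf = new JobConf();
            // Enable Java serialization alongside the Writable default.
            conf.setStrings("io.serializations",
                "org.apache.hadoop.io.serializer.JavaSerialization",
                "org.apache.hadoop.io.serializer.WritableSerialization");
            // Group values with a comparator that deserializes the key and calls
            // compareTo, instead of WritableComparator's WritableComparable cast.
            conf.setOutputValueGroupingComparator(JavaSerializationComparator.class);
            return conf;
        }
    }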

I don't think we need to halt the current release, because the test will 
only fail under Java 7, which most users will not use with Hadoop. I will 
create a JIRA and fix those test cases, though.


On 04/21/2012 09:46 AM, Brock Noland wrote:
> That is interesting...What line numbers are the errors being thrown?
>
> On Sat, Apr 21, 2012 at 3:51 AM, Dave Beech<da...@gmx.net>  wrote:
>> Guys - just wanted to make you aware that some unit tests are currently
>> failing in trunk when building against Java 7. I know this isn't strictly a
>> problem, or even wrong since Hadoop itself requires Java 6, but it just
>> caught me out for a minute because of my Eclipse setup!
>>
>> Test failures:
>>   testJavaSerialization(org.apache.hadoop.mrunit.mapreduce.TestMapReduceDriver):
>> java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
>>   testJavaSerialization(org.apache.hadoop.mrunit.TestMapReduceDriver):
>> java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
>>   testJavaSerialization(org.apache.hadoop.mrunit.TestPipelineMapReduceDriver):
>> java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
>>
>> Maybe we should add a note in BUILD.txt to say Java 6 *must* be used? It
>> sort of does at the moment but I think it could be more explicit.
>>
>> Dave
>>
>
>

Re: Java 7 and failing unit tests

Posted by Brock Noland <br...@cloudera.com>.
That is interesting... What line numbers are the errors being thrown from?

On Sat, Apr 21, 2012 at 3:51 AM, Dave Beech <da...@gmx.net> wrote:
> Guys - just wanted to make you aware that some unit tests are currently
> failing in trunk when building against Java 7. I know this isn't strictly a
> problem, or even wrong since Hadoop itself requires Java 6, but it just
> caught me out for a minute because of my Eclipse setup!
>
> Test failures:
>  testJavaSerialization(org.apache.hadoop.mrunit.mapreduce.TestMapReduceDriver):
> java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
>  testJavaSerialization(org.apache.hadoop.mrunit.TestMapReduceDriver):
> java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
>  testJavaSerialization(org.apache.hadoop.mrunit.TestPipelineMapReduceDriver):
> java.lang.Integer cannot be cast to org.apache.hadoop.io.WritableComparable
>
> Maybe we should add a note in BUILD.txt to say Java 6 *must* be used? It
> sort of does at the moment but I think it could be more explicit.
>
> Dave
>



-- 
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/