Posted to dev@mrunit.apache.org by "Cosmin Lehene (JIRA)" <ji...@apache.org> on 2013/10/07 18:08:42 UTC
[jira] [Updated] (MRUNIT-193) Serialization.copy throws NPE instead
of ISE (missing serialization impl) for Hadoop 2.x
[ https://issues.apache.org/jira/browse/MRUNIT-193?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Cosmin Lehene updated MRUNIT-193:
---------------------------------
Description:
This may be the result of a refactoring.
The current code attempts to catch an NPE in order to detect a missing serialization implementation.
However, this behavior differs between Hadoop 1.x and 2.x:
Hadoop 1.x throws the NPE inside the first try/catch block, while 2.x throws it in the second.
{noformat}
try {
  serializer = (Serializer<Object>) serializationFactory
      .getSerializer(clazz);
  deserializer = (Deserializer<Object>) serializationFactory
      .getDeserializer(clazz);
} catch (NullPointerException e) {
  throw new IllegalStateException(
      "No applicable class implementing Serialization in conf at io.serializations for "
          + orig.getClass(), e);
}
try {
  final DataOutputBuffer outputBuffer = new DataOutputBuffer();
  serializer.open(outputBuffer);
  serializer.serialize(orig);
  final DataInputBuffer inputBuffer = new DataInputBuffer();
  inputBuffer.reset(outputBuffer.getData(), outputBuffer.getLength());
  deserializer.open(inputBuffer);
  return (T) deserializer.deserialize(copy);
} catch (final IOException e) {
  throw new RuntimeException(e);
}
{noformat}
Hadoop 1.x
{code}
public <T> Serializer<T> getSerializer(Class<T> c) {
  return getSerialization(c).getSerializer(c);
}
{code}
Hadoop 2.x
{code}
public <T> Serializer<T> getSerializer(Class<T> c) {
  Serialization<T> serializer = getSerialization(c);
  if (serializer != null) {
    return serializer.getSerializer(c);
  }
  return null;
}
{code}
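Since 2.x returns null instead of throwing, the obvious fix is to check the factory's return value explicitly rather than catching an NPE; that raises the intended IllegalStateException on both 1.x and 2.x. A minimal, self-contained sketch of this null-check pattern (all class and method names here are hypothetical stand-ins for the Hadoop factory, not the real patch):

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for Hadoop 2.x SerializationFactory behavior: lookups for
// unregistered classes return null. The caller converts that null into
// an IllegalStateException up front, instead of letting an NPE surface
// later when the missing serializer is first used.
public class SerializationLookup {
    private final Map<Class<?>, String> registry = new HashMap<>();

    void register(Class<?> c, String impl) {
        registry.put(c, impl);
    }

    // Mirrors the 2.x contract: null when no serialization matches.
    String getSerializer(Class<?> c) {
        return registry.get(c);
    }

    // The fixed copy() logic: fail fast with ISE on a missing implementation.
    String requireSerializer(Class<?> c) {
        String s = getSerializer(c);
        if (s == null) {
            throw new IllegalStateException(
                "No applicable class implementing Serialization in conf at io.serializations for "
                    + c);
        }
        return s;
    }

    public static void main(String[] args) {
        SerializationLookup lookup = new SerializationLookup();
        lookup.register(String.class, "WritableSerialization");
        System.out.println(lookup.requireSerializer(String.class));
        try {
            lookup.requireSerializer(Integer.class);
        } catch (IllegalStateException e) {
            System.out.println("ISE: " + e.getMessage());
        }
    }
}
```

The same null check applied to both getSerializer and getDeserializer would also remain correct on Hadoop 1.x, where the factory throws before ever returning null.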
was:
This may be the result of a refactoring.
The current code attempts to catch an NPE in order to detect a missing serialization implementation.
However, this behavior differs between Hadoop 1.x and 2.x:
Hadoop 1.x throws the NPE inside the first try/catch block, while 2.x throws it in the second.
{code}
try {
  serializer = (Serializer<Object>) serializationFactory
      .getSerializer(clazz);
  deserializer = (Deserializer<Object>) serializationFactory
      .getDeserializer(clazz);
} catch (NullPointerException e) {
  throw new IllegalStateException(
      "No applicable class implementing Serialization in conf at io.serializations for "
          + orig.getClass(), e);
}
try {
  final DataOutputBuffer outputBuffer = new DataOutputBuffer();
  serializer.open(outputBuffer);
  serializer.serialize(orig);
  final DataInputBuffer inputBuffer = new DataInputBuffer();
  inputBuffer.reset(outputBuffer.getData(), outputBuffer.getLength());
  deserializer.open(inputBuffer);
  return (T) deserializer.deserialize(copy);
} catch (final IOException e) {
  throw new RuntimeException(e);
}
{code}
Hadoop 1.x
{code}
public <T> Serializer<T> getSerializer(Class<T> c) {
  return getSerialization(c).getSerializer(c);
}
{code}
Hadoop 2.x
{code}
public <T> Serializer<T> getSerializer(Class<T> c) {
  Serialization<T> serializer = getSerialization(c);
  if (serializer != null) {
    return serializer.getSerializer(c);
  }
  return null;
}
{code}
> Serialization.copy throws NPE instead of ISE (missing serialization impl) for Hadoop 2.x
> ----------------------------------------------------------------------------------------
>
> Key: MRUNIT-193
> URL: https://issues.apache.org/jira/browse/MRUNIT-193
> Project: MRUnit
> Issue Type: Bug
> Affects Versions: 1.0.0
> Reporter: Cosmin Lehene
> Priority: Trivial
> Fix For: 1.0.0
>
> Original Estimate: 1h
> Remaining Estimate: 1h
>
> This may be the result of a refactoring.
> The current code attempts to catch an NPE in order to detect a missing serialization implementation.
> However, this behavior differs between Hadoop 1.x and 2.x:
> Hadoop 1.x throws the NPE inside the first try/catch block, while 2.x throws it in the second.
> {noformat}
> try {
>   serializer = (Serializer<Object>) serializationFactory
>       .getSerializer(clazz);
>   deserializer = (Deserializer<Object>) serializationFactory
>       .getDeserializer(clazz);
> } catch (NullPointerException e) {
>   throw new IllegalStateException(
>       "No applicable class implementing Serialization in conf at io.serializations for "
>           + orig.getClass(), e);
> }
> try {
>   final DataOutputBuffer outputBuffer = new DataOutputBuffer();
>   serializer.open(outputBuffer);
>   serializer.serialize(orig);
>   final DataInputBuffer inputBuffer = new DataInputBuffer();
>   inputBuffer.reset(outputBuffer.getData(), outputBuffer.getLength());
>   deserializer.open(inputBuffer);
>   return (T) deserializer.deserialize(copy);
> } catch (final IOException e) {
>   throw new RuntimeException(e);
> }
> {noformat}
> Hadoop 1.x
> {code}
> public <T> Serializer<T> getSerializer(Class<T> c) {
>   return getSerialization(c).getSerializer(c);
> }
> {code}
> Hadoop 2.x
> {code}
> public <T> Serializer<T> getSerializer(Class<T> c) {
>   Serialization<T> serializer = getSerialization(c);
>   if (serializer != null) {
>     return serializer.getSerializer(c);
>   }
>   return null;
> }
> {code}
--
This message was sent by Atlassian JIRA
(v6.1#6144)