Posted to dev@felix.apache.org by David Leangen <os...@leangen.net> on 2016/08/20 06:23:19 UTC

[Converter] Serializer

Hi,

I am trying to use the Converter/Codec as a serializer, but it has been a bit of a struggle so far.

I have a “deep” object structure (i.e. at least one level of embedded objects). Writing the data as a JSON string works just fine. However, when deserialising, I am having trouble. The data object uses generics, so it is not possible to determine the type at runtime. My embedded objects end up being deserialised as Maps, which of course causes a ClassCastException later on in the code.
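
To give an idea, the command DTO is shaped roughly like this (simplified here, but the generic parameter is the crux of the problem):

public class PutCommand<E>
{
    public String key;
    // E is erased at runtime, so a decoder that is only handed PutCommand.class
    // has no way to know what "entity" should become, and leaves it as a Map.
    public E entity;
}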

I have tried adding a Rule using an adapter, and setting the codec.with(thatAdapter), but that didn’t work out so well, either.

Here is an example of my attempt so far. I am working with Prevayler, so I need to serialize/deserialize a command object, which contains some data.

I think that this code _should_ work, which means that there is likely a bug in the code. Even so…

This is a very convoluted way of working, which doesn’t seem right to me. Am I missing some concept?


Example below.

Cheers,
=David


public class DTOSerializer<E>
        implements Serializer // This is the interface provided by Prevayler
{
    private final Converter converter;
    private final Codec codec;
    private final Class<E> type;

    public DTOSerializer( Converter aConverter, Codec aCodec, Class<E> aType )
    {
        type = aType;
        // Very convoluted attempt at adding a rule to try to make the conversion of my “PutCommand” work during deserialization
        converter = aConverter.getAdapter()
                .rule(
                        PutCommand.class,
                        Map.class,
                        dto -> { Map<String, Object> map = new HashMap<>(); map.put( "key", dto.key ); map.put( "entity", dto.entity ); return map; },
                        map -> { PutCommand c = new PutCommand(); c.key = (String)map.get( "key" ); c.entity = aConverter.convert( map.get( "entity" ) ).to( aType ); return c; } );
        codec = aCodec.with( converter );
    }

    @Override
    public Object readObject( InputStream in )
            throws Exception
    {
        // This is the raw data
        final Map<String, Object> map = codec.decode( Map.class ).from( in );
        // The name of the object type to cast to
        final String commandTypeName = (String)map.get( "command" );
        final Class<?> commandType = Class.forName( commandTypeName );
        // This indeed returns an object of the correct type,
        // HOWEVER, the embedded objects are not of the correct type, they are of type Map
        final Object command = converter.convert( map.get( "payload" ) ).to( commandType );
        return command;
    }

    @Override
    public void writeObject( OutputStream out, Object object )
            throws Exception
    {
        final Map<String, Object> map = new HashMap<>();
        // Serialize the name of the command so I know what to cast it to when deserializing
        map.put( "command", object.getClass().getName() );
        // This is the actual payload, which is nothing more than the command object
        map.put( "payload", object );
        codec.encode( map ).to( out );
    }
}


Re: [Converter] Serializer

Posted by David Leangen <os...@leangen.net>.
Hi,

I thought it would be easier to base this on some code, so I submitted a patch to get us started:

  https://issues.apache.org/jira/browse/FELIX-5332


Cheers,
=David



Re: [Converter] Serializer

Posted by David Leangen <os...@leangen.net>.
Hi David B.,

You are definitely right: when you don’t need the type information serialised, it would be much too noisy and intrusive. If you do need it, it would be very handy. The default should be not to include it (i.e. as it works currently). If there were a way to configure the type information to be serialised, I personally think it would be a very nice feature.

I’m not sure whether ConfigAdmin would be the right path. IIUC, it would make it “all or nothing”, whereas I think this type serialisation is required only in certain circumstances. I’m not sure how it could be done, but is there a way of using something like an Adapter, or some other one-off mechanism?


Cheers,
=David


Re: [Converter] Serializer

Posted by David Bosschaert <da...@gmail.com>.
Hi David,

In that case you'd need to store the type information somewhere, which,
depending on the target serialization format, may or may not be possible.
For example, in plain JSON you don't really have a place to store such
metadata, although you could add it under a 'special' "__metadata" key or
something.

I think this may be a use case that could be handy for some people, but for
others it might pollute the generated serialization with data they don't
really want to see.

I guess it could be a special feature of a certain Codec. Codecs could be
configured via some mechanism (e.g. ConfigAdmin) to support this feature.
Then you should be able to get the behaviour you are looking for by calling:

MyDTO dto = (MyDTO) myCodec.decode(Object.class).from(json)

So my feeling is that something like this might already be possible given
the current API, although not every codec might support it. Maybe we can
try to expand one of the codecs in the Felix codebase to support this,
enabled via a configuration setting?
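
Just to sketch what that could look like on the wire (the wrapping, the key names and the JSON below are assumptions, not what any codec does today):

// Hypothetical encode side: wrap the DTO together with its type name
// under a 'special' metadata key before handing it to the codec.
Map<String, Object> wrapped = new HashMap<>();
wrapped.put("__metadata", Collections.singletonMap("type", dto.getClass().getName()));
wrapped.put("data", dto);
myCodec.encode(wrapped).to(out);

// ...which might produce JSON along the lines of:
// { "__metadata": { "type": "org.example.MyDTO" }, "data": { ... } }

// A codec that understands the metadata key could resolve the type itself,
// so the caller would only need a cast:
MyDTO dto2 = (MyDTO) myCodec.decode(Object.class).from(json);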

Thoughts?

David


Re: [Converter] Serializer

Posted by David Leangen <os...@leangen.net>.
Thanks, David B.,

The use cases are indeed very similar, but for one difference, I think. This line:

  MyDTO dto2 = jsonCodec.decode(MyDTO.class).from(json);  

You see, when you decode from JSON in this line, you already know in advance that you are deserialising a MyDTO object, so you feed it into the decoder. Easy schmeasy. But what do you do if you don’t know in advance what type the object is? That is the problem with deserialisation.

A generic serialiser has this very problem. Its job is to serialise (very easy) and deserialise (difficult, because it somehow needs to know what the type is). If this were XML, you would somehow need to first determine the schema before you could deserialise into a statically-typed object. If you know the schema, then you can convert. JSON is no different. What is really cool about the DTO, though, is that the DTO class becomes the schema. So, all we would need to do to solve this problem is serialise the DTO type along with the data. Upon deserialisation, we could instantiate a class object (presumably with Class.forName("dtoTypeName")), then we’re good.

So, my suggestion is to serialise the type information along with the data, as part of the API. That way, all you would need to do to deserialise is this:

  // Note the absence of type information, which is different from the line above.
  // The line above requires knowledge of the type in advance.
  Object dto2 = jsonCodec.deserializeFrom(json);  


Of course, this cruft could be done outside of the service. However, it seems like an important use case to consider, and would make the DTO/Converter/Codec more complete and more compelling, for such a small addition.

wdyt?

Cheers,
=David


Re: [Converter] Serializer

Posted by David Bosschaert <da...@gmail.com>.
Hi David,

Maybe I'm missing something, but isn't this the same use-case as what is
done in the unit test JsonCodecTest.testDTO()? [1]

Basically, it has a DTO with an embedded DTO.
This is then converted into JSON with the JSON codec. The JSON contains the
embedded content as an embedded JSON object.
Then that JSON is converted back into the DTO, and the embedded DTO is
automatically filled in as part of that process.
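
For reference, the round trip in that test looks roughly like this (the class and field names below are invented, not the ones from the actual test):

public class EmbeddedDTO {
    public String message;
}

public class OuterDTO {
    public long id;
    public EmbeddedDTO embedded;
}

OuterDTO dto = new OuterDTO();
dto.id = 1L;
dto.embedded = new EmbeddedDTO();
dto.embedded.message = "hello";

// Encode to JSON, then decode declaring only the outer type;
// the embedded DTO is instantiated and populated automatically.
ByteArrayOutputStream out = new ByteArrayOutputStream();
jsonCodec.encode(dto).to(out);
String json = new String(out.toByteArray(), StandardCharsets.UTF_8);
OuterDTO dto2 = jsonCodec.decode(OuterDTO.class).from(json);
assert "hello".equals(dto2.embedded.message);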

Or is your case different?

Cheers,

David

[1]
https://svn.apache.org/viewvc/felix/trunk/converter/src/test/java/org/apache/felix/converter/impl/json/JsonCodecTest.java?view=markup#l102


Re: [Converter] Serializer

Posted by David Leangen <os...@leangen.net>.
Hi Johan,

Thanks for your thoughts. Are you able to elaborate a little more?

> Whenever you serialize and say “Hey, let’s use polymorphism”, you are pretty
> much bound to fail.

I’m not sure why you say this. I thought that by including the schema information, that is exactly what we were avoiding…

I’m not sure I’m getting your idea about the command pattern. Would you mind showing me a quick example?

Cheers,
=David



Re: [Converter] Serializer

Posted by Johan Edstrom <se...@gmail.com>.
Whenever you serialize and say “Hey, let’s use polymorphism”, you are pretty
much bound to fail.

Use something like a command pattern identifying what you are doing 
so you can pick the de-serializer. 
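
A rough sketch of what I mean (the command names and the second command class are invented; the decode/convert calls are the ones used earlier in the thread):

// Decode the raw data first, as in the DTOSerializer example earlier in the thread.
Map<String, Object> map = codec.decode(Map.class).from(in);

// Map each command identifier to its concrete type up front,
// instead of trying to recover the type polymorphically.
Map<String, Class<?>> commandTypes = new HashMap<>();
commandTypes.put("put", PutCommand.class);
commandTypes.put("remove", RemoveCommand.class);   // hypothetical second command

// The identifier stored in the data picks the deserialisation target:
String commandName = (String) map.get("command");
Object command = converter.convert(map.get("payload")).to(commandTypes.get(commandName));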


 
Re: [Converter] Serializer

Posted by David Leangen <os...@leangen.net>.
I had a few thoughts about this topic.

In order to properly deserialise, we would need the schema. Since the DTO type *is* the schema, the schema somehow needs to be serialised as well, or at least known upon serialisation.

If the API were to have a toSchema() method, that would simplify things a lot. A client should also be able to do something like this:

  SomeDTO dto = ...
  codec.encode( dto ).serializeTo( out );

Or maybe:
  
  codec.withSchema( SomeDTO.class ).encode( dto ).to( out );

The SomeDTO schema would also be serialised (perhaps simply using the fully-qualified class name as an alias?), so then the client could do this:

  Object o = codec.deserializeFrom( in );

Since the schema is included in the input data, the serializer will, behind the scenes, do something like:

  
   Map<String, Object> map = codec.decode( Map.class ).from( in );
   // Get the schema type, which is SomeDTO.class
   ...
   Object o = converter.convert( map.get( "data" ) ).to( SomeDTO.class );
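
Spelled out a bit more, that hidden step could look something like this (the method name and the "schema"/"data" keys are placeholders, and codec/converter are assumed to be in scope):

public Object deserializeFrom( InputStream in ) throws Exception
{
    // Decode the raw envelope first.
    final Map<String, Object> map = codec.decode( Map.class ).from( in );
    // Recover the schema type that was serialised along with the data.
    final Class<?> schema = Class.forName( (String)map.get( "schema" ) );
    // Convert the payload into the schema type.
    return converter.convert( map.get( "data" ) ).to( schema );
}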


wdyt?

=David


