Posted to dev@beam.apache.org by Jack McCluskey via dev <de...@beam.apache.org> on 2022/09/16 18:45:32 UTC

Custom Inference Fns in RunInference

Hey everyone,

I'm back with a brief design doc discussing ways that users could provide
custom inference functions for RunInference model handlers, which is
available at
 https://docs.google.com/document/d/1YYGsF20kminz7j9ifFdCD5WQwVl8aTeCo0cgPjbdFNU/edit?usp=sharing
 now.

It's not a huge code change or a particularly long doc, but it does
establish a convention for model handlers moving forward, and that
warrants some discussion.
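
For readers who want a concrete picture before opening the doc, a rough
sketch of the kind of usage under discussion might look like the following.
The inference_fn parameter name, the function signature, and the model path
are illustrative assumptions on my part, not the settled design from the doc:

import apache_beam as beam
import numpy
from apache_beam.ml.inference.base import PredictionResult, RunInference
from apache_beam.ml.inference.sklearn_inference import SklearnModelHandlerNumpy

# Illustrative custom inference function; the exact signature a handler
# would expect from user code is an assumption here.
def predict_proba_inference_fn(model, batch, inference_args, model_id=None):
  # Call predict_proba instead of the handler's default predict call.
  probabilities = model.predict_proba(numpy.stack(batch))
  return [PredictionResult(x, y) for x, y in zip(batch, probabilities)]

model_handler = SklearnModelHandlerNumpy(
    model_uri='gs://my-bucket/model.pkl',     # placeholder path
    inference_fn=predict_proba_inference_fn,  # assumed hook under discussion
)

examples = [numpy.array([1.0, 2.0]), numpy.array([3.0, 4.0])]

with beam.Pipeline() as p:
  _ = (
      p
      | beam.Create(examples)
      | RunInference(model_handler)
      | beam.Map(print))

The idea is that the handler keeps a sensible default inference call, and
users only override it when their model needs something other than the
default (different predict method, extra preprocessing, etc.).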

Thanks,

Jack McCluskey

-- 


Jack McCluskey
SWE - DataPLS PLAT/ Beam Go
RDU
jrmccluskey@gmail.com

Re: Custom Inference Fns in RunInference

Posted by Jack McCluskey via dev <de...@beam.apache.org>.
Thank you to everyone who chimed in on the doc! The discussion was very
productive, and I have since updated the design doc with some more detail
based on feedback and some suggestions. Any and all feedback is welcome!

Thanks,

Jack McCluskey

On Fri, Sep 16, 2022 at 2:45 PM Jack McCluskey <jr...@google.com>
wrote:

> Hey everyone,
>
> I'm back with a brief design doc discussing ways that users could provide
> custom inference functions for RunInference model handlers, which is
> available at
>  https://docs.google.com/document/d/1YYGsF20kminz7j9ifFdCD5WQwVl8aTeCo0cgPjbdFNU/edit?usp=sharing
>  now.
>
> It's not a huge code change or a particularly long doc, but it does
> establish a convention for model handlers moving forward, and that
> warrants some discussion.
>
> Thanks,
>
> Jack McCluskey
>
> --
>
>
> Jack McCluskey
> SWE - DataPLS PLAT/ Beam Go
> RDU
> jrmccluskey@gmail.com
>
>
>