Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/06/14 19:26:26 UTC

[GitHub] [beam] yeandy opened a new issue, #21863: [Feature Request]: RunInference: investigate adding optional batching flag

yeandy opened a new issue, #21863:
URL: https://github.com/apache/beam/issues/21863

   ### What would you like to happen?
   
   Look into adding a flag that users can specify to turn off `BatchElements`.
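
   A rough sketch of what such an option could look like from the pipeline author's side. The `batch_elements` keyword below is purely hypothetical (it does not exist today), and `prebatched_tensors` / `model_handler` stand in for user-defined inputs:

   ```python
   import apache_beam as beam
   from apache_beam.ml.inference.base import RunInference

   with beam.Pipeline() as p:
       _ = (
           p
           # Elements already carry a batch dimension, e.g. tensors of shape (N, ...).
           | beam.Create(prebatched_tensors)
           # Hypothetical flag: skip the internal BatchElements step so that
           # pre-batched inputs are handed to the model handler unchanged.
           | RunInference(model_handler, batch_elements=False)
       )
   ```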
   
   ### Issue Priority
   
   Priority: 2
   
   ### Issue Component
   
   Component: sdk-py-core


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [beam] damccorm commented on issue #21863: RunInference: investigate adding optional batching flag

Posted by GitBox <gi...@apache.org>.
damccorm commented on issue #21863:
URL: https://github.com/apache/beam/issues/21863#issuecomment-1235527331

   @yeandy do we actually need this or is setting the max batch size to 1 enough? Is the idea here just to offer a convenience flag?
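
   For context, the existing way to get single-element batches is to make the internal `BatchElements` step use a batch size of 1 via the handler's `batch_elements_kwargs()` hook; a rough sketch with the PyTorch tensor handler (construction arguments elided):

   ```python
   from apache_beam.ml.inference.pytorch_inference import PytorchModelHandlerTensor

   class UnbatchedPytorchHandler(PytorchModelHandlerTensor):
       """Forces RunInference's BatchElements step to emit batches of size 1."""

       def batch_elements_kwargs(self):
           # RunInference forwards these kwargs to beam.BatchElements.
           return {'min_batch_size': 1, 'max_batch_size': 1}
   ```

   It would be constructed like the base handler, e.g. `UnbatchedPytorchHandler(state_dict_path=..., model_class=..., model_params=...)`.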




[GitHub] [beam] damccorm commented on issue #21863: RunInference: investigate adding optional batching flag

Posted by GitBox <gi...@apache.org>.
damccorm commented on issue #21863:
URL: https://github.com/apache/beam/issues/21863#issuecomment-1333828549

   Once this is done, we should update [run_inference_multi_model.ipynb](https://github.com/apache/beam/pull/24437/files/d29825911841635ba03c7d2d80943d0326d149de#diff-280d6ee1fd5f85ade179b3d50cb0914f26155a5865dd9dddb113d751e5f4a888) to remove the workarounds we're using there




[GitHub] [beam] yeandy commented on issue #21863: RunInference: investigate adding optional batching flag

Posted by GitBox <gi...@apache.org>.
yeandy commented on issue #21863:
URL: https://github.com/apache/beam/issues/21863#issuecomment-1235895167

   Not quite. I think the original idea was that users may want to pass in pre-batched inputs (i.e., tensors that already include a batch dimension). If they did that, however, an additional dimension would be added because we call `torch.stack()` or `np.stack()` inside `run_inference()`.
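
   To make the extra dimension concrete with plain `torch` (the shapes here are illustrative):

   ```python
   import torch

   # Two user elements that are already batched: each has shape (4, 3), i.e. a batch of 4.
   prebatched = [torch.zeros(4, 3), torch.zeros(4, 3)]

   # Stacking the BatchElements output, as run_inference() does, adds another
   # leading dimension on top of the user's own batch dimension.
   stacked = torch.stack(prebatched)
   print(stacked.shape)  # torch.Size([2, 4, 3])
   ```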
   
   The thinking was that we could expose some sort of flag to skip `BatchElements()` (and also `torch.stack()` or `np.stack()`) so that users can work more easily with this kind of pre-batched data.
   
   The Batched DoFns from Brian's work should enable us to do this in a better way.




[GitHub] [beam] TheNeuralBit commented on issue #21863: RunInference: investigate adding optional batching flag

Posted by GitBox <gi...@apache.org>.
TheNeuralBit commented on issue #21863:
URL: https://github.com/apache/beam/issues/21863#issuecomment-1213594754

   Note that the ability to elide batching/unbatching is the primary value provided by https://s.apache.org/batched-dofns right now.
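
   For reference, a minimal Batched DoFn along the lines of that proposal (following the pattern in the Beam programming guide; details may differ by SDK version):

   ```python
   from typing import Iterator

   import apache_beam as beam
   import numpy as np


   class MultiplyByTwo(beam.DoFn):
       # Receives a whole batch as a single np.ndarray instead of one element
       # at a time, so no per-element batching/unbatching is needed.
       def process_batch(self, batch: np.ndarray) -> Iterator[np.ndarray]:
           yield batch * 2

       # Tells Beam the element-wise output type when only process_batch is defined.
       def infer_output_type(self, input_element_type):
           return input_element_type


   with beam.Pipeline() as p:
       _ = (
           p
           | beam.Create(np.array([1, 2, 3], dtype=np.int64)).with_output_types(np.int64)
           | beam.ParDo(MultiplyByTwo()))
   ```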




[GitHub] [beam] damccorm commented on issue #21863: RunInference: investigate adding optional batching flag

Posted by GitBox <gi...@apache.org>.
damccorm commented on issue #21863:
URL: https://github.com/apache/beam/issues/21863#issuecomment-1240890880

   > The Batched DoFns from Brian's work should enable us to do this in better way.
   
   For future reference/any larger audience, hooking into this is tracked in #21440

