Posted to issues@beam.apache.org by "Andy Ye (Jira)" <ji...@apache.org> on 2022/03/02 22:16:00 UTC

[jira] [Updated] (BEAM-13970) RunInference V1

     [ https://issues.apache.org/jira/browse/BEAM-13970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andy Ye updated BEAM-13970:
---------------------------
    Description: 
Users of machine learning frameworks must currently implement their own transforms for running ML inference. The exception is the TensorFlow [RunInference transform|https://github.com/tensorflow/tfx-bsl/blob/master/tfx_bsl/beam/run_inference.py]. However, that transform is hosted in its own [repo|https://github.com/tensorflow/tfx-bsl], and its [API|https://www.tensorflow.org/tfx/tfx_bsl/api_docs/python/tfx_bsl/public/beam/RunInference] is geared exclusively towards the TensorFlow TFX library. Our goal is to add new implementations of RunInference for two other popular machine learning frameworks: scikit-learn and PyTorch.

Please see the main design document [here|https://s.apache.org/inference-sklearn-pytorch].
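As a rough sketch of the intended usage (module paths and class names such as SklearnModelHandlerNumpy follow the design document and are illustrative, not a finalized API; the model path is hypothetical), a pipeline might look like:

{code:python}
import apache_beam as beam
import numpy as np

# Illustrative imports following the proposed design; the final module
# layout and names may differ.
from apache_beam.ml.inference.base import RunInference
from apache_beam.ml.inference.sklearn_inference import SklearnModelHandlerNumpy

# A model handler encapsulates framework-specific loading and prediction,
# so the RunInference transform itself stays framework-agnostic.
# The model URI below is a hypothetical pickled scikit-learn model.
model_handler = SklearnModelHandlerNumpy(model_uri='gs://my-bucket/model.pkl')

with beam.Pipeline() as p:
  _ = (
      p
      | 'CreateExamples' >> beam.Create(
          [np.array([1.0, 2.0]), np.array([3.0, 4.0])])
      | 'RunInference' >> RunInference(model_handler)
      # Each output element pairs the input example with its inference result.
      | 'PrintPredictions' >> beam.Map(print))
{code}

Under the same design, a PyTorch model handler would plug into the identical RunInference transform: one transform, with pluggable per-framework handlers.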

> RunInference V1
> ---------------
>
>                 Key: BEAM-13970
>                 URL: https://issues.apache.org/jira/browse/BEAM-13970
>             Project: Beam
>          Issue Type: New Feature
>          Components: sdk-py-core
>            Reporter: Andy Ye
>            Assignee: Andy Ye
>            Priority: P2
>              Labels: run-inference
>
> Users of machine learning frameworks must currently implement their own transforms for running ML inference. The exception is the TensorFlow [RunInference transform|https://github.com/tensorflow/tfx-bsl/blob/master/tfx_bsl/beam/run_inference.py]. However, that transform is hosted in its own [repo|https://github.com/tensorflow/tfx-bsl], and its [API|https://www.tensorflow.org/tfx/tfx_bsl/api_docs/python/tfx_bsl/public/beam/RunInference] is geared exclusively towards the TensorFlow TFX library. Our goal is to add new implementations of RunInference for two other popular machine learning frameworks: scikit-learn and PyTorch.
> Please see the main design document [here|https://s.apache.org/inference-sklearn-pytorch].



--
This message was sent by Atlassian Jira
(v8.20.1#820001)