Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/08/30 19:58:01 UTC

[GitHub] [beam] rszper commented on a diff in pull request #22949: Adding TensorFlow support to the Machine Learning overview page

rszper commented on code in PR #22949:
URL: https://github.com/apache/beam/pull/22949#discussion_r958876075


##########
website/www/site/content/en/documentation/sdks/python-machine-learning.md:
##########
@@ -165,7 +165,29 @@ For detailed instructions explaining how to build and run a pipeline that uses M
 
 ## Beam Java SDK support
 
-RunInference API is available to Beam Java SDK 2.41.0 and later through Apache Beam's [Multi-language Pipelines framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines). Please see [here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java) for the Java wrapper transform to use and please see [here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java) for some example pipelines.
+The RunInference API is available with the Beam Java SDK versions 2.41.0 and later through Apache Beam's [Multi-language Pipelines framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines). For information about the Java wrapper transform, see [RunInference.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java). For example pipelines, see [RunInferenceTransformTest.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java).
+
+## TensorFlow support
+
+To use TensorFlow with the RunInference API, create a model handler using the `tfx_bsl` library. The model handler can be keyed or unkeyed.
+For more information, see [run_inference.py](https://github.com/tensorflow/tfx-bsl/blob/d1fca25e5eeaac9ef0111ec13e7634df836f36f6/tfx_bsl/public/beam/run_inference.py) in the TensorFlow GitHub repository.
+
+```
+tf_handler = CreateModelHandler(inference_spec_type)
+
+# unkeyed
+beam.run_inference(tf_handler)
+
+# keyed
+beam.run_inference(beam.ml.inference.KeyedHandler(tf_handler))
+
+Args:
+  inference_spec_type: Model inference endpoint
+Returns:
+  A Beam RunInference ModelHandler for TensorFlow
+"""
+return run_inference.create_model_handler(inference_spec_type, None, None)
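To illustrate what the keyed wrapping does conceptually, here is a plain-Python sketch (an illustrative stand-in, not Beam's actual implementation): a keyed wrapper strips the keys off each element before inference and re-pairs each prediction with its original key afterward, while an unkeyed handler maps inputs straight to predictions.

```python
def run_unkeyed(handler, batch):
    # The handler maps inputs directly to predictions.
    return handler(batch)

def run_keyed(handler, keyed_batch):
    # Split off the keys, run the unkeyed handler on the values,
    # then re-pair each prediction with its original key.
    keys, values = zip(*keyed_batch)
    predictions = handler(list(values))
    return list(zip(keys, predictions))

# Toy "model" that doubles its input, standing in for real inference.
toy_handler = lambda batch: [x * 2 for x in batch]

run_unkeyed(toy_handler, [1, 2, 3])            # [2, 4, 6]
run_keyed(toy_handler, [('a', 1), ('b', 2)])   # [('a', 2), ('b', 4)]
```

Keys are typically used to correlate each prediction back to the input record it came from once results leave the inference step.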

Review Comment:
   Thanks! I removed it.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org