Posted to issues@opennlp.apache.org by "Jeff Zemerick (Jira)" <ji...@apache.org> on 2022/07/22 18:50:00 UTC

[jira] [Created] (OPENNLP-1375) Enable optional GPU inference in ONNX Runtime config

Jeff Zemerick created OPENNLP-1375:
--------------------------------------

             Summary: Enable optional GPU inference in ONNX Runtime config
                 Key: OPENNLP-1375
                 URL: https://issues.apache.org/jira/browse/OPENNLP-1375
             Project: OpenNLP
          Issue Type: Task
    Affects Versions: 2.0.0
            Reporter: Jeff Zemerick
            Assignee: Jeff Zemerick


Enable optional GPU inference in the ONNX Runtime config. Expose a property (probably through a constructor) that lets callers enable GPU execution when running inference with ONNX Runtime.
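
A minimal sketch of how such a constructor flag could look, using the ONNX Runtime Java API (ai.onnxruntime). The class name and the useGpu/gpuDeviceId parameters are hypothetical illustrations of the proposed property, not the final OpenNLP API; OrtSession.SessionOptions.addCUDA(int) is the existing ONNX Runtime call that registers the CUDA execution provider:

    import ai.onnxruntime.OrtEnvironment;
    import ai.onnxruntime.OrtException;
    import ai.onnxruntime.OrtSession;

    public class OnnxInference {

      private final OrtSession session;

      // useGpu and gpuDeviceId are hypothetical parameters illustrating
      // the optional GPU switch proposed in this issue.
      public OnnxInference(String modelPath, boolean useGpu, int gpuDeviceId)
          throws OrtException {
        final OrtEnvironment env = OrtEnvironment.getEnvironment();
        final OrtSession.SessionOptions options = new OrtSession.SessionOptions();
        if (useGpu) {
          // Registers the CUDA execution provider on the given device;
          // requires the onnxruntime_gpu artifact and a compatible CUDA install.
          options.addCUDA(gpuDeviceId);
        }
        // With useGpu false, the session uses the default CPU provider.
        this.session = env.createSession(modelPath, options);
      }
    }

Defaulting useGpu to false would keep the current CPU-only behavior unchanged for existing callers.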



--
This message was sent by Atlassian Jira
(v8.20.10#820010)