Posted to issues@opennlp.apache.org by "Jeff Zemerick (Jira)" <ji...@apache.org> on 2022/08/06 12:49:00 UTC
[jira] [Closed] (OPENNLP-1375) Enable optional GPU inference in ONNX Runtime config
[ https://issues.apache.org/jira/browse/OPENNLP-1375?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Jeff Zemerick closed OPENNLP-1375.
----------------------------------
> Enable optional GPU inference in ONNX Runtime config
> ----------------------------------------------------
>
> Key: OPENNLP-1375
> URL: https://issues.apache.org/jira/browse/OPENNLP-1375
> Project: OpenNLP
> Issue Type: Task
> Affects Versions: 2.0.0
> Reporter: Jeff Zemerick
> Assignee: Jeff Zemerick
> Priority: Major
> Fix For: 2.0.1
>
>
> Enable optional GPU inference in the ONNX Runtime config. Expose a property (probably through a constructor) that enables GPU inference when running inference with ONNX Runtime.
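>
> For reference, enabling GPU inference with the ONNX Runtime Java API typically means registering the CUDA execution provider on the session options before the session is created. A minimal sketch of how such a constructor-exposed flag could be wired up (the class name, {{useGpu}} flag, and model path below are illustrative, not the final OpenNLP API):
> {code:java}
> import ai.onnxruntime.OrtEnvironment;
> import ai.onnxruntime.OrtException;
> import ai.onnxruntime.OrtSession;
>
> public class OnnxSessionFactory {
>
>   // useGpu stands in for the constructor-exposed property proposed here.
>   public static OrtSession createSession(String modelPath, boolean useGpu) throws OrtException {
>     final OrtEnvironment env = OrtEnvironment.getEnvironment();
>     final OrtSession.SessionOptions options = new OrtSession.SessionOptions();
>     if (useGpu) {
>       // Registers the CUDA execution provider on GPU device 0.
>       // Requires the onnxruntime_gpu dependency and a CUDA-capable device.
>       options.addCUDA(0);
>     }
>     return env.createSession(modelPath, options);
>   }
> }
> {code}
> When the flag is false, the session falls back to the default CPU execution provider, so GPU support stays optional and the existing CPU-only behavior is unchanged.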
--
This message was sent by Atlassian Jira
(v8.20.10#820010)