Posted to commits@stanbol.apache.org by "Alessandro Adamou (Resolved) (JIRA)" <ji...@apache.org> on 2012/02/20 19:05:38 UTC

[jira] [Resolved] (STANBOL-433) Loading large ontology using Java API gives out-of-memory error

     [ https://issues.apache.org/jira/browse/STANBOL-433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alessandro Adamou resolved STANBOL-433.
---------------------------------------

    Resolution: Fixed

Since it has been reported on the mailing list that the same ontologies now load flawlessly into OntoNet, I am closing this bug.

Further optimizations are advisable, but they will be tracked in a new Improvement ticket.
                
> Loading large ontology using Java API gives out-of-memory error
> ---------------------------------------------------------------
>
>                 Key: STANBOL-433
>                 URL: https://issues.apache.org/jira/browse/STANBOL-433
>             Project: Stanbol
>          Issue Type: Bug
>          Components: Ontology Manager
>            Reporter: Stephen Bayliss
>            Priority: Minor
>
> Loading a large ontology - in our case an RDF file on the order of hundreds of megabytes - leads to an out-of-memory error.
> The ontology is being loaded into a custom space using an OntologyInputSource.
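
For illustration, the sketch below reproduces the load path the report describes: the whole RDF document is parsed into an in-memory model by the OWL API (which OntoNet builds on) before it can be wrapped in an OntologyInputSource and handed to a scope's custom space, which is why a file of hundreds of megabytes can exhaust the default heap. Only standard OWL API calls are used; the OntoNet wrapping step is indicated in comments because the exact class and method names there are assumptions, and the file path is a placeholder.

    import java.io.File;

    import org.semanticweb.owlapi.apibinding.OWLManager;
    import org.semanticweb.owlapi.model.OWLOntology;
    import org.semanticweb.owlapi.model.OWLOntologyManager;

    public class LargeOntologyLoadCheck {

        public static void main(String[] args) throws Exception {
            File rdfFile = new File(args[0]); // e.g. a multi-hundred-megabyte RDF/XML file

            // This is the step that triggers the reported OutOfMemoryError: the OWL API
            // keeps the entire parsed ontology in memory as an OWLOntology object.
            OWLOntologyManager mgr = OWLManager.createOWLOntologyManager();
            OWLOntology ontology = mgr.loadOntologyFromOntologyDocument(rdfFile);

            System.out.println("Loaded " + ontology.getAxiomCount() + " axioms");

            // In OntoNet, the loaded ontology (or just its document IRI) would then be
            // wrapped in an OntologyInputSource implementation and added to the custom
            // space of a scope, roughly:
            //   scope.getCustomSpace().addOntology(inputSource);
            // (names recalled from the OntoNet Java API; they may differ in detail)
        }
    }

As long as loading stays fully in-memory, a practical workaround is to raise the JVM heap (for example with -Xmx2g) when starting the launcher.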

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira