Posted to users@jena.apache.org by Xabriel J Collazo-Mojica <xc...@fiu.edu> on 2011/07/21 17:21:32 UTC

Best (most efficient) way to implement a chain of custom rules reasoners.

Hi jena-users,

I have an RDF model, and I want to infer new triples from a chain of Jena
Custom Rules Reasoners.

I'm currently doing the following:


// def. of the reasoners goes here..
private static Reasoner firstReasoner =
    RuleReasonerFactory.createRulesReasoner("first_service.rules");
private static Reasoner secondReasoner =
    RuleReasonerFactory.createRulesReasoner("second_service.rules");
private static Reasoner thirdReasoner =
    RuleReasonerFactory.createRulesReasoner("third_service.rules");
...


public Model createMapping(Model model) {

    // first, use the incoming RDF deva model
    Model infmodel = ModelFactory.createInfModel(firstReasoner, model);

    // all subsequent calls just take the previously created infmodel
    // and reason over it
    infmodel = ModelFactory.createInfModel(secondReasoner, infmodel);
    infmodel = ModelFactory.createInfModel(thirdReasoner, infmodel);
    ...

    return infmodel;
}



To get the best performance:

Should I call infmodel.prepare() before the call to the next reasoner?
Is there an efficient way to strip the reasoner from the infmodel (i.e. get
a simple Model again) so that the next inference is faster? (I think I'm
getting way too many rule firings..)


Thanks!,
Xabriel J. Collazo-Mojica <http://cs.fiu.edu/~xcoll001/>
CS <http://www.cis.fiu.edu/> PhD Student @ FIU <http://www.fiu.edu>

Re: Best (most efficient) way to implement a chain of custom rules reasoners.

Posted by Xabriel J Collazo-Mojica <xc...@fiu.edu>.
Hi Dave,

Thanks for your help. Yes, all of my rules are forward, so I followed your
advice of using only one reasoner. Got a huge performance increase:

Before:
Inference on Average for 200 iterations took: 0.5029099999999995 seconds.

After:
Inference on Average for 200 iterations took: 0.03552999999999996 seconds.


I'll now check your second suggestion to see if I win some memory back
by persisting only plain non-inference models.

Thanks!,
Xabriel J. Collazo-Mojica <http://cs.fiu.edu/~xcoll001/>
CS <http://www.cis.fiu.edu/> PhD Student @ FIU <http://www.fiu.edu>




On Fri, Jul 22, 2011 at 4:09 PM, Dave Reynolds <da...@gmail.com> wrote:

> On Thu, 2011-07-21 at 17:21 +0200, Xabriel J Collazo-Mojica wrote:
> > Hi jena-users,
> >
> > I have an RDF model, and I want to infer new triples from a chain of Jena
> > Custom Rules Reasoners.
> >
> > I'm currently doing the following:
> >
> >
> > // def. of the reasoners goes here..
> > private static Reasoner firstReasoner =
> > RuleReasonerFactory.createRulesReasoner("first_service.rules");
> > private static Reasoner secondReasoner =
> > RuleReasonerFactory.createRulesReasoner("second_service.rules");
> > private static Reasoner thirdReasoner =
> > RuleReasonerFactory.createRulesReasoner("third_service.rules");
> > ...
>
> Are the rules forward or hybrid?
>
> If forward, why not just put them all in one reasoner?
>
> > public Model createMapping(Model model) {
> >
> > // first, use the incoming RDF deva model
> > Model infmodel = ModelFactory.createInfModel(firstReasoner, model);
> >
> > // all subsequent calls just take the previously created infmodel
> > // and reason over it
> > infmodel = ModelFactory.createInfModel(secondReasoner, infmodel);
> > infmodel = ModelFactory.createInfModel(thirdReasoner, infmodel);
> > ...
> >
> > return infmodel;
> >
> > }
> >
> >
> >
> > To get the best performance:
> >
> > Should I call infmodel.prepare() before the call to the next reasoner?
>
> Doesn't make any difference to the total cost, just changes when the
> work gets done.
>
> > Is there an efficient way to strip the reasoner from the infmodel (i.e.
> > get a simple Model again) so that the next inference is faster? (I think
> > I'm getting way too many rule firings..)
>
> If you mean you want a plain non-inference model containing the full set
> of inference results, then you can create a memory model and use add() to
> add the inference model into it.
>
> Alternatively, if you are using pure forward rules you can use
> getRawModel and getDeductionsModel which are essentially plain models
> and create a (dynamic or static) union of those.
>
> Dave
>

Re: Best (most efficient) way to implement a chain of custom rules reasoners.

Posted by Dave Reynolds <da...@gmail.com>.
On Thu, 2011-07-21 at 17:21 +0200, Xabriel J Collazo-Mojica wrote: 
> Hi jena-users,
> 
> I have an RDF model, and I want to infer new triples from a chain of Jena
> Custom Rules Reasoners.
> 
> I'm currently doing the following:
> 
> 
> // def. of the reasoners goes here..
> private static Reasoner firstReasoner =
> RuleReasonerFactory.createRulesReasoner("first_service.rules");
> private static Reasoner secondReasoner =
> RuleReasonerFactory.createRulesReasoner("second_service.rules");
> private static Reasoner thirdReasoner =
> RuleReasonerFactory.createRulesReasoner("third_service.rules");
> ...

Are the rules forward or hybrid?

If forward, why not just put them all in one reasoner?
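
For instance, merging the rule files into a single forward reasoner could
look roughly like this. This is a sketch against the stock Jena rule API
(GenericRuleReasoner / Rule.rulesFromURL), not your RuleReasonerFactory
wrapper; the rule file names are just the ones from your snippet:

```java
import java.util.ArrayList;
import java.util.List;

import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.hp.hpl.jena.reasoner.rulesys.GenericRuleReasoner;
import com.hp.hpl.jena.reasoner.rulesys.Rule;

public class CombinedMapping {

    public static Model createMapping(Model model) {
        // load all three rule files into one combined rule list
        List<Rule> rules = new ArrayList<Rule>();
        rules.addAll(Rule.rulesFromURL("file:first_service.rules"));
        rules.addAll(Rule.rulesFromURL("file:second_service.rules"));
        rules.addAll(Rule.rulesFromURL("file:third_service.rules"));

        // one forward-chaining reasoner over the combined rule set,
        // so the data is only processed once
        GenericRuleReasoner reasoner = new GenericRuleReasoner(rules);
        reasoner.setMode(GenericRuleReasoner.FORWARD);

        return ModelFactory.createInfModel(reasoner, model);
    }
}
```

This assumes all the rules really are forward; a hybrid rule set would
need the default hybrid mode instead.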

> public Model createMapping(Model model) {
> 
> // first, use the incoming RDF deva model
> Model infmodel = ModelFactory.createInfModel(firstReasoner, model);
> 
> // all subsequent calls just take the previously created infmodel
> // and reason over it
> infmodel = ModelFactory.createInfModel(secondReasoner, infmodel);
> infmodel = ModelFactory.createInfModel(thirdReasoner, infmodel);
> ...
> 
> return infmodel;
> 
> }
> 
> 
> 
> To get the best performance:
> 
> Should I call infmodel.prepare() before the call to the next reasoner?

Doesn't make any difference to the total cost, just changes when the
work gets done.

> Is there an efficient way to strip the reasoner from the infmodel (i.e. get
> a simple Model again) so that the next inference is faster? (I think I'm
> getting way too many rule firings..)

If you mean you want a plain non-inference model containing the full set of
inference results, then you can create a memory model and use add() to
add the inference model into it.
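
As a sketch (assuming infModel is the InfModel you got back from
createInfModel):

```java
import com.hp.hpl.jena.rdf.model.InfModel;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;

public class InfSnapshot {

    // Copy the base triples plus all inferences into a plain in-memory
    // model, detaching the results from the reasoner.
    public static Model snapshot(InfModel infModel) {
        Model plain = ModelFactory.createDefaultModel();
        plain.add(infModel);
        return plain;
    }
}
```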

Alternatively, if you are using pure forward rules you can use
getRawModel and getDeductionsModel which are essentially plain models
and create a (dynamic or static) union of those.
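
Roughly (again a sketch; the prepare() call just forces the forward engine
to run before you take the union):

```java
import com.hp.hpl.jena.rdf.model.InfModel;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;

public class DeductionsUnion {

    // For pure forward rules: view base triples + deductions as a plain
    // dynamic union, with no reasoner left in the query path.
    public static Model unionView(InfModel inf) {
        inf.prepare();  // force the forward engine to run now
        return ModelFactory.createUnion(inf.getRawModel(),
                                        inf.getDeductionsModel());
    }
}
```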

Dave