Posted to users@jena.apache.org by David Jordan <Da...@sas.com> on 2013/04/29 21:06:29 UTC

questions about Dataset.abort and configuring

This email has several questions about an issue I am having.

The following code is executed when getConfiguredOntModel() is called.
Depending on the value of MODEL_CACHE, the intent is to have accesses run
either against the DB (MODEL_CACHE_DB)
or against an in-memory realization (MODEL_CACHE_MEMORY).

These two configurations deal with a model stored in TDB, where pre-inferencing was performed and the results were stored in the database. I want some configurations to leverage the fact that inferencing has already been done. But for this particular transaction it is also necessary to have a reasoner running to catch the invalid update.

Model dbModel = dataset.getNamedModel(modelName);  // get a model already stored in TDB
if( MODEL_CACHE_DB.equals(MODEL_CACHE) ){
      configuredOntModel = ModelFactory.createOntologyModel(spec, dbModel);
      configuredOntModel.prepare();
} else if( MODEL_CACHE_MEMORY.equals(MODEL_CACHE) ){
      configuredModel = ModelFactory.createDefaultModel();
      configuredModel.add(dbModel);
      configuredOntModel = ModelFactory.createOntologyModel(spec, configuredModel);
      configuredOntModel.prepare();
}
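
For reference, the snippet above relies on declarations that are not shown. Here is a minimal sketch of one plausible set, using Jena 2.x packages; the constant values and the choice of a rule-inferencing OntModelSpec are assumptions, not taken from the original code:

import com.hp.hpl.jena.ontology.OntModel;
import com.hp.hpl.jena.ontology.OntModelSpec;
import com.hp.hpl.jena.rdf.model.Model;

// Hypothetical declarations; the original post does not show them.
public class ModelCacheConfig {
      static final String MODEL_CACHE_DB     = "db";
      static final String MODEL_CACHE_MEMORY = "memory";
      static final String MODEL_CACHE        = MODEL_CACHE_DB;  // which configuration to run

      // A rule-based spec is assumed so that omodel.validate() can flag the invalid value.
      static final OntModelSpec spec = OntModelSpec.OWL_MEM_MICRO_RULE_INF;

      // Fields assigned by the branch above.
      OntModel configuredOntModel;
      Model configuredModel;
}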

I am having trouble with these OntModel configurations in several contexts, so I suspect I have something wrong here. I have some other OntModel configurations that work just fine, with no problems at all, on this same example transaction (below); those can be repeated over and over without the issue described below. So I suspect the two configurations above are somehow misconfigured.

I am having problems with the following code, but only with the two configurations above. The first time I run it, it works fine. But on the second and all subsequent test runs I get an error that Individual P2 cannot be found. These subsequent runs are done in separate JVMs, executed later.

My update sets a property to an invalid value. That invalid update must be what causes Individual P2 not to be found. But at the end of this transaction I do an abort, so no update should reach the database. It is as if the invalid update is making it into the database, causing all future runs to fail. Yet the working configurations have no problem at all, even after these two configurations start to fail.

Another factor here is that this ontology model is not directly associated with the database anyway, except that we have associated it, through the calls above, with a model from the database.
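
For comparison, here is a minimal sanity-check sketch (mine, not from the original post) showing that abort does discard an update when the model view is obtained inside the transaction; the in-memory TDB dataset and the URIs are placeholders:

import com.hp.hpl.jena.query.Dataset;
import com.hp.hpl.jena.query.ReadWrite;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.Property;
import com.hp.hpl.jena.rdf.model.Resource;
import com.hp.hpl.jena.tdb.TDBFactory;

public class AbortSanityCheck {
      public static void main(String[] args) {
            Dataset dataset = TDBFactory.createDataset();  // in-memory TDB dataset, transactional
            String modelName = "http://example.org/model"; // placeholder URI

            dataset.begin(ReadWrite.WRITE);
            try {
                  Model m = dataset.getNamedModel(modelName); // view obtained inside the transaction
                  Resource p1 = m.createResource("http://example.org/P1");
                  Property hasSex = m.createProperty("http://example.org/hasSex");
                  Resource p2 = m.createResource("http://example.org/P2");
                  m.add(p1, hasSex, p2);
                  dataset.abort();                            // discard the added statement
            } finally {
                  dataset.end();
            }

            dataset.begin(ReadWrite.READ);
            try {
                  Model m = dataset.getNamedModel(modelName);
                  // Expect false: the aborted statement should not be in the store.
                  System.out.println("present after abort? "
                        + m.contains(m.createResource("http://example.org/P1"),
                                     m.createProperty("http://example.org/hasSex"),
                                     m.createResource("http://example.org/P2")));
            } finally {
                  dataset.end();
            }
      }
}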

public void testInvalidPropertyValue(){
      Dataset dataset = getDataset();
      OntModel omodel = getConfiguredOntModel();
      try {
            dataset.begin(ReadWrite.WRITE);
            Individual p1 = omodel.getIndividual(P1);
            Individual p2 = omodel.getIndividual(P2);
            assertNotNull("found Individual P1", p1);
            assertNotNull("found Individual P2", p2);
            Property hasSex = omodel.getProperty(HAS_SEX);
            assertNotNull("found property hasSex", hasSex);

            omodel.add(omodel.createStatement(p1, hasSex, p2)); // Sets invalid property value
            ValidityReport validity = omodel.validate();
            if( validity == null ) System.out.println("validate() returns null");
            assertNotNull("validity report is null", validity);
            boolean isValid = validity.isValid();
            assertFalse(isValid);
            dataset.abort();
      } finally {
            dataset.end();
      }
}


David Jordan
Senior Software Developer
SAS Institute Inc.
Health & Life Sciences, Research & Development
Bldg R ▪ Office 4467
600 Research Drive ▪ Cary, NC 27513
Tel: 919 531 1233 ▪ david.jordan@sas.com
www.sas.com
SAS® … THE POWER TO KNOW®


Re: questions about Dataset.abort and configuring

Posted by Andy Seaborne <an...@apache.org>.
On 29/04/13 21:36, David Jordan wrote:
>
> I was wondering about the need for the OntModel to be in the
> transaction…
>
> I started with Models and OntModels and no transactions except at
> Model level, because that is what I had read in Jena documentation.
>
> I can switch to Graphs if that is recommended, I have not worked
> directly with them yet. I'll need to see what functionality they lack
> relative to Models that I am using.

Using models is fine.  In TDB, both are tied to a dataset as views.  But 
you have to make sure that the view does not span transactions, i.e. that 
it is a view created after the transaction begins:

       Dataset dataset = getDataset();
       OntModel omodel = getConfiguredOntModel();
       try {
             dataset.begin(ReadWrite.WRITE);

==>

       Dataset dataset = getDataset();
       try {
             dataset.begin(ReadWrite.WRITE);
             OntModel omodel = getConfiguredOntModel(dataset);

and do the dataset.getNamedModel on the dataset after .begin

(internally, the DatasetGraph flips to redirect to a different storage 
layer DatasetGraphTDB as you cross a transaction .begin)
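
Spelled out a little, here is a sketch of a helper reworked along those lines, assuming it now takes the dataset and is only called after begin(); the constant values are placeholders and the changed signature is the suggestion above, not code from the thread:

import com.hp.hpl.jena.ontology.OntModel;
import com.hp.hpl.jena.ontology.OntModelSpec;
import com.hp.hpl.jena.query.Dataset;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;

public class ConfiguredModels {
      static final String MODEL_CACHE_DB     = "db";        // placeholder values
      static final String MODEL_CACHE_MEMORY = "memory";
      static final String MODEL_CACHE        = MODEL_CACHE_DB;
      static final OntModelSpec spec = OntModelSpec.OWL_MEM_MICRO_RULE_INF;

      // Call only after dataset.begin(...), so every view built here
      // belongs to the current transaction.
      static OntModel getConfiguredOntModel(Dataset dataset, String modelName) {
            Model dbModel = dataset.getNamedModel(modelName);   // view created inside the transaction
            OntModel configuredOntModel;
            if (MODEL_CACHE_MEMORY.equals(MODEL_CACHE)) {
                  Model configuredModel = ModelFactory.createDefaultModel();
                  configuredModel.add(dbModel);                 // copy the TDB model into memory
                  configuredOntModel = ModelFactory.createOntologyModel(spec, configuredModel);
            } else {
                  configuredOntModel = ModelFactory.createOntologyModel(spec, dbModel);
            }
            configuredOntModel.prepare();
            return configuredOntModel;
      }
}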

	Andy

Re: questions about Dataset.abort and configuring

Posted by David Jordan <da...@bellsouth.net>.
I was wondering about the need for the OntModel to be in the transaction…

I started with Models and OntModels and no transactions except at Model level, because that is what I had read in Jena documentation.

I can switch to Graphs if that is recommended, I have not worked directly with them yet. I'll need to see what functionality they lack relative to Models that I am using.

On Apr 29, 2013, at 4:20 PM, Andy Seaborne wrote:

> On 29/04/13 20:06, David Jordan wrote:
>> [...]
>> public void testInvalidPropertyValue(){
>>       Dataset dataset = getDataset();
>>       OntModel omodel = getConfiguredOntModel();
>>       try {
>>             dataset.begin(ReadWrite.WRITE);
> 
> You'll need to create the OntModel inside the transaction - otherwise it's not a model view of the transactional dataset, and .abort will not work.
> 
> I'm not entirely happy with this design - creating graphs as views is cheap but Models (and OntModels) may be more expensive (needs measuring).
> 
> Maybe it would be better to make the models or graphs views of the transactional database (they are currently views of the transaction representation).  Not for this next release - the graphs also carry around some internal information which would need sorting out.
> 
> 	Andy
> 
> 
>> [...]


Re: questions about Dataset.abort and configuring

Posted by Andy Seaborne <an...@apache.org>.
On 29/04/13 20:06, David Jordan wrote:
>
> [...]
> public void testInvalidPropertyValue(){
>        Dataset dataset = getDataset();
>        OntModel omodel = getConfiguredOntModel();
>        try {
>              dataset.begin(ReadWrite.WRITE);

You'll need to create the OntModel inside the transaction - otherwise 
it's not a model view of the transactional dataset, and .abort will not work.

I'm not entirely happy with this design - creating graphs as views is 
cheap but Models (and OntModels) may be more expensive (needs measuring).

Maybe it would be better to make the models or graphs views of the 
transactional database (they are currently views of the transaction 
representation).  Not for this next release - the graphs also carry 
around some internal information which would need sorting out.
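
A sketch of the test reworked along these lines, with begin() moved ahead of the OntModel creation; getDataset, getConfiguredOntModel (assumed here to take the dataset), and the URI constants are names from the original post:

public void testInvalidPropertyValue(){
      Dataset dataset = getDataset();
      try {
            dataset.begin(ReadWrite.WRITE);
            // Build the OntModel (and its underlying getNamedModel view) inside the transaction.
            OntModel omodel = getConfiguredOntModel(dataset);

            Individual p1 = omodel.getIndividual(P1);
            Individual p2 = omodel.getIndividual(P2);
            assertNotNull("Individual P1 not found", p1);
            assertNotNull("Individual P2 not found", p2);
            Property hasSex = omodel.getProperty(HAS_SEX);
            assertNotNull("property hasSex not found", hasSex);

            omodel.add(omodel.createStatement(p1, hasSex, p2)); // sets the invalid property value
            ValidityReport validity = omodel.validate();
            assertNotNull("validity report is null", validity);
            assertFalse(validity.isValid());

            // The OntModel is now a view of this transaction, so abort discards the add.
            dataset.abort();
      } finally {
            dataset.end();
      }
}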

	Andy


> [...]