Posted to users@jena.apache.org by Dibyanshu Jaiswal <dj...@gmail.com> on 2013/10/10 11:12:30 UTC

Problem inferencing from Jena TDB API

Hi!

I am new to semantic web technologies and have started with RDF/OWL for
making web applications.
Currently I have a requirement to access an ontology (an OntModel with
OntModelSpec.OWL_MEM_RULE_INF) read from an OWL file. I am also able to
store it in a local TDB, all done with Jena 2.11.0. Thanks for the nice
API and tutorials provided for the same.
I need to fire SPARQL queries on the model to get useful results. However,
once the TDB store is created, queries against it do not return the
results I expect.

When my SPARQL query is made directly against the TDB Dataset, it does not
return the inferred results, whereas if the same query is fired on the
OntModel (loaded from the TDB with OntModelSpec.OWL_MEM_RULE_INF), the
results are as expected.

How do I solve the problem of making queries directly against the Dataset
rather than against an OntModel with inference enabled?
Please help!!
Thanks in Advance!!

-- 
*Dibyanshu Jaiswal*
Mb: +91 9038304989
Mb: +91 9674272265

Re: Problem inferencing from Jena TDB API

Posted by Ian Dickinson <i....@gmail.com>.
On Wed, Oct 23, 2013 at 7:03 AM, Dibyanshu Jaiswal <dj...@gmail.com> wrote:
> Just tell me whether what I understand from your reply is correct or not.
Dave's not around at the moment, but I'll try to answer your question.

> By the solution provided by you, you mean to:
> First, create a model by reading it from the file with inferencing enabled.
> Second, store the inferred model in the dataset during the creation of
> the dataset.

> Finally, read the model back from the dataset in OWL_MEM mode, which
> results in an OntModel with the inferences, but without inferencing enabled.
> Right?

I'm slightly confused by your description, but one of these steps is not
necessary.

To explain: inference in RDF models means using an algorithm to derive
additional triples beyond those that are asserted in the original ('base')
model. For example,

    C rdfs:subClassOf B
    B rdfs:subClassOf A

allows an algorithm to infer the additional triple:

    C rdfs:subClassOf A
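
To see this in Jena code, here is a hypothetical sketch (the example URIs
are made up):

    Model base = ModelFactory.createDefaultModel();
    Resource a = base.createResource("http://example.org/A");
    Resource b = base.createResource("http://example.org/B");
    Resource c = base.createResource("http://example.org/C");
    base.add(c, RDFS.subClassOf, b);
    base.add(b, RDFS.subClassOf, a);

    // wrap the base model in an RDFS inference model
    InfModel inf = ModelFactory.createRDFSModel(base);
    inf.contains(c, RDFS.subClassOf, a);   // true  - inferred
    base.contains(c, RDFS.subClassOf, a);  // false - never asserted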

We call those additional triples the inference closure. Depending on the
model, there can be quite a lot of them. When you construct an inference
model, the inference closure is computed dynamically (basically; there is
some caching but we'll leave that to one side) in response to a query. So
if you ask, via the API or SPARQL, "is C rdfs:subClassOf A?", the inference
algorithm will do work - including pulling triples from the base model - to
answer that question. The upside of this is that (i) the inference
algorithm only has to do work in response to questions that are asked (if
no-one cares about C's relationship to A, that inferred triple need never
be calculated), and (ii) the results will adapt to updates to the model.
The downside is that every time a question is asked, the algorithm
typically has to make a lot of queries into the base model to test for the
presence or absence of asserted triples. That's OK for memory models, which
are efficient to access, but any model that is stored on disk makes that
process very slow. There are two basic ways to get around this problem:

1. When your application starts, make a copy of the model that's stored on
disk in memory, and then query that via the inference engine. Pros: can be
more responsive to updates, does not require the entire inference closure
to be calculated. Cons: the model may be too large to fit in memory, and
persisting updates needs care.
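
A minimal sketch of option 1, assuming an existing TDB store in "mydir"
(names here are illustrative):

    Dataset dataset = TDBFactory.createDataset("mydir");
    Model memCopy = ModelFactory.createDefaultModel();
    dataset.begin( ReadWrite.READ );
    try {
        // pull the persisted triples into a fast in-memory model
        memCopy.add( dataset.getDefaultModel() );
    } finally {
        dataset.end();
    }
    // inference now runs against memory rather than disk
    OntModel om = ModelFactory.createOntologyModel(
            OntModelSpec.OWL_MEM_RULE_INF, memCopy );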

2. Before your application starts, load the model from disk into an
inference memory model, then save the entire inference closure to a new
disk-based model. Then, when your application runs, run queries directly
against that new model. Pros: you'll get results from the inference closure
as well as the base model without having to run inference algorithms when
queries are asked, and you can cope with larger models. Cons: needs more
disk space, and updating the model is much harder.
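
With option 2, query time is then straightforward: because the closure is
already persisted, SPARQL can go straight to the dataset. A sketch,
assuming the store from option 2 and a compiled Query object named query:

    Dataset dataset = TDBFactory.createDataset("mydir");
    dataset.begin( ReadWrite.READ );
    try {
        // no inference model needed: the deductions are already stored
        QueryExecution qexec = QueryExecutionFactory.create( query, dataset );
        ResultSetFormatter.out( qexec.execSelect() );
        qexec.close();
    } finally {
        dataset.end();
    }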

> If this is so, then yes, this can be useful to me. But I guess you missed a
> point in my question: I directly want to query the dataset (e.g.:
> QueryExecution qexec = QueryExecutionFactory.create(query, dataset); )
> and not the OntModel read from the dataset.
See above. You have to make choices among different trade-offs, and those
choices will depend on the needs of your application and your users.

Ian

Re: Problem inferencing from Jena TDB API

Posted by Dibyanshu Jaiswal <dj...@gmail.com>.
Thanks Dave!!
Just tell me whether what I understand from your reply is correct or not.

By the solution provided by you, you mean to:
First, create a model by reading it from the file with inferencing enabled.
Second, store the inferred model in the dataset during the creation of
the dataset.

Finally, read the model back from the dataset in OWL_MEM mode, which
results in an OntModel with the inferences, but without inferencing enabled.
Right?

If this is so, then yes, this can be useful to me. But I guess you missed a
point in my question: I directly want to query the dataset (e.g.:
QueryExecution qexec = QueryExecutionFactory.create(query, dataset); )
and not the OntModel read from the dataset.

Thanks once again!!


On Thu, Oct 17, 2013 at 1:18 PM, Dave Reynolds <da...@gmail.com> wrote:

> For largely static data, the best way to mix inference and persistent
> storage in Jena is to load your data into memory, do the inference, and
> store the results to the TDB store. Then when you want to use it, open the
> TDB store as a plain OntModel (or Model) with no inference.


-- 
*Dibyanshu Jaiswal*
Mb: +91 9038304989
Mb: +91 9674272265

Re: Problem inferencing from Jena TDB API

Posted by Dave Reynolds <da...@gmail.com>.
For largely static data, the best way to mix inference and persistent
storage in Jena is to load your data into memory, do the inference, and
store the results to the TDB store. Then when you want to use it, open the
TDB store as a plain OntModel (or Model) with no inference.

A brute force version of this would be something like:

         OntModel om = ModelFactory.createOntologyModel(
                 OntModelSpec.OWL_MEM_MICRO_RULE_INF );
         FileManager.get().readModel( om, "myfile" );

         Dataset dataset = TDBFactory.createDataset("mydir") ;
         dataset.begin( ReadWrite.WRITE );
         try {
             // copies the asserted triples *and* the inference closure
             dataset.getDefaultModel().add( om );
             dataset.commit();   // without a commit the write is discarded
         } finally {
             dataset.end();
         }

Then when a future program wants to access the data:

     Dataset tdbdataset = TDBFactory.createDataset("mydir") ;
     OntModel om = ModelFactory.createOntologyModel(
          OntModelSpec.OWL_MEM, tdbdataset.getDefaultModel() );

In some applications you only need to support a limited range of query
patterns, in which case you can replace the add(om) stage by a more
selective add of just the results you are going to be interested in, as
sketched below.
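
For instance, a hedged sketch of such a selective add, assuming only
rdfs:subClassOf results will ever be queried (run inside the WRITE
transaction above):

         // persist just the subClassOf statements, asserted and inferred
         StmtIterator subclassStmts =
                 om.listStatements( null, RDFS.subClassOf, (RDFNode) null );
         dataset.getDefaultModel().add( subclassStmts );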

Dave


On 17/10/13 08:24, Dibyanshu Jaiswal wrote:
> Hi!
> Yes, you are right! It's true that the inferencing is done by the OntModel
> and not by the database. I tried to set the default model of the dataset
> using dataset.setDefaultModel(om), where om is an OntModel with
> OntModelSpec.OWL_MEM_RULE_INF, but in this case the program throws an
> error.
>
> Can you please elaborate a bit on the solution you mention about writing
> the whole OntModel to a graph? The TDB store I use is initialized (created)
> from an OWL file read locally, which serves as the conceptual model for our
> application. I don't want to modify that file, but at the same time I want
> to use inferencing.


Re: Problem inferencing from Jena TDB API

Posted by Dibyanshu Jaiswal <dj...@gmail.com>.
Hi!
Yes, you are right! It's true that the inferencing is done by the OntModel
and not by the database. I tried to set the default model of the dataset
using dataset.setDefaultModel(om), where om is an OntModel with
OntModelSpec.OWL_MEM_RULE_INF, but in this case the program throws an
error.

Can you please elaborate a bit on the solution you mention about writing
the whole OntModel to a graph? The TDB store I use is initialized (created)
from an OWL file read locally, which serves as the conceptual model for our
application. I don't want to modify that file, but at the same time I want
to use inferencing.




On Fri, Oct 11, 2013 at 6:22 PM, Andy Seaborne <an...@apache.org> wrote:

> Hi there,
>
> If you query the dataset directly, do you see any triples?  You will see
> different results if you query via the inference model.  The inference is
> not done in the database but in the inference engine associated with the
> OntModel.  The database does not contain the deductions.
>
> You can store the inferred results to a database by writing the whole
> OntModel to a graph in the database.
>
>         Andy


-- 
*Dibyanshu Jaiswal*
Mb: +91 9038304989
Mb: +91 9674272265

Re: Problem inferencing from Jena TDB API

Posted by Andy Seaborne <an...@apache.org>.
Hi there,

If you query the dataset directly, do you see any triples?  You will see
different results if you query via the inference model.  The inference is
not done in the database but in the inference engine associated with the
OntModel.  The database does not contain the deductions.

You can store the inferred results to a database by writing the whole 
OntModel to a graph in the database.
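
A minimal sketch, assuming a TDB-backed dataset and an inference OntModel
om as in your code, run inside a WRITE transaction:

    // writes both the asserted statements and the computed deductions
    dataset.getDefaultModel().add( om );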

	Andy



Re: Problem inferencing from Jena TDB API

Posted by Dibyanshu Jaiswal <dj...@gmail.com>.
Here is sample code for the problem stated above.

public class ReadTDB {

    public static void main(String[] args){

    // open TDB dataset
    String directory = "./MyDatabases/OntologyTDB"; // My TDB created beforehand
    Dataset dataset = TDBFactory.createDataset(directory);
    //Read Model from the tdb
    Model tdb = dataset.getDefaultModel();

    // read the OntModel from the Model
    OntModel m = ModelFactory.createOntologyModel( OntModelSpec.OWL_MEM,
tdb );

    String sparqlQueryString =
        "SELECT ?s WHERE { ?s <http://www.w3.org/2000/01/rdf-schema#subClassOf> "
      + "<http://www.owl-ontologies.com/unnamed.owl#ABCD> }";

    Query query = QueryFactory.create(sparqlQueryString) ;
    QueryExecution qexec = QueryExecutionFactory.create(query, m) ;  // LINE OF CONSIDERATION
    ResultSet results = qexec.execSelect() ;

    ResultSetFormatter.out(results) ;
    qexec.close();

    tdb.close();
    dataset.close();

    }

}




As per the above code snippet, and the line marked "LINE OF CONSIDERATION":
when I pass the OntModel m as the parameter, the results are in accordance
with the inference mechanisms (such as transitive relations), but if I
change the parameter to the dataset, i.e.
QueryExecution qexec = QueryExecutionFactory.create(query, dataset) ;
and execute the query, the results are not the same.
From my observation, a query made to the Dataset/TDB directly is unable to
provide the inference mechanisms provided by the OntModel, even when the
TDB creation is done as follows:

public static OntModel createTDBFromOWL(){

        Dataset dataset = TDBFactory.createDataset("./MyDatabases/OntologyTDB") ;
        Model m = dataset.getDefaultModel();
        OntModel om = ModelFactory.createOntologyModel( OntModelSpec.OWL_MEM_RULE_INF, m );
        FileManager.get().readModel( om, "./OWLs/MyOWLfile.owl" );
        return om;

    }




Is there some way to create a Dataset object which is inference-enabled,
similar to the creation of an OntModel like:

    OntModel om = ModelFactory.createOntologyModel( OntModelSpec.OWL_MEM_RULE_INF, m );

so that the dataset supports inferencing mechanisms?




-- 
*Dibyanshu Jaiswal*
Mb: +91 9038304989
Mb: +91 9674272265

Re: Problem inferencing from Jena TDB API

Posted by Andy Seaborne <an...@apache.org>.
On 10/10/13 10:12, Dibyanshu Jaiswal wrote:
> Hi!
>
> I am new to semantic web technologies and have started with RDF/OWL for
> making web applications.
> Currently I have a requirement to access an ontology (an OntModel with
> OntModelSpec.OWL_MEM_RULE_INF) read from an OWL file. I am also able to
> store it in a local TDB, all done with Jena 2.11.0. Thanks for the nice
> API and tutorials provided for the same.
> I need to fire SPARQL queries on the model to get useful results. However,
> once the TDB store is created, queries against it do not return the
> results I expect.
>
> When my SPARQL query is made directly against the TDB Dataset, it does not
> return the inferred results, whereas if the same query is fired on the
> OntModel (loaded from the TDB with OntModelSpec.OWL_MEM_RULE_INF), the
> results are as expected.

Generally, showing the details of what you are doing makes it easier for 
people to provide answers.  The details matter :-)

>
> How do I solve the problem of making queries directly against the Dataset
> rather than against an OntModel with inference enabled?

Inference is a characteristic of the model (in RDF inference is within 
models/graphs, not between graphs).

You need to create an ont model backed by a graph from the TDB store.
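
For example, a minimal sketch using the paths from your code (note that
running the rule engine directly over a disk-backed graph can be slow for
large models):

    Dataset dataset = TDBFactory.createDataset("./MyDatabases/OntologyTDB") ;
    OntModel om = ModelFactory.createOntologyModel(
            OntModelSpec.OWL_MEM_RULE_INF, dataset.getDefaultModel() );
    // query om, not the raw dataset, to see the inferences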

	Andy

> Please help!!
> Thanks in Advance!!
>