Posted to users@jena.apache.org by DAVID PATTERSON <pa...@me.com> on 2012/04/30 16:19:51 UTC

Using OntModel with TDB and Fuseki

For now, I'm successfully building a plain model, reading a set of .ttl files into it, creating a TDB database and using it with Fuseki.

I'd like to try using an OntModel or InfModel to start getting some additional entailments in the data.

My current code looks like this:

            Dataset ds = TDBFactory.createDataset( newLocation ) ;   // newLocation is a String path
            Model model = ds.getDefaultModel() ;

            RDFReader rdfRdr = model.getReader( "TTL" );
            for ( String fn : files )
            {
                InputStream dataStream = new FileInputStream( fn );
                try {
                    rdfRdr.read( model, dataStream, "" );   // "" = base URI
                } finally {
                    dataStream.close();   // close even if the read fails
                }
            }
            // Close the dataset.
            ds.close();

What should I do to get a more powerful model?

Thanks.

Dave Patterson 

Re: Using OntModel with TDB and Fuseki

Posted by Andy Seaborne <an...@apache.org>.
On 30/04/12 17:47, DAVID PATTERSON wrote:
> Thanks. I'll experiment for now.
>
> Dave P
>
> On Apr 30, 2012, at 12:05 PM, Dave Reynolds <da...@gmail.com>
> wrote:
>
>> Hi Dave,
>>
>> When you create an OntModel you can pass in a base model.
>> So in this case just pass in your "model" (though the dataset will need
>> to be open still).
>>
>> Your choice of OntModelSpec to pass to the OntModel constructor will
>> determine whether any inference is enabled or not.
>>
>> Be warned that inference over a TDB-backed model will be even slower than
>> inference over a memory-backed model. For this reason a common pattern
>> is to compute the inferences you want, in memory, and then add those as
>> a graph in the TDB and use a non-inference model to access the union.
>> Precise details vary widely according to what you are trying to do.
>>
>> Dave

If you just want RDFS (subclass, subproperty, range, domain) you can 
process the data first with "riotcmd.infer", which expands the data for 
loading, so you can then run directly over the materialised inferences.

But it's limited to RDFS (it would easily extend to any inference that 
relies on one A-box triple at a time - one triple from the data stream).
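
A possible invocation (the flag and file names here are illustrative 
assumptions - check `riotcmd.infer --help` in your Jena distribution 
for the exact options):

```shell
# Materialise RDFS entailments offline, then treat the expanded output
# as plain data. vocab.ttl and data.ttl are placeholder file names, and
# the classpath assumes a standard Jena distribution layout.
java -cp "$JENA_HOME/lib/*" riotcmd.infer --rdfs=vocab.ttl data.ttl > data-expanded.ttl
```

The expanded file can then be loaded with tdbloader and served from 
Fuseki with no reasoner attached.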

	andy



Re: Using OntModel with TDB and Fuseki

Posted by DAVID PATTERSON <pa...@mac.com>.
Thanks. I'll experiment for now.

Dave P


Re: Using OntModel with TDB and Fuseki

Posted by Dave Reynolds <da...@gmail.com>.
Hi Dave,

When you create an OntModel you can pass in a base model.
So in this case just pass in your "model" (though the dataset will still 
need to be open).

Your choice of OntModelSpec to pass to the OntModel constructor will 
determine whether any inference is enabled or not.

Be warned that inference over a TDB-backed model will be even slower than 
inference over a memory-backed model. For this reason a common pattern 
is to compute the inferences you want, in memory, and then add those as 
a graph in the TDB and use a non-inference model to access the union. 
Precise details vary widely according to what you are trying to do.
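
A rough sketch of both approaches (package names are from the Jena 2.x 
of this era; the database path and the graph name for the inferred 
triples are placeholder assumptions to adapt):

```java
// Sketch: wrap a TDB-backed model in an OntModel, or run the reasoner
// in memory and persist the materialised triples back into TDB.
import com.hp.hpl.jena.query.Dataset;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.hp.hpl.jena.ontology.OntModel;
import com.hp.hpl.jena.ontology.OntModelSpec;
import com.hp.hpl.jena.tdb.TDBFactory;

public class InferenceSketch {
    public static void main(String[] args) {
        Dataset ds = TDBFactory.createDataset("DB");   // "DB" is a placeholder path

        // Option 1: inference directly over the TDB-backed model (slow for
        // large data). The OntModelSpec chosen controls what is inferred.
        OntModel ont = ModelFactory.createOntologyModel(
                OntModelSpec.RDFS_MEM_RULE_INF, ds.getDefaultModel());

        // Option 2: copy the data into memory, infer there, then store the
        // closure as a named graph alongside the raw data.
        Model mem = ModelFactory.createDefaultModel();
        mem.add(ds.getDefaultModel());
        OntModel inf = ModelFactory.createOntologyModel(
                OntModelSpec.RDFS_MEM_RULE_INF, mem);
        ds.getNamedModel("urn:x-local:inferred").add(inf);  // graph name is made up

        ds.close();
    }
}
```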

Dave

