Posted to users@jena.apache.org by "Wagner, Anna" <wa...@iib.tu-darmstadt.de> on 2018/01/02 10:15:06 UTC

AW: Reasoning over multiple graphs following the same schema

Thank you very much for your help again. I've done some more tests to get a more detailed description of my problem. 
First of all, I noticed I forgot to tell you that I'm using version 3.4.0 and not the most current one, since I had problems uploading data via the web interface (parsing error). I don't know whether you have been working on inferencing and TDB in the recent updates, so maybe my problem has already been fixed; please tell me if it has.
Second, my tests showed that the unionizing does actually work, but only for one named graph (the first one I uploaded). For any other, later added, graphs the unionizing works neither for simple non-inferenced queries nor for more complex inference-based ones. If I add the new graphs to the default graph by not naming them, however, querying and inferencing work fine.
Now, as you've asked, some more detailed data:

1) My config file looks like this:

# Licensed under the terms of http://www.apache.org/licenses/LICENSE-2.0
@prefix : <#> .
@prefix fuseki: <http://jena.apache.org/fuseki#> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix tdb: <http://jena.hpl.hp.com/2008/tdb#> .
@prefix ja: <http://jena.hpl.hp.com/2005/11/Assembler#> .
[] rdf:type fuseki:Server ;
 fuseki:services (
 <#service1>
 ) .
# Custom code.
[] ja:loadClass "com.hp.hpl.jena.tdb.TDB" .
# TDB
tdb:DatasetTDB rdfs:subClassOf ja:RDFDataset .
tdb:GraphTDB rdfs:subClassOf ja:Model .
## ---------------------------------------------------------------
## Service with only SPARQL query on an inference model.
## Inference model base data in TDB.
<#service1> rdf:type fuseki:Service ;
 fuseki:name "ViKoDB" ; # http://host/inf
 fuseki:serviceQuery "sparql" ; # SPARQL query service
## fuseki:serviceUpdate "update" ;
 fuseki:serviceUpload "upload" ; # Non-SPARQL upload service
 fuseki:serviceReadWriteGraphStore "data" ; # SPARQL Graph store protocol (read and write)
 ## A separate read-only graph store endpoint:
 fuseki:serviceReadGraphStore "get" ; # SPARQL Graph store protocol (read only)
 fuseki:dataset <#dataset> ;
 .
<#dataset> rdf:type ja:RDFDataset ;
 ja:defaultGraph <#model_inf> ;
 .
<#model_inf> a ja:InfModel ;
 ja:baseModel  <#tdbGraph> ;
 ja:reasoner [
 ja:reasonerURL <http://jena.hpl.hp.com/2003/OWLFBRuleReasoner> ;
 ja:schema <#solconpro_model>
] .
<#solconpro_model> a ja:MemoryModel ;
 ja:content [ja:externalContent <file:solconpro.ttl>]
.
<#tdbDataset> rdf:type tdb:DatasetTDB ;
 tdb:location "DB" ;
 ## If the unionDefaultGraph is used, then the "update" service should be removed.
 ## The unionDefaultGraph is read only.
 tdb:unionDefaultGraph true ;
 .
<#tdbGraph> rdf:type tdb:GraphTDB ;
 tdb:dataset <#tdbDataset> . 

The solconpro.ttl file is in the same folder. Here is an excerpt of this file (my base ontology):

@prefix scp: <http://www.solconpro.de/ontologies/scp#> .
@prefix owl: <http://www.w3.org/2002/07/owl#> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

<http://www.solconpro.de/ontologies/scp>
  rdf:type owl:Ontology ;
.
scp:Assembly
  rdf:type owl:Class ;
  rdfs:subClassOf scp:Component ;
  owl:disjointWith scp:Element ;
.
scp:Component
  rdf:type owl:Class ;
.  
scp:Element
  rdf:type owl:Class ;
  rdfs:subClassOf scp:Component ;
.

2) Here is some example data (v1.ttl): 
@prefix : <http://www.solconpro.de/ontologies/v1#> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix scp: <http://www.solconpro.de/ontologies/scp#> .

<http://www.solconpro.de/ontologies/v1>
  rdf:type owl:Ontology ;
  owl:imports <http://www.solconpro.de/ontologies/scp> ;
.
:Co_Module
  rdf:type scp:Assembly ;
  rdf:type owl:NamedIndividual ;
.
:Co_BackingMaterial
  rdf:type scp:Element ;
  rdf:type owl:NamedIndividual ;
.
:Co_Diodes
  rdf:type scp:Element ;
  rdf:type owl:NamedIndividual ;
.
And v2.ttl:
@prefix : <http://www.solconpro.de/ontologies/v2#> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix scp: <http://www.solconpro.de/ontologies/scp#> .

<http://www.solconpro.de/ontologies/v2>
  rdf:type owl:Ontology ;
  owl:imports <http://www.solconpro.de/ontologies/scp> ;
.
:Co_System
  rdf:type scp:Assembly ;
  rdf:type owl:NamedIndividual ;
.
:Co_Cell
  rdf:type scp:Element ;
  rdf:type owl:NamedIndividual ;
.
:Co_Interconnectors
  rdf:type scp:Element ;
  rdf:type owl:NamedIndividual ;
.
I want to store this data in named graphs (e.g. "V1" and "V2") in my dataset "ViKoDB" as defined in my config.

3) As one of my use cases for this application, I want to get all objects that classify as "Component" (meaning also "Element" and "Assembly" entities, since these classes derive from the "Component" class). My query to do so looks like this:
PREFIX scp: <http://www.solconpro.de/ontologies/scp#>
SELECT ?co
WHERE{
?co a scp:Component .
}
As result I would expect to get a list like this:
{
  "head": {
    "vars": [ "co" ]
  } ,
  "results": {
    "bindings": [
      {
        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/ v1#Co_Module " }
      } ,
      {
        "co": { "type": "uri" , "value": " http://www.solconpro.de/ontologies/ v1#Co_BackingMaterial " }
      } ,
      {
        "co": { "type": "uri" , "value": " http://www.solconpro.de/ontologies/ v1#Co_Diodes " }
      } ,
      {
        "co": { "type": "uri" , "value": " http://www.solconpro.de/ontologies/ v2#Co_System " }
      } ,
      {
        "co": { "type": "uri" , "value": " http://www.solconpro.de/ontologies/ v2#Co_Cell " }
      } ,
      {
        "co": { "type": "uri" , "value": " http://www.solconpro.de/ontologies/ v2#Co_Interconnectors " }
      } 
    ]
  }
}
Instead, however, I get a half-empty list containing only entries from the V1 graph:
{
  "head": {
    "vars": [ "co" ]
  } ,
  "results": {
    "bindings": [
      {
        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/ v1#Co_Module " }
      } ,
      {
        "co": { "type": "uri" , "value": " http://www.solconpro.de/ontologies/ v1#Co_BackingMaterial " }
      } ,
      {
        "co": { "type": "uri" , "value": " http://www.solconpro.de/ontologies/ v1#Co_Diodes " }
      } 
    ]
  }
}
If I load the second ttl file into the default graph, I do get the full list as above.


-----Original Message-----
From: ajs6f [mailto:ajs6f@apache.org]
Sent: Tuesday, 26 December 2017 16:34
To: users@jena.apache.org
Subject: Re: Reasoning over multiple graphs following the same schema

I saw your example in the first email, but it included a good deal of other material. We need to isolate your problem. Can you provide a complete example with

1) your Fuseki config
2) some sample data (doesn't have to be much, just enough to show the problem)
3) the queries you run, what you expect to get, and what you get instead?

As for the named graph question, Dave Reynolds (who knows enormously more about inference in Jena than do I) can correct me if needed, but I don't think Jena is capable of what you are asking for (dynamically allocated inference using only assembler RDF) right now. If you are able to write some Java, you could get this done fairly well using the Java API. It's a worthwhile function and you are welcome to file a ticket describing this.

A quick and "hacky" way of doing this might be to have a single named graph that is set up for inference, and then replace the contents of that graph with the triples you need to use for a given query using Fuseki's support for SPARQL Graph Protocol. Obviously, that's not going to work in concurrency.

Maybe you can tell us more about what you are trying to do? We may be able to suggest a design more conducive to your goals.

ajs6f

> On Dec 22, 2017, at 4:34 AM, Wagner, Anna <wa...@iib.tu-darmstadt.de> wrote:
> 
> Thank you for your quick response.
> 
> The idea is to use SPARQL directly against Fuseki and allow the user to upload new graphs into the dataset as named graphs using the web-interface. I've already looked into that example, I think. At least my config file includes this setup. The reasoning does work using SPARQL queries against Fuseki, but only for the default graph and not the named graphs in that dataset. Additionally, the unionising of the named graphs into the default graph does not work, which I can't explain. The definition of my tdbDataset is the following (I've sent the complete config at the bottom of my first mail): 
> <#tdbDataset> rdf:type tdb:DatasetTDB ;
> 	tdb:location "DB" ;
> 	## If the unionDefaultGraph is used, then the "update" service should be removed. <-- I also did that
> 	## The unionDefaultGraph is read only.
> 	tdb:unionDefaultGraph true ;
> .
> 
> Regarding the second example you sent: As I understand, this would allow me to add named graphs in the config file, but what about those I would like to add later on? Is it possible to include/create a reasoner that works on the entire dataset - including named graphs that are added - or do I have to add all graphs in the config file at the start of the Fuseki server?
> 
> -----Original Message-----
> From: ajs6f [mailto:ajs6f@apache.org]
> Sent: Thursday, 21 December 2017 22:29
> To: users@jena.apache.org
> Subject: Re: Reasoning over multiple graphs following the same schema
> 
> _How_ are you trying to use this setup: are you expecting to use the Java API to query this setup, or just use SPARQL directly against Fuseki?
> 
> If you use the Java API, you can set up whatever ontological models you want over whatever named graphs or unions you want (although performance may or may not be what you want).
> 
> If you expect to just send SPARQL against Fuseki and have the inference occur automatically as part of the process of answering your queries, it's a different story... there is an example here:
> 
> https://github.com/apache/jena/blob/master/jena-fuseki2/examples/service-inference-2.ttl
> 
> of what I think you are trying to do. Perhaps you can try it directly?
> 
> As for using the named graphs, we do have examples in our tests of assembler RDF incantations like:
> 
> <#graph> rdf:type tdb:GraphTDB ;
>    tdb:location "/my/tdb/DB" ;
>    tdb:graphName "http://example.com/graph"  .
> 
> so it might be worth trying something like that, or something like that with the tdb:dataset property, a la:
> 
> <#tdbGraph> rdf:type tdb:GraphTDB ;
>     tdb:dataset <#tdbDataset> ;
>     tdb:graphName "http://example.com/graph" .
> 
> Then you could use the same
> 
> <#model_inf> a ja:InfModel ;
>    ja:baseModel  <#tdbGraph> ; etc.
> 
> That may or may not work, and if it doesn't, we can try and figure out 
> why and whether it would be possible to add that feature. (Andy or 
> Dave may have some insight already for this.)
> 
> ajs6f
> 
>> On Dec 21, 2017, at 1:12 PM, Wagner, Anna <wa...@iib.tu-darmstadt.de> wrote:
>> 
>> Hi there,
>> 
>> hopefully this question wasn't already asked a hundred times, but I haven't found anything quite the same in the archives.
>> First off: my use case is that I want to store multiple named graphs within one dataset, all of which are based on the same schema. Now I'd like to use a reasoner for all of these named graphs. To make it even more complicated, it would also be a nice feature if all these named graphs were unionised in the default graph, so I could use the reasoner on all of them at the same time.
>> Now here's what I've done so far: the base for my Fuseki is a TDB store, set up according to this example I found (http://krr.cs.vu.nl/wp-content/uploads/2013/09/protege-fuseki-yasgui-manual.pdf), and the inferencing is working fine for the default graph (the default graph only, though). So I tried to unionise the named graphs into the default graph, thinking this would allow me to store just the basic schema in the default graph and infer over the entire data instead of the individual named graphs.
>> And that's my problem right now: somehow (I'm guessing, as always, the bug is sitting right in front of the monitor, but I really can't find my mistake, unfortunately) the unionising does not work. I've enabled it within the DatasetTDB in the config file, but I cannot even retrieve non-inferred information from the named graphs via the default graph. Could you maybe explain to me what I'm doing wrong here? Also, I still have the problem that I'd like to infer over my named graphs individually, too. But to my understanding from the mailing list archive this is not possible, is it?
>> 
>> Thank you very much for your help
>> Anna
>> 
>> This is my config.ttl; setting up Fuseki works without problems, the effects are just not the way I imagined.
>> 
>> # Licensed under the terms of http://www.apache.org/licenses/LICENSE-2.0
>> @prefix : <#> .
>> @prefix fuseki: <http://jena.apache.org/fuseki#> .
>> @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
>> @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
>> @prefix tdb: <http://jena.hpl.hp.com/2008/tdb#> .
>> @prefix ja: <http://jena.hpl.hp.com/2005/11/Assembler#> .
>> [] rdf:type fuseki:Server ;
>> fuseki:services (
>> <#service1>
>> ) .
>> # Custom code.
>> [] ja:loadClass "com.hp.hpl.jena.tdb.TDB" .
>> # TDB
>> tdb:DatasetTDB rdfs:subClassOf ja:RDFDataset .
>> tdb:GraphTDB rdfs:subClassOf ja:Model .
>> ## ---------------------------------------------------------------
>> ## Service with only SPARQL query on an inference model.
>> ## Inference model base data in TDB.
>> <#service1> rdf:type fuseki:Service ;
>>  fuseki:name "ViKoDB" ; # http://host/inf
>>  fuseki:serviceQuery "sparql" ; # SPARQL query service
>>  ## fuseki:serviceUpdate "update" ;
>>  fuseki:serviceUpload "upload" ; # Non-SPARQL upload service
>>  fuseki:serviceReadWriteGraphStore "data" ; # SPARQL Graph store protocol (read and write)
>>  ## A separate read-only graph store endpoint:
>>  fuseki:serviceReadGraphStore "get" ; # SPARQL Graph store protocol (read only)
>>  fuseki:dataset <#dataset> ;
>>  .
>> <#dataset> rdf:type ja:RDFDataset ;
>>  ja:defaultGraph <#model_inf> ;
>>  .
>> <#model_inf> a ja:InfModel ;
>>  ja:baseModel <#tdbGraph> ;
>>  ja:reasoner [
>>   ja:reasonerURL <http://jena.hpl.hp.com/2003/OWLFBRuleReasoner> ;
>>   ja:schema <#solconpro_model> # this is me adding a local ttl file containing my basic schema to the ReasonerFactory
>>  ] .
>> <#solconpro_model> a ja:MemoryModel ;
>>  ja:content [ ja:externalContent <file:solconpro.ttl> ] .
>> <#tdbDataset> rdf:type tdb:DatasetTDB ;
>>  tdb:location "DB" ;
>>  ## If the unionDefaultGraph is used, then the "update" service should be removed.
>>  ## The unionDefaultGraph is read only.
>>  tdb:unionDefaultGraph true ;
>>  .
>> <#tdbGraph> rdf:type tdb:GraphTDB ;
>>  tdb:dataset <#tdbDataset> .
>> 
>> _____________________________________________
>> Anna Wagner, M.Sc.
>> Research Associate
>> Technische Universität Darmstadt
>> Institut für Numerische Methoden und Informatik im Bauwesen 
>> Franziska-Braun-Str. 7
>> D-64287 Darmstadt
>> Germany
>> 
>> Tel:         +49 (0)6151 - 16 21335
>> Fax:        +49 (0)6151 - 16 21339
>> 
>> wagner@iib.tu-darmstadt.de
>> http://www.iib.tu-darmstadt.de
>> 
> 


Re: Reasoning over multiple graphs following the same schema

Posted by ajs6f <aj...@apache.org>.
There are at least two questions here. One is whether inference works over a TDB union graph, and the other is how to get the answers you want from your data.

As for the first, I suspect that the default graph being used for the inference is actually the underlying TDB default graph (not the union graph), which is presumably empty. Can you try (without changing your config!) adding the triples that you are currently putting into named graphs into the default graph instead and see if querying the default graph returns them?
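One quick way to check what the query endpoint actually sees is a graph census (just a sketch, plain SPARQL 1.1, nothing Fuseki-specific): it gives one row with ?g unbound for the default graph the endpoint serves, plus one row per named graph it can see.

# Count triples in the endpoint's default graph and in each visible named graph.
SELECT ?g (COUNT(*) AS ?triples)
WHERE {
    { ?s ?p ?o }
  UNION
    { GRAPH ?g { ?s ?p ?o } }
}
GROUP BY ?g

Keep in mind the default-graph count will include whatever the OWL reasoner adds; the thing to look for is whether the V1/V2 instance data shows up only under the named-graph rows (or not at all), which would fit the diagnosis above.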

As for the second, the inference in which you are interested is type subsumption, so if you put your ontology into a non-union default graph and your data into named graphs, it should be handled by SPARQL property paths' * operator with no server-side inference at all, something along the lines of:

PREFIX scp: <http://www.solconpro.de/ontologies/scp#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> 
SELECT ?co
WHERE {
     ?type rdfs:subClassOf* scp:Component . 
     GRAPH ?g { ?co a ?type . } 
}
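A third thing that might be worth an experiment (untested, and Andy or Dave may well correct me) is pointing the base model of the inference model at TDB's union graph via its special name <urn:x-arq:UnionGraph>, instead of relying on tdb:unionDefaultGraph. Roughly:

<#tdbGraph> rdf:type tdb:GraphTDB ;
    tdb:dataset <#tdbDataset> ;
    ## urn:x-arq:UnionGraph is TDB's reserved name for the union of all named graphs;
    ## whether the assembler honours it for a GraphTDB is exactly what to test.
    tdb:graphName <urn:x-arq:UnionGraph> .

<#model_inf> a ja:InfModel ;
    ja:baseModel <#tdbGraph> ;
    ja:reasoner [
        ja:reasonerURL <http://jena.hpl.hp.com/2003/OWLFBRuleReasoner> ;
        ja:schema <#solconpro_model>
    ] .

The union graph is read-only, so updates would still have to go to the individual named graphs through the graph store endpoints.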


ajs6f



AW: Reasoning over multiple graphs following the same schema

Posted by "Wagner, Anna" <wa...@iib.tu-darmstadt.de>.
The parsing problem was about the ttl syntax in general (on 3.5); the error you found was my mistake while reducing the larger file to a small example. It turns out that this error does not occur on 3.6, which we've now installed.

However, the problem with the unionDefaultGraph still remains. If I upload graphs as named graphs within the dataset, they are not affected by queries against the default graph of the dataset, regardless of whether inferencing is used or not. In the example I've given (exactly the same config and data files, except for the missing namespace declaration) I'd expect to get all scp:Elements from the two example graphs by querying:
PREFIX scp: <http://www.solconpro.de/ontologies/scp#>
SELECT ?elem
WHERE{
?elem a scp:Component .
}
But it returns an empty resultset. If I add 
FROM <http://172.20.4.101:3030/ViKoDB/data/V1>
to the query, it returns :Co_BackingMaterial and :Co_Diodes. 
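For completeness, the corresponding query naming both graphs explicitly would look like this (assuming V2 was uploaded under the analogous name <http://172.20.4.101:3030/ViKoDB/data/V2>; it matches only the asserted scp:Element types, with no inference involved):

PREFIX scp: <http://www.solconpro.de/ontologies/scp#>
SELECT ?elem
FROM <http://172.20.4.101:3030/ViKoDB/data/V1>
FROM <http://172.20.4.101:3030/ViKoDB/data/V2>
WHERE {
?elem a scp:Element .
}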

Might the reason for this problem be that the unionizing does not work while using a reasoner?

PS: Happy new year :)




Re: Reasoning over multiple graphs following the same schema

Posted by ajs6f <aj...@apache.org>.
Using an older version of Jena is not ideal. What, specifically, were your problems with using 3.6.0? I'd rather not debug an older version if it can be avoided, and parsing problems are likely to be much easier to solve than problems with complex assembler schemes.

With regards to parsing problems I can tell you that the sample Turtle data you show below isn't valid. It's missing the namespace declaration for OWL.
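Concretely, a line like

@prefix owl: <http://www.w3.org/2002/07/owl#> .

needs to be at the top of v1.ttl and v2.ttl before owl:Ontology, owl:imports, and owl:NamedIndividual can be parsed.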

A
> On Jan 2, 2018, at 5:15 AM, Wagner, Anna <wa...@iib.tu-darmstadt.de> wrote:
> 
> Thank you very much for your help again. I've done some more tests to get a more detailed description of my problem. 
> First of all, I've noticed, I forgot to tell you, that I'm using the Version 3.4.0. and not the most current one since I had problems uploading data using the web interface (parsing error). I don't know whether you have been working on inferencing and TDB in the last updates, so maybe my problem has already been fixed. Please tell me if it is.
> Second, my tests showed that the unionizing does actually work, but only for one named graph (the first one I've uploaded). For any other - later added - graphs the unionizing neither works for simple non-inferenced queries nor more complex inference-based ones. If I add the new graphs to the default graph by not naming them, the querying and inferencing works fine, however.
> Now, as you've asked, some more detailed data:
> 
> 1) My config file looks like this:
> 
> # Licensed under the terms of http://www.apache.org/licenses/LICENSE-2.0
> @prefix : <#> .
> @prefix fuseki: <http://jena.apache.org/fuseki#> .
> @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
> @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
> @prefix tdb: <http://jena.hpl.hp.com/2008/tdb#> .
> @prefix ja: <http://jena.hpl.hp.com/2005/11/Assembler#> .
> [] rdf:type fuseki:Server ;
> fuseki:services (
> <#service1>
> ) .
> # Custom code.
> [] ja:loadClass "com.hp.hpl.jena.tdb.TDB" .
> # TDB
> tdb:DatasetTDB rdfs:subClassOf ja:RDFDataset .
> tdb:GraphTDB rdfs:subClassOf ja:Model .
> ## ---------------------------------------------------------------
> ## Service with only SPARQL query on an inference model.
> ## Inference model bbase data in TDB.
> <#service1> rdf:type fuseki:Service ;
> fuseki:name "ViKoDB" ; # http://host/inf
> fuseki:serviceQuery "sparql" ; # SPARQL query service
> ## fuseki:serviceUpdate "update" ;
> fuseki:serviceUpload "upload" ; # Non-SPARQL upload service
> fuseki:serviceReadWriteGraphStore "data" ; # SPARQL Graph store protocol (read and write)
> ## A separate ead-only graph store endpoint:
> fuseki:serviceReadGraphStore "get" ; # SPARQL Graph store protocol (read only)y
> fuseki:dataset <#dataset> ;
> .
> <#dataset> rdf:type ja:RDFDataset ;
> ja:defaultGraph <#model_inf> ;
> .
> <#model_inf> a ja:InfModel ;
> ja:baseModel  <#tdbGraph> ;
> ja:reasoner [
> ja:reasonerURL <http://jena.hpl.hp.com/2003/OWLFBRuleReasoner> ;
> ja:schema <#solconpro_model>
> ] .
> <#solconpro_model> a ja:MemoryModel ;
> ja:content [ja:externalContent <file:solconpro.ttl>]
> .
> <#tdbDataset> rdf:type tdb:DatasetTDB ;
> tdb:location "DB" ;
> ## If the unionDefaultGraph is used, then the "update" service should be removed.
> ## The unionDefaultGraph is read only.
> tdb:unionDefaultGraph true ;
> .
> <#tdbGraph> rdf:type tdb:GraphTDB ;
> tdb:dataset <#tdbDataset> . 
> 
> With the solconpro.ttl file in the same folder. Here is an excerpt of this file (my base ontology):
> 
> @prefix scp: <http://www.solconpro.de/ontologies/scp#> .
> @prefix owl: <http://www.w3.org/2002/07/owl#> .
> @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
> @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
> 
> <http://www.solconpro.de/ontologies/scp>
>  rdf:type owl:Ontology ;
> .
> scp:Assembly
>  rdf:type owl:Class ;
>  rdfs:subClassOf scp:Component ;
>  owl:disjointWith scp:Element ;
> .
> scp:Component
>  rdf:type owl:Class ;
> .  
> scp:Element
>  rdf:type owl:Class ;
>  rdfs:subClassOf scp:Component ;
> .
> 
> 2) Here is some example data (v1.ttl): 
> @prefix : <http://www.solconpro.de/ontologies/v1#> .
> @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
> @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
> @prefix scp: <http://www.solconpro.de/ontologies/scp#> .
> 
> <http://www.solconpro.de/ontologies/v1>
>  rdf:type owl:Ontology ;
>  owl:imports <http://www.solconpro.de/ontologies/scp> ;
> .
> :Co_Module
>  rdf:type scp:Assembly ;
>  rdf:type owl:NamedIndividual ;
> .
> :Co_BackingMaterial
>  rdf:type scp:Element ;
>  rdf:type owl:NamedIndividual ;
> .
> :Co_Diodes
>  rdf:type scp:Element ;
>  rdf:type owl:NamedIndividual ;
> .
> And v2.ttl:
> @prefix : <http://www.solconpro.de/ontologies/v2#> .
> @prefix owl: <http://www.w3.org/2002/07/owl#> .
> @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
> @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
> @prefix scp: <http://www.solconpro.de/ontologies/scp#> .
> 
> <http://www.solconpro.de/ontologies/v2>
>  rdf:type owl:Ontology ;
>  owl:imports <http://www.solconpro.de/ontologies/scp> ;
> .
> :Co_System
>  rdf:type scp:Assembly ;
>  rdf:type owl:NamedIndividual ;
> .
> :Co_Cell
>  rdf:type scp:Element ;
>  rdf:type owl:NamedIndividual ;
> .
> :Co_Interconnectors
>  rdf:type scp:Element ;
>  rdf:type owl:NamedIndividual ;
> .
> I want to store this data in named graphs (e.g. "V1" and "V2") in my dataset "ViKoDB" as defined in my config.
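Just to make the intended layout concrete: after both uploads the dataset would look roughly like this in TriG (the graph names below are only placeholders for whatever IRIs are chosen at upload time):

@prefix scp: <http://www.solconpro.de/ontologies/scp#> .
@prefix v1:  <http://www.solconpro.de/ontologies/v1#> .
@prefix v2:  <http://www.solconpro.de/ontologies/v2#> .

<http://www.solconpro.de/graphs/V1> {
    v1:Co_Module          a scp:Assembly .
    v1:Co_BackingMaterial a scp:Element .
    v1:Co_Diodes          a scp:Element .
}

<http://www.solconpro.de/graphs/V2> {
    v2:Co_System          a scp:Assembly .
    v2:Co_Cell            a scp:Element .
    v2:Co_Interconnectors a scp:Element .
}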
> 
> 3) As one of my use cases for this application, I want to get all objects that classify as "Component" (meaning also "Element" and "Assembly" entities, since these classes derive from the "Component" class). My query to do so looks like this:
> PREFIX scp: <http://www.solconpro.de/ontologies/scp#>
> SELECT ?co
> WHERE{
> ?co a scp:Component .
> }
> As a result, I would expect to get a list like this:
> {
>  "head": {
>    "vars": [ "co" ]
>  } ,
>  "results": {
>    "bindings": [
>      {
>        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/v1#Co_Module" }
>      } ,
>      {
>        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/v1#Co_BackingMaterial" }
>      } ,
>      {
>        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/v1#Co_Diodes" }
>      } ,
>      {
>        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/v2#Co_System" }
>      } ,
>      {
>        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/v2#Co_Cell" }
>      } ,
>      {
>        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/v2#Co_Interconnectors" }
>      } 
>    ]
>  }
> }
> Instead, however, I get a half-empty list containing only entries from the V1 graph:
> {
>  "head": {
>    "vars": [ "co" ]
>  } ,
>  "results": {
>    "bindings": [
>      {
>        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/v1#Co_Module" }
>      } ,
>      {
>        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/v1#Co_BackingMaterial" }
>      } ,
>      {
>        "co": { "type": "uri" , "value": "http://www.solconpro.de/ontologies/v1#Co_Diodes" }
>      } 
>    ]
>  }
> }
> If I load the second ttl file into the default graph, I do get the list as above.
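One way to separate the storage question from the inference question is to list the raw types per named graph, which bypasses both the reasoner and the union default graph (a plain diagnostic query, nothing specific to this setup):

SELECT ?g ?co ?type
WHERE {
  GRAPH ?g { ?co a ?type . }
}

If the V2 individuals show up here with their asserted types but not in the default-graph query above, the upload itself is fine and the problem is that the inference model's base graph never sees the union; if they do not show up at all, the second upload landed somewhere other than the expected named graph.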
> 
> 
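A point that may matter here: tdb:unionDefaultGraph takes effect when the TDB dataset itself is queried, but in the config above the inference model's base is a single tdb:GraphTDB pulled out of that dataset, so the reasoner may simply never see the union. TDB also exposes the read-only union of all named graphs under the reserved name urn:x-arq:UnionGraph, and the base graph can be pointed at it directly. A minimal sketch against the config above (not verified here):

<#tdbGraph> rdf:type tdb:GraphTDB ;
    tdb:dataset <#tdbDataset> ;
    tdb:graphName <urn:x-arq:UnionGraph> .

The rest of the assembly (the ja:InfModel with ja:baseModel <#tdbGraph> and the solconpro.ttl schema) would stay unchanged; since that union graph is read-only, uploads would still have to target the individual named graphs.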
> -----Original Message-----
> From: ajs6f [mailto:ajs6f@apache.org]
> Sent: Tuesday, 26 December 2017 16:34
> To: users@jena.apache.org
> Subject: Re: Reasoning over multiple graphs following the same schema
> 
> I saw your example in the first email, but it included a good deal of other material. We need to isolate your problem. Can you provide a complete example with
> 
> 1) your Fuseki config
> 2) some sample data (doesn't have to be much, just enough to show the problem)
> 3) the queries you run, what you expect to get, and what you get instead?
> 
> As for the named graph question, Dave Reynolds (who knows enormously more about inference in Jena than do I) can correct me if needed, but I don't think Jena is capable of what you are asking for (dynamically allocated inference using only assembler RDF) right now. If you are able to write some Java, you could get this done fairly well using the Java API. It's a worthwhile function and you are welcome to file a ticket describing this.
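To make that concrete, here is a rough, untested sketch of the kind of thing the Java API allows (the graph name, the file locations and the choice of the OWL reasoner are assumptions, and error handling/transactions are left out):

import org.apache.jena.query.Dataset;
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.ResultSetFormatter;
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.reasoner.Reasoner;
import org.apache.jena.reasoner.ReasonerRegistry;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.tdb.TDBFactory;

public class NamedGraphInference {
    public static void main(String[] args) {
        // Open the TDB store directly (Fuseki must not have the same location open at the same time).
        Dataset dataset = TDBFactory.createDataset("DB");

        // Load the shared schema once and bind it to an OWL rule reasoner;
        // the bound reasoner can be reused for every named graph.
        Model schema = RDFDataMgr.loadModel("solconpro.ttl");
        Reasoner reasoner = ReasonerRegistry.getOWLReasoner().bindSchema(schema);

        // Build an inference model over one named graph (graph name assumed).
        Model v1 = dataset.getNamedModel("http://www.solconpro.de/ontologies/v1");
        InfModel inf = ModelFactory.createInfModel(reasoner, v1);

        // Same query as in the thread, now answered with inference over just that graph.
        String query = "PREFIX scp: <http://www.solconpro.de/ontologies/scp#> "
                     + "SELECT ?co WHERE { ?co a scp:Component }";
        try (QueryExecution qexec = QueryExecutionFactory.create(query, inf)) {
            ResultSetFormatter.out(qexec.execSelect());
        }
        dataset.close();
    }
}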
> 
> A quick and "hacky" way of doing this might be to have a single named graph that is set up for inference, and then replace the contents of that graph with the triples you need to use for a given query using Fuseki's support for SPARQL Graph Protocol. Obviously, that's not going to work in concurrency.
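For illustration, the replace-the-staging-graph step could also be written as a single SPARQL Update, the update-language counterpart of the graph store PUT described above (it would need the currently commented-out fuseki:serviceUpdate endpoint, and both graph names here are made up):

# Empty the staging graph and refill it from one source graph in a single operation
COPY <http://www.solconpro.de/ontologies/v2> TO <http://example.org/inference-staging>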
> 
> Maybe you can tell us more about what you are trying to do? We may be able to suggest a design more conducive to your goals.
> 
> ajs6f
> 
>> On Dec 22, 2017, at 4:34 AM, Wagner, Anna <wa...@iib.tu-darmstadt.de> wrote:
>> 
>> Thank you for your quick response.
>> 
>> The idea is to use SPARQL directly against Fuseki and allow the user to upload new graphs into the dataset as named graphs using the web-interface. I've already looked into that example, I think. At least my config file includes this setup. The reasoning does work using SPARQL queries against Fuseki, but only for the default graph and not the named graphs in that dataset. Additionally, the unionising of the named graphs into the default graph does not work, which I can't explain. The definition of my tdbDataset is the following (I've sent the complete config at the bottom of my first mail): 
>> <#tdbDataset> rdf:type tdb:DatasetTDB ;
>> 	tdb:location "DB" ;
>> 	## If the unionDefaultGraph is used, then the "update" service should be removed. <-- I also did that
>> 	## The unionDefaultGraph is read only.
>> 	tdb:unionDefaultGraph true ;
>> .
>> 
>> Regarding the second example you sent: As I understand it, this would allow me to add named graphs in the config file, but what about those I would like to add later on? Is it possible to include/create a reasoner that works on the entire dataset - including named graphs that are added later - or do I have to add all graphs in the config file at the start of the Fuseki server?
>> 
>> -----Original Message-----
>> From: ajs6f [mailto:ajs6f@apache.org]
>> Sent: Thursday, 21 December 2017 22:29
>> To: users@jena.apache.org
>> Subject: Re: Reasoning over multiple graphs following the same schema
>> 
>> _How_ are you trying to use this setup: are you expecting to use the Java API to query this setup, or just use SPARQL directly against Fuseki?
>> 
>> If you use the Java API, you can set up whatever ontological models you want over whatever named graphs or unions you want (although performance may or may not be what you want).
>> 
>> If you expect to just send SPARQL against Fuseki and have the inference occur automatically as part of the process of answering your queries, it's a different story... there is an example here:
>> 
>> https://github.com/apache/jena/blob/master/jena-fuseki2/examples/service-inference-2.ttl
>> 
>> of what I think you are trying to do. Perhaps you can try it directly?
>> 
>> As for using the named graphs, we do have examples in our tests of assembler RDF incantations like:
>> 
>> <#graph> rdf:type tdb:GraphTDB ;
>>   tdb:location "/my/tdb/DB" ;
>>   tdb:graphName "http://example.com/graph"  .
>> 
>> so it might be worth trying something like that, or something like that with the tdb:dataset property, a la:
>> 
>> <#tdbGraph> rdf:type tdb:GraphTDB ;
>>    tdb:dataset <#tdbDataset> ;
>>    tdb:graphName "http://example.com/graph" .
>> 
>> Then you could use the same
>> 
>> <#model_inf> a ja:InfModel ;
>>   ja:baseModel  <#tdbGraph> ; etc.
>> 
>> That may or may not work, and if it doesn't, we can try and figure out 
>> why and whether it would be possible to add that feature. (Andy or 
>> Dave may have some insight already for this.)
>> 
>> ajs6f
>> 
>>> On Dec 21, 2017, at 1:12 PM, Wagner, Anna <wa...@iib.tu-darmstadt.de> wrote:
>>> 
>>> Hi there,
>>> 
>>> hopefully this question wasn't already asked a hundred times, but I haven't found anything quite the same in the archives.
>>> First off: my use case is that I want to store multiple named graphs within one dataset, all of which are based on the same schema. Now I'd like to use a reasoner for all of these named graphs. To make it even more complicated, it would also be a nice feature if all these named graphs were unionised in the default graph, so I could also use the reasoner on all of them at the same time.
>>> Now here's what I've done so far: the base for my Fuseki is a TDB set up according to an example I've found (http://krr.cs.vu.nl/wp-content/uploads/2013/09/protege-fuseki-yasgui-manual.pdf); inference works fine for the default graph (but for the default graph only). So I tried to unionise the named graphs into the default graph, thinking this would allow me to just store the basic schema in the default graph and run inference over the entire data instead of the individual named graphs.
>>> And that's my problem right now: somehow (I'm guessing, as always, the bug is sitting right in front of the monitor, but I really can't find my mistake, unfortunately) the unionising does not work. I've enabled it within the DatasetTDB in the config file, but I cannot even retrieve non-inferred information from the named graphs via the default graph. Could you maybe explain what I'm doing wrong here? Also, I would still like to run inference over individual named graphs only, but to my understanding from the mailing list archive this is not possible, is it?
>>> 
>>> Thank you very much for your help
>>> Anna
>>> 
>>> This is my config.ttl, setting up the Fuseki works without problems, the effects are just not the way I imagined.
>>> 
>>> # Licensed under the terms of http://www.apache.org/licenses/LICENSE-2.0
>>> @prefix : <#> .
>>> @prefix fuseki: <http://jena.apache.org/fuseki#> .
>>> @prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
>>> @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
>>> @prefix tdb: <http://jena.hpl.hp.com/2008/tdb#> .
>>> @prefix ja: <http://jena.hpl.hp.com/2005/11/Assembler#> .
>>> [] rdf:type fuseki:Server ;
>>> fuseki:services (
>>> <#service1>
>>> ) .
>>> # Custom code.
>>> [] ja:loadClass "com.hp.hpl.jena.tdb.TDB" .
>>> # TDB
>>> tdb:DatasetTDB rdfs:subClassOf ja:RDFDataset .
>>> tdb:GraphTDB rdfs:subClassOf ja:Model .
>>> ## ---------------------------------------------------------------
>>> ## Service with only SPARQL query on an inference model.
>>> ## Inference model bbase data in TDB.
>>> <#service1> rdf:type fuseki:Service ;
>>> fuseki:name "ViKoDB" ; # http://host/inf
>>> fuseki:serviceQuery "sparql" ; # SPARQL query service
>>> ## fuseki:serviceUpdate "update" ;
>>> fuseki:serviceUpload "upload" ; # Non-SPARQL upload service
>>> fuseki:serviceReadWriteGraphStore "data" ; # SPARQL Graph store protocol (read and write)
>>> ## A separate read-only graph store endpoint:
>>> fuseki:serviceReadGraphStore "get" ; # SPARQL Graph store protocol (read only)
>>> fuseki:dataset <#dataset> ;
>>> .
>>> <#dataset> rdf:type ja:RDFDataset ;
>>> ja:defaultGraph <#model_inf> ;
>>> .
>>> <#model_inf> a ja:InfModel ;
>>> ja:baseModel  <#tdbGraph> ;
>>> ja:reasoner [
>>> ja:reasonerURL <http://jena.hpl.hp.com/2003/OWLFBRuleReasoner> ;
>>> ja:schema <#solconpro_model> # this is me adding a local ttl file containing my basic schema to the ReasonerFactory
>>> ] .
>>> <#solconpro_model> a ja:MemoryModel ;
>>> ja:content [ja:externalContent <file:solconpro.ttl>] .
>>> <#tdbDataset> rdf:type tdb:DatasetTDB ;
>>> tdb:location "DB" ;
>>> ## If the unionDefaultGraph is used, then the "update" service should be removed.
>>> ## The unionDefaultGraph is read only.
>>> tdb:unionDefaultGraph true ;
>>> .
>>> <#tdbGraph> rdf:type tdb:GraphTDB ;
>>> tdb:dataset <#tdbDataset> .
>>> 
>>> _____________________________________________
>>> Anna Wagner, M.Sc.
>>> Research Associate
>>> Technische Universität Darmstadt
>>> Institut für Numerische Methoden und Informatik im Bauwesen 
>>> Franziska-Braun-Str. 7
>>> D-64287 Darmstadt
>>> Germany
>>> 
>>> Tel:         +49 (0)6151 - 16 21335
>>> Fax:        +49 (0)6151 - 16 21339
>>> 
>>> wagner@iib.tu-darmstadt.de
>>> http://www.iib.tu-darmstadt.de
>>> 
>> 
>