Posted to commits@jena.apache.org by ki...@apache.org on 2023/04/11 14:26:32 UTC
[jena-site] 01/06: Add syntax highlighting to ShEx, SHACL, RDFS, and RDF Connection pages
This is an automated email from the ASF dual-hosted git repository.
kinow pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/jena-site.git
commit ebad816d7f764314a98a19117adf38192b92844b
Author: Bruno P. Kinoshita <ki...@users.noreply.github.com>
AuthorDate: Sun Apr 9 15:11:22 2023 +0200
Add syntax highlighting to ShEx, SHACL, RDFS, and RDF Connection pages
---
source/documentation/rdfconnection/__index.md | 186 ++++++++++++++------------
source/documentation/rdfs/__index.md | 58 ++++----
source/documentation/shacl/__index.md | 152 ++++++++++++---------
source/documentation/shex/__index.md | 61 ++++-----
4 files changed, 250 insertions(+), 207 deletions(-)
diff --git a/source/documentation/rdfconnection/__index.md b/source/documentation/rdfconnection/__index.md
index 5465f5d08..0270f3b14 100644
--- a/source/documentation/rdfconnection/__index.md
+++ b/source/documentation/rdfconnection/__index.md
@@ -4,16 +4,12 @@ slug: index
---
`RDFConnection` provides a unified set of operations for working on RDF
-with SPARQL operations. It provides <a
-href="https://www.w3.org/TR/sparql11-query/">SPARQL Query</a>, <a
-href="https://www.w3.org/TR/sparql11-update/">SPARQL Update</a> and the <a
-href="https://www.w3.org/TR/sparql11-http-rdf-update/">SPARQL Graph
-Store</a> operations. The interface is uniform - the same interface
-applies to local data and to remote data using HTTP and the SPARQL
-protocols ( <a href="https://www.w3.org/TR/sparql11-protocol/">SPARQL
-protocol</a> and <a
-href="https://www.w3.org/TR/sparql11-http-rdf-update/">SPARQL Graph Store
-Protocol</a>).
+with SPARQL operations. It provides [SPARQL Query](https://www.w3.org/TR/sparql11-query/),
+[SPARQL Update](https://www.w3.org/TR/sparql11-update/) and the
+[SPARQL Graph Store](https://www.w3.org/TR/sparql11-http-rdf-update/) operations.
+The interface is uniform - the same interface applies to local data and to remote
+data using HTTP and the SPARQL protocols ([SPARQL protocol](https://www.w3.org/TR/sparql11-protocol/)
+and [SPARQL Graph Store Protocol](https://www.w3.org/TR/sparql11-http-rdf-update/)).
## Outline
@@ -24,28 +20,32 @@ passing styles, as well as the more basic sequence of method calls.
For example, using `try`-with-resources to manage the connection and performing two operations, one to load
some data and one to make a query, can be written as:
- try ( RDFConnection conn = RDFConnection.connect(...) ) {
- conn.load("data.ttl") ;
- conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs) -> {
- Resource subject = qs.getResource("s") ;
- System.out.println("Subject: " + subject) ;
- }) ;
- }
+```java
+try ( RDFConnection conn = RDFConnection.connect(...) ) {
+ conn.load("data.ttl") ;
+ conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs) -> {
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: " + subject) ;
+ }) ;
+}
+```
This could have been written as (approximately -- the error handling is better
in the example above):
- RDFConnection conn = RDFConnection.connect(...)
- conn.load("data.ttl") ;
- QueryExecution qExec = conn.query("SELECT DISTINCT ?s { ?s ?p ?o }") ;
- ResultSet rs = qExec.execSelect() ;
- while(rs.hasNext()) {
- QuerySolution qs = rs.next() ;
- Resource subject = qs.getResource("s") ;
- System.out.println("Subject: " + subject) ;
- }
- qExec.close() ;
- conn.close() ;
+```java
+RDFConnection conn = RDFConnection.connect(...) ;
+conn.load("data.ttl") ;
+QueryExecution qExec = conn.query("SELECT DISTINCT ?s { ?s ?p ?o }") ;
+ResultSet rs = qExec.execSelect() ;
+while(rs.hasNext()) {
+ QuerySolution qs = rs.next() ;
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: " + subject) ;
+}
+qExec.close() ;
+conn.close() ;
+```
## Transactions
@@ -58,31 +58,35 @@ to excessive overhead.
The `Txn` class provides a Java8-style transaction API. Transactions are
code blocks passed to the `Txn` library, which handles the transaction lifecycle.
- try ( RDFConnection conn = RDFConnection.connect(...) ) {
- Txn.execWrite(conn, () -> {
- conn.load("data1.ttl") ;
- conn.load("data2.ttl") ;
- conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs) ->
- Resource subject = qs.getResource("s") ;
- System.out.println("Subject: " + subject) ;
- }) ;
- }) ;
- }
+```java
+try ( RDFConnection conn = RDFConnection.connect(...) ) {
+ Txn.execWrite(conn, () -> {
+ conn.load("data1.ttl") ;
+ conn.load("data2.ttl") ;
+        conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs) -> {
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: " + subject) ;
+ }) ;
+ }) ;
+}
+```
The traditional style of explicit `begin`, `commit`, `abort` is also available.
- try ( RDFConnection conn = RDFConnection.connect(...) ) {
- conn.begin(ReadWrite.WRITE) ;
- try {
- conn.load("data1.ttl") ;
- conn.load("data2.ttl") ;
- conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs) -> {
- Resource subject = qs.getResource("s") ;
- System.out.println("Subject: " + subject) ;
- }) ;
- conn.commit() ;
- } finally { conn.end() ; }
- }
+```java
+try ( RDFConnection conn = RDFConnection.connect(...) ) {
+ conn.begin(ReadWrite.WRITE) ;
+ try {
+ conn.load("data1.ttl") ;
+ conn.load("data2.ttl") ;
+ conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs) -> {
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: " + subject) ;
+ }) ;
+ conn.commit() ;
+ } finally { conn.end() ; }
+}
+```
The use of `try-finally` ensures that transactions are properly finished.
The `conn.end()` provides an abort in case an exception occurs in the
@@ -111,8 +115,10 @@ builder to construct `RDFConnectionRemote`s.
At its simplest, it is:
- RDFConnectionRemoteBuilder builder = RDFConnection.create()
- .destination("http://host/triplestore");
+```java
+RDFConnectionRemoteBuilder builder = RDFConnection.create()
+ .destination("http://host/triplestore");
+```
which uses the default settings used by `RDFConnectionFactory.connect`.
@@ -128,11 +134,13 @@ and providing detailed configuration with
### Fuseki Specific Connection
-If the remote destination is a Apache Jena Fuseki server, then the
-default general settings work but it is possible to have a specialised connection
+If the remote destination is an Apache Jena Fuseki server, then the
+default general settings work, but it is possible to have a specialised connection
- RDFConnectionRemoteBuilder builder = RDFConnectionFuseki.create()
- .destination("http://host/fuseki");
+```java
+RDFConnectionRemoteBuilder builder = RDFConnectionFuseki.create()
+ .destination("http://host/fuseki");
+```
which uses settings tuned to Fuseki, including round-trip handling of
blank nodes.
@@ -142,8 +150,8 @@ See [example
## Graph Store Protocol
-The <a href="https://www.w3.org/TR/sparql11-http-rdf-update/">SPARQL Graph
-Store Protocol</a> (GSP) is a set of operations to work on whole graphs in a
+The [SPARQL Graph Store Protocol](https://www.w3.org/TR/sparql11-http-rdf-update/)
+(GSP) is a set of operations to work on whole graphs in a
dataset. It provides a standardised way to manage the data in a dataset.
The operations are to fetch a graph, set the RDF data in a graph,
@@ -151,10 +159,12 @@ add more RDF data into a graph, and delete a graph from a dataset.
For example: load two files:
- try ( RDFConnection conn = RDFConnection.connect(...) ) {
- conn.load("data1.ttl") ;
- conn.load("data2.nt") ;
- }
+```java
+try ( RDFConnection conn = RDFConnection.connect(...) ) {
+ conn.load("data1.ttl") ;
+ conn.load("data2.nt") ;
+}
+```
The file extension is used to determine the syntax.
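The four GSP operations map directly onto `RDFConnection` methods. A sketch (the graph name and file names are hypothetical):

```java
try ( RDFConnection conn = RDFConnection.connect(...) ) {
    Model model = conn.fetch("http://example/g") ;    // fetch a graph
    conn.put("http://example/g", "replacement.ttl") ; // set the RDF data in a graph
    conn.load("http://example/g", "more-data.ttl") ;  // add more RDF data into a graph
    conn.delete("http://example/g") ;                 // delete a graph from the dataset
}
```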
@@ -167,7 +177,9 @@ In addition, `RDFConnection` provides an extension to give the same style
of operation to work on a whole dataset (deleting the dataset is not
provided).
- conn.loadDataset("data-complete.trig") ;
+```java
+conn.loadDataset("data-complete.trig") ;
+```
### Local vs Remote
@@ -181,11 +193,11 @@ a remote connection and is useful for testing.
* Read-only – the models and datasets are made read-only, but any changes made
to the underlying RDF data by another route will be visible.
This provides a form of checking for large datasets when "copy" is impractical.
-* None – the models and datasets are passed back with no additional wrappers
+* None – the models and datasets are passed back with no additional wrappers,
and they can be updated, with the changes being made to the underlying dataset.
The default for a local `RDFConnection` is "none". When used with TDB,
-accessing returned models must be done with <a href="../txn">transactions</a>
+accessing returned models must be done with [transactions](../txn)
in this mode.
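These modes correspond to the `Isolation` enum, which can be supplied when creating a local connection. A sketch, assuming an in-memory transactional dataset:

```java
Dataset dataset = DatasetFactory.createTxnMem() ;
try ( RDFConnection conn = RDFConnectionFactory.connect(dataset, Isolation.COPY) ) {
    Model model = conn.fetch() ;   // a detached copy of the default graph
}
```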
## Query Usage
@@ -200,29 +212,33 @@ retain it across the lifetime of the transaction or `QueryExecution`, then
the application should create a copy which is not attached to any external system
with `ResultSetFactory.copyResults`.
- try ( RDFConnection conn = RDFConnection.connect("https://...") ) {
- ResultSet safeCopy =
- Txn.execReadReturn(conn, () -> {
- // Process results by row:
- conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs) -> {
- Resource subject = qs.getResource("s") ;
- System.out.println("Subject: "+subject) ;
- }) ;
- ResultSet rs = conn.query("SELECT * { ?s ?p ?o }").execSelect() ;
- return ResultSetFactory.copyResults(rs) ;
- }) ;
- }
+```java
+try ( RDFConnection conn = RDFConnection.connect("https://...") ) {
+ ResultSet safeCopy =
+ Txn.execReadReturn(conn, () -> {
+ // Process results by row:
+ conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs) -> {
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: "+subject) ;
+ }) ;
+ ResultSet rs = conn.query("SELECT * { ?s ?p ?o }").execSelect() ;
+ return ResultSetFactory.copyResults(rs) ;
+ }) ;
+}
+```
## Update Usage
SPARQL Update operations can be performed and mixed with other operations.
- try ( RDFConnection conn = RDFConnection.connect(...) ) {
- Txn.execWrite(conn, () -> {
- conn.update("DELETE DATA { ... }" ) ;
- conn.load("data.ttl") ;
- }) ;
- }
+```java
+try ( RDFConnection conn = RDFConnection.connect(...) ) {
+ Txn.execWrite(conn, () -> {
+ conn.update("DELETE DATA { ... }" ) ;
+ conn.load("data.ttl") ;
+ }) ;
+}
+```
## Dataset operations
@@ -246,5 +262,5 @@ operations are visible to the called code.
## Examples
-* for simple usage examples see <a href="https://github.com/apache/jena/tree/main/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/examples">https://github.com/apache/jena/tree/main/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/examples</a>.
-* for example of how to use with StreamRDF see <a href="https://github.com/apache/jena/blob/main/jena-examples/src/main/java/org/apache/jena/example/streaming/StreamRDFToConnection.java">https://github.com/apache/jena/blob/main/jena-examples/src/main/java/org/apache/jena/example/streaming/StreamRDFToConnection.java</a>.
+* For simple usage examples, see <https://github.com/apache/jena/tree/main/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/examples>.
+* For an example of use with StreamRDF, see <https://github.com/apache/jena/blob/main/jena-examples/src/main/java/org/apache/jena/example/streaming/StreamRDFToConnection.java>.
diff --git a/source/documentation/rdfs/__index.md b/source/documentation/rdfs/__index.md
index 8251e8fc4..bc88940ab 100644
--- a/source/documentation/rdfs/__index.md
+++ b/source/documentation/rdfs/__index.md
@@ -21,29 +21,29 @@ The vocabulary cannot be changed during the lifetime of the RDFS dataset.
The API provides operations to build RDFS-enabled datasets from data storage and vocabularies:
Example:
-```
- DatasetGraph data = ...
- // Load the vocabulary
- Graph vocab = RDFDataMgr.loadGraph("vocabulary.ttl");
- // Create a DatasetGraph with RDFS
- DatasetGraph dsg = datasetRDFS(DatasetGraph data, Graph vocab );
- // (Optional) Present as a Dataset.
- Dataset dataset = DatasetFactory.wrap(dsg);
+```java
+DatasetGraph data = ...
+// Load the vocabulary
+Graph vocab = RDFDataMgr.loadGraph("vocabulary.ttl");
+// Create a DatasetGraph with RDFS
+DatasetGraph dsg = datasetRDFS(data, vocab);
+// (Optional) Present as a Dataset.
+Dataset dataset = DatasetFactory.wrap(dsg);
```
The vocabulary is processed to produce the data structures needed for processing the
-data eficiently at run time. This is the `SetupRDFS` class that can be created
+data efficiently at run time. This setup is held in the `SetupRDFS` class, which can be created
and shared; it is thread-safe.
-```
- SetupRDFS setup = setupRDFS(vocab);
+```java
+SetupRDFS setup = setupRDFS(vocab);
```
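The shared setup can then be applied when building the dataset. A sketch, assuming an overload of `datasetRDFS` that accepts the prepared `SetupRDFS` directly:

```java
SetupRDFS setup = setupRDFS(vocab);
// Reuse the same setup across datasets and threads (assumed overload).
DatasetGraph dsg = datasetRDFS(data, setup);
```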
### Assembler: RDFS Dataset
Datasets with RDFS can be built with an assembler:
-```
+```turtle
<#rdfsDS> rdf:type ja:DatasetRDFS ;
ja:rdfsSchema <vocabulary.ttl>;
ja:dataset <#baseDataset> ;
@@ -59,14 +59,14 @@ where `<#baseDataset>` is the definition of the dataset to be enriched.
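For example, `<#baseDataset>` could be a simple in-memory dataset preloaded from a file (a sketch using the standard assembler vocabulary; the file name is hypothetical):

```turtle
<#baseDataset> rdf:type ja:MemoryDataset ;
    ja:data "data.trig" ;
    .
```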
It is possible to build a single `Model`:
-```
- <#rdfsGraph> rdf:type ja:GraphRDFS ;
- ja:rdfsSchema <vocabulary.ttl>;
- ja:graph <#baseGraph> ;
- .
+```turtle
+<#rdfsGraph> rdf:type ja:GraphRDFS ;
+ ja:rdfsSchema <vocabulary.ttl>;
+ ja:graph <#baseGraph> ;
+ .
- <#baseGraph> rdf:type ja:MemoryModel;
- ...
+<#baseGraph> rdf:type ja:MemoryModel;
+ ...
```
More generally, inference models can be defined using the Jena Inference and Rule
@@ -79,13 +79,14 @@ The files for this example are available at:
[jena-fuseki2/examples/rdfs](https://github.com/apache/jena/tree/main/jena-fuseki2/examples/rdfs).
From the command line (here, loading data from a file into an in-memory dataset):
-```
+
+```bash
fuseki-server --data data.trig --rdfs vocabulary.ttl /dataset
```
or from a configuration file with an RDFS Dataset:
-```
+```turtle
PREFIX : <#>
PREFIX fuseki: <http://jena.apache.org/fuseki#>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
@@ -122,13 +123,14 @@ PREFIX ja: <http://jena.hpl.hp.com/2005/11/Assembler#>
With the [SOH](/documentation/fuseki2/soh.html) tools, a query (asking for plain
text output):
-```
+
+```bash
s-query --service http://localhost:3030/dataset --output=text --file query.rq
```
or with `curl`:
-```
+```bash
curl --data @query.rq \
--header 'Accept: text/plain' \
--header 'Content-type: application/sparql-query' \
@@ -151,8 +153,9 @@ will return:
### Files
-data.trig:
-```
+`data.trig`:
+
+```turtle
PREFIX : <http://example/>
PREFIX ns: <http://example/ns#>
@@ -161,7 +164,7 @@ PREFIX ns: <http://example/ns#>
`vocabulary.ttl`:
-```
+```turtle
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
@@ -178,7 +181,8 @@ ns:p rdfs:range ns:T1 .
```
`query.rq`:
-```
+
+```sparql
PREFIX : <http://example/>
PREFIX ns: <http://example/ns#>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
diff --git a/source/documentation/shacl/__index.md b/source/documentation/shacl/__index.md
index a83ead13f..019ddd9bc 100644
--- a/source/documentation/shacl/__index.md
+++ b/source/documentation/shacl/__index.md
@@ -18,8 +18,10 @@ argument.
To validate:
-<pre>shacl validate --shapes <i>SHAPES.ttl</i> --data <i>DATA.ttl</i></pre>
-<pre>shacl v -s <i>SHAPES.ttl</i> -d <i>DATA.ttl</i></pre>
+```bash
+shacl validate --shapes SHAPES.ttl --data DATA.ttl
+shacl v -s SHAPES.ttl -d DATA.ttl
+```
The shapes and data files can be the same; the `--shapes` argument is optional and
defaults to the same as `--data`. This includes running individual W3C Working
@@ -27,12 +29,16 @@ Group tests.
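Since `--shapes` defaults to the same file as `--data`, a file containing both shapes and data can be validated against itself (hypothetical file name):

```bash
shacl validate --data shapes-and-data.ttl
```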
To parse a file:
-<pre>shacl parse <i>FILE</i></pre>
-<pre>shacl p <i>FILE</i></pre>
+```bash
+shacl parse FILE
+shacl p FILE
+```
which writes out a text format.
-<pre>shacl p <i>--out=FMT</i> <i>FILE</i></pre>
+```bash
+shacl p --out=FMT FILE
+```
writes the output in `text` (`t`), `compact` (`c`), or `rdf` (`r`) format. Multiple formats can be given,
separated by ",", and the format `all` outputs all three.
@@ -41,24 +47,26 @@ separated by "," and format `all` outputs all 3 formats.
Fuseki has a new service operation `fuseki:shacl`:
-<pre>
-<#serviceWithShacl> rdf:type fuseki:Service ;
+```turtle
+<#serviceWithShacl> rdf:type fuseki:Service ;
rdfs:label "Dataset with SHACL validation" ;
-    fuseki:name "<i>ds</i>" ;
+    fuseki:name "ds" ;
fuseki:serviceReadWriteGraphStore "" ;
fuseki:endpoint [ fuseki:operation fuseki:shacl ; fuseki:name "shacl" ] ;
- fuseki:dataset <#dataset> ;
+ fuseki:dataset <#dataset> ;
.
-</pre>
+```
This requires a "new style" endpoint declaration: see
"[Fuseki Endpoint Configuration](/documentation/fuseki2/fuseki-config-endpoint.html)".
This is not installed into a dataset setup by default; a configuration file using
-```
+
+```turtle
fuseki:endpoint [ fuseki:operation fuseki:shacl ;
fuseki:name "shacl" ];
```
+
is necessary (or programmatic setup for Fuseki Main).
The service accepts a shapes graph posted as RDF to <tt>/<i>ds</i>/shacl</tt> with
@@ -73,15 +81,19 @@ Further, an argument <tt>target=<i>uri</i></tt> validates a specific node in the
Upload data in file `fu-data.ttl`:
- curl -XPOST --data-binary @fu-data.ttl \
- --header 'Content-type: text/turtle' \
- 'http://localhost:3030/ds?default'
+```bash
+curl -XPOST --data-binary @fu-data.ttl \
+ --header 'Content-type: text/turtle' \
+ 'http://localhost:3030/ds?default'
+```
Validate with shapes in `fu-shapes.ttl` and get back a validation report:
- curl -XPOST --data-binary @fu-shapes.ttl \
- --header 'Content-type: text/turtle' \
- 'http://localhost:3030/ds/shacl?graph=default'
+```bash
+curl -XPOST --data-binary @fu-shapes.ttl \
+ --header 'Content-type: text/turtle' \
+ 'http://localhost:3030/ds/shacl?graph=default'
+```
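The response body is a SHACL validation report in RDF. When the data conforms, the report has this general shape (a sketch using the standard report vocabulary):

```turtle
PREFIX sh: <http://www.w3.org/ns/shacl#>

[ a sh:ValidationReport ;
  sh:conforms true
] .
```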
## API
@@ -92,27 +104,29 @@ The package `org.apache.jena.shacl` has the main classes.
## API Examples
-https://github.com/apache/jena/tree/main/jena-examples/src/main/java/shacl/examples/
+<https://github.com/apache/jena/tree/main/jena-examples/src/main/java/shacl/examples/>
Example
[`Shacl01_validateGraph`](
https://github.com/apache/jena/tree/main/jena-shacl/src/main/java/org/apache/jena/shacl/examples/Shacl01_validateGraph.java)
shows validation and printing of the validation report in a text form and in RDF:
- public static void main(String ...args) {
- String SHAPES = "shapes.ttl";
- String DATA = "data1.ttl";
+```java
+public static void main(String ...args) {
+ String SHAPES = "shapes.ttl";
+ String DATA = "data1.ttl";
- Graph shapesGraph = RDFDataMgr.loadGraph(SHAPES);
- Graph dataGraph = RDFDataMgr.loadGraph(DATA);
+ Graph shapesGraph = RDFDataMgr.loadGraph(SHAPES);
+ Graph dataGraph = RDFDataMgr.loadGraph(DATA);
- Shapes shapes = Shapes.parse(shapesGraph);
+ Shapes shapes = Shapes.parse(shapesGraph);
- ValidationReport report = ShaclValidator.get().validate(shapes, dataGraph);
- ShLib.printReport(report);
- System.out.println();
- RDFDataMgr.write(System.out, report.getModel(), Lang.TTL);
- }
+ ValidationReport report = ShaclValidator.get().validate(shapes, dataGraph);
+ ShLib.printReport(report);
+ System.out.println();
+ RDFDataMgr.write(System.out, report.getModel(), Lang.TTL);
+}
+```
Example
[`Shacl02_validateTransaction`](https://github.com/apache/jena/tree/main/jena-shacl/src/main/java/org/apache/jena/shacl/examples/Shacl02_validateTransaction.java)
@@ -128,11 +142,13 @@ for both reading and writing.
The file extensions for SHACL-C are `.shc` and `.shaclc` and there is a registered language
constant `Lang.SHACLC`.
- RDFDataMgr.load("shapes.shc");
+```java
+Graph shapesGraph = RDFDataMgr.loadGraph("shapes.shc");
- RDFDataMgr.read("file:compactShapes", Lang.SHACLC);
+RDFDataMgr.read(shapesGraph, "file:compactShapes", Lang.SHACLC);
- RDFDataMgr.write(System.out, shapesGraph, Lang.SHACLC);
+RDFDataMgr.write(System.out, shapesGraph, Lang.SHACLC);
+```
SHACL-C is managed by the SHACL Community Group. It does not cover all possible shapes.
When outputting SHACL-C, SHACL shapes not expressible in SHACL-C will cause an
@@ -158,7 +174,7 @@ SPARQL-based targets allow the target nodes to be calculated with a SPARQL
See [SPARQL-based targets](https://w3c.github.io/shacl/shacl-af/#SPARQLTarget)
for details.
-```
+```turtle
ex:example
sh:target [
a sh:SPARQLTarget ;
@@ -178,7 +194,8 @@ When given a `ValidationListener` the SHACL validation code emits events at each
* when validation of a constraint begins, ends and yields positive or negative results
For example, the following listener will just record all events in a List:
-```
+
+```java
public class RecordingValidationListener implements ValidationListener {
private final List<ValidationEvent> events = new ArrayList<>();
@@ -191,47 +208,52 @@ public class RecordingValidationListener implements ValidationListener {
}
}
```
+
The listener must be passed when the `ValidationContext` is created.
The following example validates the `dataGraph` according to the `shapesGraph` using the ValidationListener above:
-```
- Graph shapesGraph = RDFDataMgr.loadGraph(shapesGraphUri); //assuming shapesGraphUri points to an RDF file
- Graph dataGraph = RDFDataMgr.loadGraph(dataGraphUri); //assuming dataGraphUri points to an RDF file
- RecordingValidationListener listener = new RecordingValidationListener(); // see above
- Shapes shapes = Shapes.parse(shapesGraph);
- ValidationContext vCtx = ValidationContext.create(shapes, dataGraph, listener); // pass listener here
- for (Shape shape : shapes.getTargetShapes()) {
- Collection<Node> focusNodes = VLib.focusNodes(dataGraph, shape);
- for (Node focusNode : focusNodes) {
- VLib.validateShape(vCtx, dataGraph, shape, focusNode);
- }
+
+```java
+Graph shapesGraph = RDFDataMgr.loadGraph(shapesGraphUri); //assuming shapesGraphUri points to an RDF file
+Graph dataGraph = RDFDataMgr.loadGraph(dataGraphUri); //assuming dataGraphUri points to an RDF file
+RecordingValidationListener listener = new RecordingValidationListener(); // see above
+Shapes shapes = Shapes.parse(shapesGraph);
+ValidationContext vCtx = ValidationContext.create(shapes, dataGraph, listener); // pass listener here
+for (Shape shape : shapes.getTargetShapes()) {
+ Collection<Node> focusNodes = VLib.focusNodes(dataGraph, shape);
+ for (Node focusNode : focusNodes) {
+ VLib.validateShape(vCtx, dataGraph, shape, focusNode);
}
- List<ValidationEvent> actualEvents = listener.getEvents(); // all events have been recorded
+}
+List<ValidationEvent> actualEvents = listener.getEvents(); // all events have been recorded
```
The events thus generated might look like this (`event.toString()`, one per line):
-```
- FocusNodeValidationStartedEvent{focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape]}
- ConstraintEvaluationForNodeShapeStartedEvent{constraint=ClassConstraint[<http://datashapes.org/sh/tests/core/node/class-001.test#Person>], focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape]}
- ConstraintEvaluatedOnFocusNodeEvent{constraint=ClassConstraint[<http://datashapes.org/sh/tests/core/node/class-001.test#Person>], focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape], valid=true}
- ConstraintEvaluationForNodeShapeFinishedEvent{constraint=ClassConstraint[<http://datashapes.org/sh/tests/core/node/class-001.test#Person>], focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape]}
- FocusNodeValidationFinishedEvent{focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape]}
+
+```text
+FocusNodeValidationStartedEvent{focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape]}
+ConstraintEvaluationForNodeShapeStartedEvent{constraint=ClassConstraint[<http://datashapes.org/sh/tests/core/node/class-001.test#Person>], focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape]}
+ConstraintEvaluatedOnFocusNodeEvent{constraint=ClassConstraint[<http://datashapes.org/sh/tests/core/node/class-001.test#Person>], focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape], valid=true}
+ConstraintEvaluationForNodeShapeFinishedEvent{constraint=ClassConstraint[<http://datashapes.org/sh/tests/core/node/class-001.test#Person>], focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape]}
+FocusNodeValidationFinishedEvent{focusNode=http://datashapes.org/sh/tests/core/node/class-001.test#Someone, shape=NodeShape[http://datashapes.org/sh/tests/core/node/class-001.test#TestShape]}
[...]
```
+
Many use cases can be addressed with the `HandlerBasedValidationListener`, which allows for registering event handlers on a per-event basis.
For example:
-```
- ValidationListener myListener = HandlerBasedValidationListener
- .builder()
- .forEventType(FocusNodeValidationStartedEvent.class)
- .addSimpleHandler(e -> {
- // ...
+
+```java
+ValidationListener myListener = HandlerBasedValidationListener
+ .builder()
+ .forEventType(FocusNodeValidationStartedEvent.class)
+ .addSimpleHandler(e -> {
+ // ...
+ })
+ .forEventType(ConstraintEvaluatedEvent.class)
+ .addHandler(c -> c
+ .iff(EventPredicates.isValid()) // use a Predicate<ValidationEvent> to select events
+ .handle(e -> {
+ // ...
})
- .forEventType(ConstraintEvaluatedEvent.class)
- .addHandler(c -> c
- .iff(EventPredicates.isValid()) // use a Predicate<ValidationEvent> to select events
- .handle(e -> {
- // ...
- })
- )
- .build();
+ )
+ .build();
```
diff --git a/source/documentation/shex/__index.md b/source/documentation/shex/__index.md
index 16b794e26..c8fb02afe 100644
--- a/source/documentation/shex/__index.md
+++ b/source/documentation/shex/__index.md
@@ -6,11 +6,8 @@ slug: index
`jena-shex` is an implementation of the
[ShEx (Shape Expressions)](https://shex.io) language.
-<p>
-<i>This implementation is experimental, starting with Jena 4.2.0.
-Please send usage reports and experience to </i>
-<tt>users@jena.apache.org</tt>.
-</p>
+This implementation is experimental, starting with Jena 4.2.0.
+Please send usage reports and experience to `users@jena.apache.org`.
## Status
@@ -25,18 +22,22 @@ scoped to the file, and not retained after the file has been read.
## Command line
-The command `shex` introduces shex operations; it takes a sub-command
+The command `shex` introduces ShEx operations; it takes a sub-command
argument.
To validate:
-<pre>shex validate --schema SCHEMA.shex --map MAP.smap --data DATA.ttl</pre>
-<pre>shex v -s SCHEMA.shex -m MAP.smap -d data.ttl</pre>
+```bash
+shex validate --schema SCHEMA.shex --map MAP.smap --data DATA.ttl
+shex v -s SCHEMA.shex -m MAP.smap -d data.ttl
+```
To parse a file:
-<pre>shex parse <i>FILE</i></pre>
-<pre>shex p <i>FILE</i></pre>
+```bash
+shex parse FILE
+shex p FILE
+```
which writes out the parser results in a text format.
@@ -57,30 +58,30 @@ The package `org.apache.jena.shex` has the main classes.
Examples:
-https://github.com/apache/jena/tree/main/jena-examples/src/main/java/shex/examples/
+<https://github.com/apache/jena/tree/main/jena-examples/src/main/java/shex/examples/>
-```
- public static void main(String ...args) {
- String SHAPES = "examples/schema.shex";
- String SHAPES_MAP = "examples/shape-map.shexmap";
- String DATA = "examples/data.ttl";
+```java
+public static void main(String ...args) {
+ String SHAPES = "examples/schema.shex";
+ String SHAPES_MAP = "examples/shape-map.shexmap";
+ String DATA = "examples/data.ttl";
- System.out.println("Read data");
- Graph dataGraph = RDFDataMgr.loadGraph(DATA);
+ System.out.println("Read data");
+ Graph dataGraph = RDFDataMgr.loadGraph(DATA);
- System.out.println("Read schema");
- ShexSchema shapes = Shex.readSchema(SHAPES);
+ System.out.println("Read schema");
+ ShexSchema shapes = Shex.readSchema(SHAPES);
- // Shapes map.
- System.out.println("Read shapes map");
- ShapeMap shapeMap = Shex.readShapeMap(SHAPES_MAP);
+ // Shapes map.
+ System.out.println("Read shapes map");
+ ShapeMap shapeMap = Shex.readShapeMap(SHAPES_MAP);
- // ShexReport
- System.out.println("Validate");
- ShexReport report = ShexValidator.get().validate(dataGraph, shapes, shapeMap);
+ // ShexReport
+ System.out.println("Validate");
+ ShexReport report = ShexValidator.get().validate(dataGraph, shapes, shapeMap);
- System.out.println();
- // Print report.
- ShexLib.printReport(report);
- }
+ System.out.println();
+ // Print report.
+ ShexLib.printReport(report);
+}
```