Posted to commits@jena.apache.org by an...@apache.org on 2016/12/22 17:55:02 UTC
[02/11] jena git commit: JENA-1267: Module jena-rdfconnection
JENA-1267: Module jena-rdfconnection
Project: http://git-wip-us.apache.org/repos/asf/jena/repo
Commit: http://git-wip-us.apache.org/repos/asf/jena/commit/2e26b78e
Tree: http://git-wip-us.apache.org/repos/asf/jena/tree/2e26b78e
Diff: http://git-wip-us.apache.org/repos/asf/jena/diff/2e26b78e
Branch: refs/heads/master
Commit: 2e26b78edcf1b92e1b8b22acc53e9294fcc9f5ea
Parents: c10dff4
Author: Andy Seaborne <an...@apache.org>
Authored: Fri Dec 16 15:02:10 2016 +0000
Committer: Andy Seaborne <an...@apache.org>
Committed: Fri Dec 16 15:02:10 2016 +0000
----------------------------------------------------------------------
jena-rdfconnection/Documentation.md | 205 ++++++++
jena-rdfconnection/LICENSE | 177 +++++++
jena-rdfconnection/NOTICE | 5 +
jena-rdfconnection/README.md | 6 +
jena-rdfconnection/pom.xml | 163 +++++++
.../rdfconnection/JenaConnectionException.java | 29 ++
.../org/apache/jena/rdfconnection/RDFConn.java | 40 ++
.../jena/rdfconnection/RDFConnection.java | 362 ++++++++++++++
.../rdfconnection/RDFConnectionFactory.java | 84 ++++
.../jena/rdfconnection/RDFConnectionLocal.java | 286 +++++++++++
.../rdfconnection/RDFConnectionModular.java | 199 ++++++++
.../jena/rdfconnection/RDFConnectionRemote.java | 478 +++++++++++++++++++
.../RDFDatasetAccessConnection.java | 57 +++
.../rdfconnection/RDFDatasetConnection.java | 150 ++++++
.../rdfconnection/SparqlQueryConnection.java | 105 ++++
.../rdfconnection/SparqlUpdateConnection.java | 53 ++
.../examples/RDFConnectionExample1.java | 45 ++
.../examples/RDFConnectionExample2.java | 67 +++
.../examples/RDFConnectionExample3.java | 42 ++
jena-rdfconnection/src/main/resources/LICENSE | 177 +++++++
jena-rdfconnection/src/main/resources/NOTICE | 5 +
.../AbstractTestRDFConnection.java | 382 +++++++++++++++
.../jena/rdfconnection/TS_RDFConnection.java | 32 ++
.../TestRDFConnectionLocalMRSW.java | 38 ++
.../TestRDFConnectionLocalTxnMem.java | 38 ++
.../src/test/resources/log4j.properties | 40 ++
.../testing/RDFConnection/data.trig | 16 +
.../testing/RDFConnection/data.ttl | 6 +
28 files changed, 3287 insertions(+)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/Documentation.md
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/Documentation.md b/jena-rdfconnection/Documentation.md
new file mode 100644
index 0000000..af5ff6a
--- /dev/null
+++ b/jena-rdfconnection/Documentation.md
@@ -0,0 +1,205 @@
+# RDF Connection : SPARQL operations API
+
+`RDFConnection` provides a unified set of operations for working on RDF
+with SPARQL operations. It provides <a
+href="http://www.w3.org/TR/sparql11-query/">SPARQL Query</a>, <a
+href="http://www.w3.org/TR/sparql11-update/">SPARQL Update</a> and the <a
+href="http://www.w3.org/TR/sparql11-http-rdf-update/">SPARQL Graph
+Store</a> operations. The interface is uniform - the same interface
+applies to local data and to remote data using HTTP and the SPARQL
+protocols (<a href="http://www.w3.org/TR/sparql11-protocol/">SPARQL
+protocol</a> and <a
+href="http://www.w3.org/TR/sparql11-http-rdf-update/">SPARQL Graph Store
+Protocol</a>).
+
+## Outline
+
+`RDFConnection` provides a number of different styles for working with RDF
+data in Java. It provides support for try-resource and functional code
+passing styles, as well as the more basic sequence of method calls.
+
+This first example uses `try`-with-resources to manage the connection, and has
+two operations, one to load some data and one to make a query:
+
+```
+try ( RDFConnection conn = RDFConnectionFactory.connect(...) ) {
+ conn.load("data.ttl") ;
+ conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs)-> {
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: "+subject) ;
+ }) ;
+}
+```
+This could have been written as (approximately -- the error handling is better
+in the example above):
+
+```
+RDFConnection conn = RDFConnectionFactory.connect(...) ;
+conn.load("data.ttl") ;
+QueryExecution qExec = conn.query("SELECT DISTINCT ?s { ?s ?p ?o }") ;
+ResultSet rs = qExec.execSelect() ;
+while(rs.hasNext()) {
+ QuerySolution qs = rs.next() ;
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: "+subject) ;
+}
+qExec.close() ;
+conn.close() ;
+```
+
+Jena also provides a separate
+[SPARQL over JDBC driver](http://jena.apache.org/documentation/jdbc/index.html)
+library.
+
+## Transactions
+
+Transactions are the preferred way to work with RDF data.
+Operations on an `RDFConnection` outside of an application-controlled
+transaction will cause the system to add one for the duration of the
+operation. This "autocommit" feature may lead to inefficient operations due
+to excessive overhead.
+
+The `Txn` class provides a Java 8-style transaction API. The transaction body
+is code passed to the `Txn` library, which handles the transaction lifecycle.
+
+```
+try ( RDFConnection conn = RDFConnectionFactory.connect(...) ) {
+ Txn.execWrite(conn, ()-> {
+ conn.load("data1.ttl") ;
+ conn.load("data2.ttl") ;
+ conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs)-> {
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: "+subject) ;
+ }) ;
+ }) ;
+}
+```
+
+The traditional style of explicit `begin`, `commit`, `abort` is also available.
+
+```
+try ( RDFConnection conn = RDFConnectionFactory.connect(...) ) {
+ conn.begin(ReadWrite.WRITE) ;
+ try {
+ conn.load("data1.ttl") ;
+ conn.load("data2.ttl") ;
+ conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs)-> {
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: "+subject) ;
+ }) ;
+ conn.commit() ;
+ } finally { conn.end() ; }
+}
+```
+
+The use of `try-finally` ensures that transactions are properly finished.
+The `conn.end()` provides an abort in case an exception occurs in the
+transaction and a commit has not been issued.
+
+`Txn` wraps these steps up and calls the application-supplied code for the
+transaction body.
+
+### Remote Transactions
+
+SPARQL does not define a standard protocol for remote transactions. Each remote
+operation should be atomic (it all happens or nothing happens) - this is the
+responsibility of the remote server.
+
+An `RDFConnection` will at least provide the client-side locking features.
+This means that overlapping operations that change data are naturally
+handled by the transaction pattern within a single JVM.
+
+## Graph Store Protocol
+
+The <a href="http://www.w3.org/TR/sparql11-http-rdf-update/">SPARQL Graph
+Store Protocol</a> is a set of operations to work on whole graphs in a
+dataset. It provides a standardised way to manage the data in a dataset.
+
+The operations are to fetch a graph, set the RDF data in a graph,
+add more RDF data into a graph, and delete a graph from a dataset.
+
+For example: load two files:
+```
+ try ( RDFConnection conn = RDFConnectionFactory.connect(...) ) {
+ conn.load("data1.ttl") ;
+ conn.load("data2.nt") ;
+ }
+```
+The file extension is used to determine the syntax.
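+
+The named graph operations follow the same style (a sketch - the graph name
+and file names here are illustrative; the method names follow
+`RDFDatasetConnection`):
+
+```
+ try ( RDFConnection conn = RDFConnectionFactory.connect(...) ) {
+ // Add data into a named graph.
+ conn.load("http://example/graph", "data1.ttl") ;
+ // Fetch the named graph as a Model.
+ Model model = conn.fetch("http://example/graph") ;
+ // Replace the contents of the named graph.
+ conn.put("http://example/graph", "data2.ttl") ;
+ // Remove the named graph from the dataset.
+ conn.delete("http://example/graph") ;
+ }
+```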
+
+There is also a set of scripts to help do these operations from the command
+line with <a href="http://jena.apache.org/documentation/fuseki2/soh.html"
+>SOH</a>. It is possible to write curl scripts as well.
+
+In addition, `RDFConnection` provides an extension to give the same style
+of operation to work on a whole dataset (deleting the dataset is not
+provided).
+
+```
+ conn.loadDataset("data-complete.trig") ;
+```
+
+## Query Usage
+
+`RDFConnection` provides methods for each of the SPARQL query forms (`SELECT`,
+`CONSTRUCT`, `DESCRIBE`, `ASK`) as well as a way to get the lower level
+`QueryExecution` for specialized configuration.
+
+When creating a `QueryExecution` explicitly, care should be taken to close
+it. If the application wishes to capture the result set from a SELECT query and
+retain it beyond the lifetime of the transaction or `QueryExecution`, then
+the application should create a copy which is not attached to any external
+system with `ResultSetFactory.copyResults`.
+
+```
+ try ( RDFConnection conn = RDFConnectionFactory.connect("foo") ) {
+ ResultSet safeCopy =
+ Txn.execReadReturn(conn, ()-> {
+ // Process results by row:
+ conn.querySelect("SELECT DISTINCT ?s { ?s ?p ?o }", (qs)->{
+ Resource subject = qs.getResource("s") ;
+ System.out.println("Subject: "+subject) ;
+ }) ;
+ ResultSet rs = conn.query("SELECT * { ?s ?p ?o }").execSelect() ;
+ return ResultSetFactory.copyResults(rs) ;
+ }) ;
+ }
+```
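+
+The other query forms follow the same pattern (a sketch: `queryAsk`,
+`queryConstruct` and `queryDescribe` return their results directly, and the
+queries here are illustrative):
+
+```
+ try ( RDFConnection conn = RDFConnectionFactory.connect(...) ) {
+ boolean exists = conn.queryAsk("ASK { ?s ?p ?o }") ;
+ Model mConstruct = conn.queryConstruct("CONSTRUCT WHERE { ?s ?p ?o }") ;
+ Model mDescribe = conn.queryDescribe("DESCRIBE <http://example/item>") ;
+ }
+```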
+
+## Update Usage
+
+SPARQL Update operations can be performed and mixed with other operations.
+
+```
+ try ( RDFConnection conn = RDFConnectionFactory.connect(...) ) {
+ Txn.execWrite(conn, ()-> {
+ conn.update("DELETE DATA { ... }" ) ;
+ conn.load("data.ttl") ;
+ }) ;
+ }
+```
+
+## Dataset operations
+
+In addition to the SPARQL Graph Store Protocol, operations on whole
+datasets are provided for fetching (HTTP GET), adding data (HTTP POST) and
+setting the data (HTTP PUT) on a dataset URL. This assumes the remote
+server supports these REST-style operations. Apache Jena Fuseki does
+provide these.
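+
+In outline (a sketch - the file names are illustrative):
+
+```
+ try ( RDFConnection conn = RDFConnectionFactory.connect(...) ) {
+ // HTTP GET on the dataset URL.
+ Dataset ds = conn.fetchDataset() ;
+ // HTTP POST - add data to the dataset.
+ conn.loadDataset("extra-data.trig") ;
+ // HTTP PUT - replace the data in the dataset.
+ conn.putDataset("replacement-data.trig") ;
+ }
+```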
+
+## Subinterfaces
+
+To help structure code, the `RDFConnection` consists of a number of
+different interfaces. An `RDFConnection` can be passed to application code
+as one of these interfaces so that only certain subsets of the full
+operations are visible to the called code.
+
+* query via `SparqlQueryConnection`
+* update via `SparqlUpdateConnection`
+* graph store protocol via `RDFDatasetAccessConnection` (read operations),
+ and `RDFDatasetConnection` (read and write operations).
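+
+For example, code that only needs to run queries can be given the narrower
+interface (a sketch - `doReport` is an illustrative application method):
+
+```
+ // Only the query operations are visible to this method.
+ void doReport(SparqlQueryConnection queryConn) {
+ queryConn.querySelect("SELECT * { ?s ?p ?o }", (qs)-> System.out.println(qs)) ;
+ }
+```
+
+Any `RDFConnection` can be passed to such a method because `RDFConnection`
+extends `SparqlQueryConnection`.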
+
+## Examples
+
+https://github.com/afs/jena-rdfconnection/tree/master/src/main/java/rdfconnection/examples
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/LICENSE
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/LICENSE b/jena-rdfconnection/LICENSE
new file mode 100644
index 0000000..f433b1a
--- /dev/null
+++ b/jena-rdfconnection/LICENSE
@@ -0,0 +1,177 @@
+
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
+
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/NOTICE
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/NOTICE b/jena-rdfconnection/NOTICE
new file mode 100644
index 0000000..81363e4
--- /dev/null
+++ b/jena-rdfconnection/NOTICE
@@ -0,0 +1,5 @@
+Apache Jena - RDF Connection
+Copyright 2016 The Apache Software Foundation
+
+This code is licensed under an Apache Software License.
+See LICENSE.
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/README.md
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/README.md b/jena-rdfconnection/README.md
new file mode 100644
index 0000000..1fee5e2
--- /dev/null
+++ b/jena-rdfconnection/README.md
@@ -0,0 +1,6 @@
+RDF Connection
+==============
+
+An API for SPARQL client functionality.
+
+Works for local and remote data.
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/pom.xml
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/pom.xml b/jena-rdfconnection/pom.xml
new file mode 100644
index 0000000..c58cce4
--- /dev/null
+++ b/jena-rdfconnection/pom.xml
@@ -0,0 +1,163 @@
+<?xml version="1.0" encoding="utf-8"?>
+
+<!--
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+
+ See the NOTICE file distributed with this work for additional
+ information regarding copyright ownership.
+-->
+
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
+
+ <modelVersion>4.0.0</modelVersion>
+
+ <groupId>org.apache.jena</groupId>
+ <artifactId>jena-rdfconnection</artifactId>
+ <packaging>jar</packaging>
+ <name>Apache Jena - RDF Connection</name>
+ <version>3.2.0-SNAPSHOT</version>
+
+ <description>RDF Connection</description>
+
+ <parent>
+ <groupId>org.apache.jena</groupId>
+ <artifactId>jena-parent</artifactId>
+ <version>18-SNAPSHOT</version>
+ <relativePath>../jena-parent</relativePath>
+ </parent>
+
+ <licenses>
+ <license>
+ <name>Apache 2.0 License</name>
+ <url>http://www.apache.org/licenses/LICENSE-2.0</url>
+ </license>
+ </licenses>
+
+ <dependencies>
+ <dependency>
+ <groupId>org.apache.jena</groupId>
+ <artifactId>jena-arq</artifactId>
+ <version>3.2.0-SNAPSHOT</version>
+ </dependency>
+
+ <dependency>
+ <groupId>org.apache.jena</groupId>
+ <artifactId>jena-base</artifactId>
+ <version>3.2.0-SNAPSHOT</version>
+ <classifier>tests</classifier>
+ <scope>test</scope>
+ </dependency>
+
+ <dependency>
+ <groupId>org.apache.jena</groupId>
+ <artifactId>jena-arq</artifactId>
+ <version>3.2.0-SNAPSHOT</version>
+ <classifier>tests</classifier>
+ <scope>test</scope>
+ </dependency>
+
+ <dependency>
+ <groupId>org.apache.jena</groupId>
+ <artifactId>jena-core</artifactId>
+ <classifier>tests</classifier>
+ <version>3.2.0-SNAPSHOT</version>
+ <scope>test</scope>
+ </dependency>
+
+ <!-- Testing -->
+ <!-- Test is also done in jena-integration-tests -->
+
+ <dependency>
+ <groupId>org.slf4j</groupId>
+ <artifactId>slf4j-log4j12</artifactId>
+ <scope>test</scope>
+ </dependency>
+
+ <dependency>
+ <groupId>log4j</groupId>
+ <artifactId>log4j</artifactId>
+ <scope>test</scope>
+ </dependency>
+
+ <dependency>
+ <groupId>junit</groupId>
+ <artifactId>junit</artifactId>
+ <scope>test</scope>
+ </dependency>
+ </dependencies>
+
+ <build>
+ <plugins>
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-compiler-plugin</artifactId>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-surefire-plugin</artifactId>
+ <configuration>
+ <includes>
+ <include>**/TS_*.java</include>
+ </includes>
+ </configuration>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-jar-plugin</artifactId>
+ <executions>
+ <execution>
+ <goals>
+ <goal>test-jar</goal>
+ </goals>
+ </execution>
+ </executions>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-javadoc-plugin</artifactId>
+ <configuration>
+ <version>true</version>
+ <show>public</show>
+ <quiet>true</quiet>
+ <encoding>UTF-8</encoding>
+ <windowtitle>Apache Jena RDF Connection</windowtitle>
+ <doctitle>Apache Jena RDF Connection ${project.version}</doctitle>
+ <bottom>Licenced under the Apache License, Version 2.0</bottom>
+ </configuration>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.maven.plugins</groupId>
+ <artifactId>maven-resources-plugin</artifactId>
+ </plugin>
+
+ <plugin>
+ <groupId>org.apache.rat</groupId>
+ <artifactId>apache-rat-plugin</artifactId>
+ <configuration>
+ <excludes>
+ <exclude>Documentation.md</exclude>
+ <exclude>README.*</exclude>
+ </excludes>
+ </configuration>
+ </plugin>
+ </plugins>
+
+ </build>
+
+</project>
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/JenaConnectionException.java
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/JenaConnectionException.java b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/JenaConnectionException.java
new file mode 100644
index 0000000..3bc6296
--- /dev/null
+++ b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/JenaConnectionException.java
@@ -0,0 +1,29 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.jena.rdfconnection;
+
+import org.apache.jena.shared.JenaException ;
+
+/** Exceptions related to {@link RDFConnection} */
+public class JenaConnectionException extends JenaException {
+ public JenaConnectionException() { super(); }
+ public JenaConnectionException(String message) { super(message); }
+ public JenaConnectionException(Throwable cause) { super(cause) ; }
+ public JenaConnectionException(String message, Throwable cause) { super(message, cause) ; }
+}
\ No newline at end of file
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConn.java
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConn.java b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConn.java
new file mode 100644
index 0000000..27d8f96
--- /dev/null
+++ b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConn.java
@@ -0,0 +1,40 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.jena.rdfconnection;
+
+/** package-wide utilities etc */
+/*package*/ class RDFConn {
+ private static String dftName = "default" ;
+
+ /*package*/ static boolean isDefault(String name) {
+ return name == null || name.equals(dftName) ;
+ }
+
+ /*package*/ static String queryStringForGraph(String graphName) {
+ return
+ (RDFConn.isDefault(graphName) )
+ ? "?default"
+ : "?graph="+graphName ;
+ }
+
+ /*package*/ static String urlForGraph(String graphStoreProtocolService, String graphName) {
+ return graphStoreProtocolService + queryStringForGraph(graphName) ;
+ }
+
+}
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnection.java
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnection.java b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnection.java
new file mode 100644
index 0000000..2610bb3
--- /dev/null
+++ b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnection.java
@@ -0,0 +1,362 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.jena.rdfconnection;
+
+import java.util.function.Consumer ;
+
+import org.apache.jena.query.* ;
+import org.apache.jena.rdf.model.Model ;
+import org.apache.jena.sparql.core.Transactional ;
+import org.apache.jena.system.Txn ;
+import org.apache.jena.update.Update ;
+import org.apache.jena.update.UpdateFactory ;
+import org.apache.jena.update.UpdateRequest ;
+
+/**
+ * Interface for SPARQL operations on a dataset, whether local or remote.
+ * Operations can be performed via this interface or via the various
+ * interfaces for a subset of the operations.
+ *
+ * <ul>
+ * <li>query ({@link SparqlQueryConnection})
+ * <li>update ({@link SparqlUpdateConnection})
+ * <li>graph store protocol ({@link RDFDatasetConnection}).
+ * </ul>
+ *
+ * For remote operations, the
+ * <a href="http://www.w3.org/TR/sparql11-protocol/">SPARQL Protocol</a> is used
+ * for query and updates and
+ * <a href="http://www.w3.org/TR/sparql11-http-rdf-update/">SPARQL Graph Store
+ * Protocol</a> for the graph operations and in addition, there are analogous
+ * operations on datasets (fetch, load, put; but not delete).
+ *
+ * {@code RDFConnection} provides transaction boundaries. If not in a
+ * transaction, an implicit transactional wrapper is applied ("autocommit").
+ *
+ * Remote SPARQL operations are atomic but without additional capabilities from
+ * the remote server, multiple operations are not combined into a single
+ * transaction.
+ *
+ * Not all implementations provide all operations; see the implementation
+ * notes for details.
+ *
+ * @see RDFConnectionFactory
+ * @see RDFConnectionLocal
+ * @see RDFConnectionRemote
+ */
+public interface RDFConnection
+ // Default implementations could be pushed up but then they can't be mentioned here.
+ extends
+ SparqlQueryConnection, SparqlUpdateConnection, RDFDatasetConnection,
+ Transactional, AutoCloseable
+ {
+ // Inherits interfaces : re-mentioned to get the javadoc in one place.
+
+ // ---- SparqlQueryConnection
+
+ /**
+ * Execute a SELECT query and process the ResultSet with the handler code.
+ * @param query
+ * @param resultSetAction
+ */
+ @Override
+ public default void queryResultSet(String query, Consumer<ResultSet> resultSetAction) {
+ queryResultSet(QueryFactory.create(query), resultSetAction) ;
+ }
+
+ /**
+ * Execute a SELECT query and process the ResultSet with the handler code.
+ * @param query
+ * @param resultSetAction
+ */
+ @Override
+ public default void queryResultSet(Query query, Consumer<ResultSet> resultSetAction) {
+ if ( ! query.isSelectType() )
+ throw new JenaConnectionException("Query is not a SELECT query") ;
+
+ Txn.executeRead(this, ()->{
+ try ( QueryExecution qExec = query(query) ) {
+ ResultSet rs = qExec.execSelect() ;
+ resultSetAction.accept(rs);
+ }
+ } ) ;
+ }
+
+ /**
+ * Execute a SELECT query and process the rows of the results with the handler code.
+ * @param query
+ * @param rowAction
+ */
+ @Override
+ public default void querySelect(String query, Consumer<QuerySolution> rowAction) {
+ querySelect(QueryFactory.create(query), rowAction) ;
+ }
+
+ /**
+ * Execute a SELECT query and process the rows of the results with the handler code.
+ * @param query
+ * @param rowAction
+ */
+ @Override
+ public default void querySelect(Query query, Consumer<QuerySolution> rowAction) {
+ if ( ! query.isSelectType() )
+ throw new JenaConnectionException("Query is not a SELECT query") ;
+ Txn.executeRead(this, ()->{
+ try ( QueryExecution qExec = query(query) ) {
+ qExec.execSelect().forEachRemaining(rowAction);
+ }
+ } ) ;
+ }
+
+ /** Execute a CONSTRUCT query and return as a Model */
+ @Override
+ public default Model queryConstruct(String query) {
+ return queryConstruct(QueryFactory.create(query)) ;
+ }
+
+ /** Execute a CONSTRUCT query and return as a Model */
+ @Override
+ public default Model queryConstruct(Query query) {
+ return
+ Txn.calculateRead(this, ()->{
+ try ( QueryExecution qExec = query(query) ) {
+ return qExec.execConstruct() ;
+ }
+ } ) ;
+ }
+
+ /** Execute a DESCRIBE query and return as a Model */
+ @Override
+ public default Model queryDescribe(String query) {
+ return queryDescribe(QueryFactory.create(query)) ;
+ }
+
+ /** Execute a DESCRIBE query and return as a Model */
+ @Override
+ public default Model queryDescribe(Query query) {
+ return
+ Txn.calculateRead(this, ()->{
+ try ( QueryExecution qExec = query(query) ) {
+ return qExec.execDescribe() ;
+ }
+ } ) ;
+ }
+
+ /** Execute an ASK query and return a boolean */
+ @Override
+ public default boolean queryAsk(String query) {
+ return queryAsk(QueryFactory.create(query)) ;
+ }
+
+ /** Execute an ASK query and return a boolean */
+ @Override
+ public default boolean queryAsk(Query query) {
+ return
+ Txn.calculateRead(this, ()->{
+ try ( QueryExecution qExec = query(query) ) {
+ return qExec.execAsk() ;
+ }
+ } ) ;
+ }
+
+ /** Set up a SPARQL query execution.
+ *
+ * See also {@link #querySelect(Query, Consumer)}, {@link #queryConstruct(Query)},
+ * {@link #queryDescribe(Query)}, {@link #queryAsk(Query)}
+ * for ways to execute queries of a specific form.
+ *
+ * @param query
+ * @return QueryExecution
+ */
+ @Override
+ public QueryExecution query(Query query) ;
+
+ /** Set up a SPARQL query execution.
+ *
+ * See also {@link #querySelect(String, Consumer)}, {@link #queryConstruct(String)},
+ * {@link #queryDescribe(String)}, {@link #queryAsk(String)}
+ * for ways to execute queries of a specific form.
+ *
+ * @param queryString
+ * @return QueryExecution
+ */
+ @Override
+ public default QueryExecution query(String queryString) {
+ return query(QueryFactory.create(queryString)) ;
+ }
+
+ // ---- SparqlUpdateConnection
+
+ /** Execute a SPARQL Update.
+ *
+ * @param update
+ */
+ @Override
+ public default void update(Update update) {
+ update(new UpdateRequest(update)) ;
+ }
+
+ /** Execute a SPARQL Update.
+ *
+ * @param update
+ */
+ @Override
+ public void update(UpdateRequest update) ;
+
+ /** Execute a SPARQL Update.
+ *
+ * @param updateString
+ */
+ @Override
+ public default void update(String updateString) {
+ update(UpdateFactory.create(updateString)) ;
+ }
+
+ // ---- RDFDatasetConnection
+
+ /** Load (add, append) RDF into a named graph in a dataset.
+ * This is SPARQL Graph Store Protocol HTTP POST or equivalent.
+ *
+ * @param graphName Graph name (null or "default" for the default graph)
+ * @param file File of the data.
+ */
+ @Override
+ public void load(String graphName, String file) ;
+
+ /** Load (add, append) RDF into the default graph of a dataset.
+ * This is SPARQL Graph Store Protocol HTTP POST or equivalent.
+ *
+ * @param file File of the data.
+ */
+ @Override
+ public void load(String file) ;
+
+ /** Load (add, append) RDF into a named graph in a dataset.
+ * This is SPARQL Graph Store Protocol HTTP POST or equivalent.
+ *
+ * @param graphName Graph name (null or "default" for the default graph)
+ * @param model Data.
+ */
+ @Override
+ public void load(String graphName, Model model) ;
+
+ /** Load (add, append) RDF into the default graph of a dataset.
+ * This is SPARQL Graph Store Protocol HTTP POST or equivalent.
+ *
+ * @param model Data.
+ */
+ @Override
+ public void load(Model model) ;
+
+ /** Set the contents of a named graph of a dataset.
+ * Any existing data is lost.
+ * This is SPARQL Graph Store Protocol HTTP PUT or equivalent.
+ *
+ * @param graphName Graph name (null or "default" for the default graph)
+ * @param file File of the data.
+ */
+ @Override
+ public void put(String graphName, String file) ;
+
+ /** Set the contents of the default graph of a dataset.
+ * Any existing data is lost.
+ * This is SPARQL Graph Store Protocol HTTP PUT or equivalent.
+ *
+ * @param file File of the data.
+ */
+ @Override
+ public void put(String file) ;
+
+ /** Set the contents of a named graph of a dataset.
+ * Any existing data is lost.
+ * This is SPARQL Graph Store Protocol HTTP PUT or equivalent.
+ *
+ * @param graphName Graph name (null or "default" for the default graph)
+ * @param model Data.
+ */
+ @Override
+ public void put(String graphName, Model model) ;
+
+ /** Set the contents of the default graph of a dataset.
+ * Any existing data is lost.
+ * This is SPARQL Graph Store Protocol HTTP PUT or equivalent.
+ *
+ * @param model Data.
+ */
+ @Override
+ public void put(Model model) ;
+
+ /**
+ * Delete a graph from the dataset.
+ * Null or "default" means the default graph, which is cleared, not removed.
+ *
+ * @param graphName
+ */
+ @Override
+ public void delete(String graphName) ;
+
+ /**
+ * Remove all data from the default graph.
+ */
+ @Override
+ public void delete() ;
+
+ /** Load (add, append) RDF triple or quad data into a dataset. Triples will go into the default graph.
+ * This is not a SPARQL Graph Store Protocol operation.
+ * It is an HTTP POST equivalent to the dataset.
+ */
+ @Override
+ public void loadDataset(String file) ;
+
+ /** Load (add, append) RDF triple or quad data into a dataset. Triples will go into the default graph.
+ * This is not a SPARQL Graph Store Protocol operation.
+ * It is an HTTP POST equivalent to the dataset.
+ */
+ @Override
+ public void loadDataset(Dataset dataset) ;
+
+ /** Set RDF triple or quad data as the dataset contents.
+ * Triples will go into the default graph, quads in named graphs.
+ * This is not a SPARQL Graph Store Protocol operation.
+ * It is an HTTP PUT equivalent to the dataset.
+ */
+ @Override
+ public void putDataset(String file) ;
+
+ /** Set RDF triple or quad data as the dataset contents.
+ * Triples will go into the default graph, quads in named graphs.
+ * This is not a SPARQL Graph Store Protocol operation.
+ * It is an HTTP PUT equivalent to the dataset.
+ */
+ @Override
+ public void putDataset(Dataset dataset) ;
+
+ // /** Clear the dataset - remove all named graphs, clear the default graph. */
+ // public void clearDataset() ;
+
+
+ /** Test whether this connection is closed or not */
+ @Override
+ public boolean isClosed() ;
+
+ /** Close this connection. Use with try-with-resources. */
+ @Override
+ public void close() ;
+}
+
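The interface above is normally used via try-with-resources. A minimal sketch of typical usage, assuming a hypothetical Fuseki dataset at `http://localhost:3030/ds` and a local file `data.ttl` (both are illustrative, not part of this commit):

```java
import org.apache.jena.query.QuerySolution;
import org.apache.jena.rdfconnection.RDFConnection;
import org.apache.jena.rdfconnection.RDFConnectionFactory;

public class RDFConnectionUsage {
    public static void main(String[] args) {
        // Hypothetical endpoint URL; any SPARQL-protocol dataset URL would do.
        try (RDFConnection conn = RDFConnectionFactory.connect("http://localhost:3030/ds")) {
            // Graph Store Protocol POST into the default graph.
            conn.load("data.ttl");
            // SELECT with a per-row handler; runs inside a read transaction.
            conn.querySelect("SELECT ?s WHERE { ?s ?p ?o }",
                (QuerySolution row) -> System.out.println(row.get("s")));
        }
    }
}
```

Each convenience method (`querySelect`, `queryConstruct`, `queryAsk`, ...) wraps its work in an implicit transaction when none is active, per the interface javadoc.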
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionFactory.java
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionFactory.java b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionFactory.java
new file mode 100644
index 0000000..ee78a8d
--- /dev/null
+++ b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionFactory.java
@@ -0,0 +1,84 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.jena.rdfconnection;
+
+import org.apache.jena.query.Dataset ;
+import org.apache.jena.system.JenaSystem ;
+
+// Pool stuff
+public class RDFConnectionFactory {
+ static { JenaSystem.init(); }
+
+ /** Create a connection to a remote location by URL.
+ * This is the URL for the dataset.
+ *
+ * This call assumes the names of services as:
+ * <ul>
+ * <li>SPARQL Query endpoint : "sparql"
+ * <li>SPARQL Update endpoint : "update"
+ * <li>SPARQL Graph Store Protocol : "data"
+ * </ul>
+ * These are the default names in <a href="http://jena.apache.org/documentation/fuseki2">Fuseki</a>.
+ * Other names can be specified using {@link #connect(String, String, String, String)}.
+ *
+ * @param destination
+ * @return RDFConnection
+ * @see #connect(String, String, String, String)
+ */
+ public static RDFConnection connect(String destination) {
+ return new RDFConnectionRemote(destination) ;
+ }
+
+ /** Create a connection to a remote location, giving the full URLs of the
+ * SPARQL query, SPARQL update and graph store protocol services.
+ * Pass null for any service that is not provided.
+ */
+ public static RDFConnection connect(String queryServiceEndpoint,
+ String updateServiceEndpoint,
+ String graphStoreProtocolEndpoint) {
+ return new RDFConnectionRemote(queryServiceEndpoint, updateServiceEndpoint, graphStoreProtocolEndpoint) ;
+ }
+
+
+ /** Create a connection to a remote location by URL.
+ * This is the URL for the dataset.
+ *
+ * This call requires specifying the names of the services, relative to the dataset URL.
+ *
+ */
+ public static RDFConnection connect(String datasetURL,
+ String queryServiceEndpoint,
+ String updateServiceEndpoint,
+ String graphStoreProtocolEndpoint) {
+ return new RDFConnectionRemote(datasetURL, queryServiceEndpoint, updateServiceEndpoint, graphStoreProtocolEndpoint) ;
+ }
+
+ /**
+ * Connect to a local (same JVM) dataset.
+ * @param dataset
+ * @return RDFConnection
+ */
+ public static RDFConnection connect(Dataset dataset) {
+ return new RDFConnectionLocal(dataset) ;
+ }
+
+ //public RDFConnection getFromPool() ;
+}
+
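The factory's `connect` overloads above differ only in how the service endpoints are named. A sketch of the three main variants (the URL is an assumed example; `DatasetFactory.createTxnMem()` is used for the local case):

```java
import org.apache.jena.query.Dataset;
import org.apache.jena.query.DatasetFactory;
import org.apache.jena.rdfconnection.RDFConnection;
import org.apache.jena.rdfconnection.RDFConnectionFactory;

public class ConnectVariants {
    public static void main(String[] args) {
        // Remote: dataset URL; assumes Fuseki default service names "sparql", "update", "data".
        RDFConnection remote = RDFConnectionFactory.connect("http://localhost:3030/ds");
        // Remote: explicit service names, relative to the dataset URL.
        RDFConnection named = RDFConnectionFactory.connect("http://localhost:3030/ds",
                                                           "sparql", "update", "data");
        // Local: same-JVM dataset, no HTTP involved.
        Dataset ds = DatasetFactory.createTxnMem();
        RDFConnection local = RDFConnectionFactory.connect(ds);
        local.close();
        named.close();
        remote.close();
    }
}
```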
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionLocal.java
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionLocal.java b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionLocal.java
new file mode 100644
index 0000000..2f7be62
--- /dev/null
+++ b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionLocal.java
@@ -0,0 +1,286 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.jena.rdfconnection;
+
+import java.util.Objects ;
+
+import org.apache.jena.graph.Graph ;
+import org.apache.jena.query.* ;
+import org.apache.jena.rdf.model.Model ;
+import org.apache.jena.rdf.model.ModelFactory ;
+import org.apache.jena.riot.Lang ;
+import org.apache.jena.riot.RDFDataMgr ;
+import org.apache.jena.riot.RDFLanguages ;
+import org.apache.jena.sparql.ARQException ;
+import org.apache.jena.sparql.core.DatasetGraph ;
+import org.apache.jena.sparql.core.DatasetGraphFactory ;
+import org.apache.jena.sparql.core.DatasetGraphReadOnly ;
+import org.apache.jena.sparql.graph.GraphReadOnly ;
+import org.apache.jena.system.Txn ;
+import org.apache.jena.update.UpdateExecutionFactory ;
+import org.apache.jena.update.UpdateRequest ;
+
+/**
+ *
+ */
+
+public class RDFConnectionLocal implements RDFConnection {
+ // XXX Expose copy-mode
+
+
+ private ThreadLocal<Boolean> transactionActive = ThreadLocal.withInitial(()->false) ;
+ private static boolean isolateByCopy = true ;
+ private Dataset dataset ;
+
+ public RDFConnectionLocal(Dataset dataset) {
+ this.dataset = dataset ;
+ }
+
+ @Override
+ public QueryExecution query(Query query) {
+ checkOpen() ;
+ return Txn.calculateRead(dataset, ()->QueryExecutionFactory.create(query, dataset)) ;
+ }
+
+ @Override
+ public void update(UpdateRequest update) {
+ checkOpen() ;
+ Txn.executeWrite(dataset, ()->UpdateExecutionFactory.create(update, dataset).execute() ) ;
+ }
+
+ @Override
+ public void load(String graph, String file) {
+ checkOpen() ;
+ doPutPost(graph, file, false) ;
+ }
+
+ @Override
+ public void load(String file) {
+ checkOpen() ;
+ doPutPost(null, file, false) ;
+ }
+
+ @Override
+ public void load(String graphName, Model model) {
+ checkOpen() ;
+ Txn.executeWrite(dataset, ()-> {
+ Model modelDst = modelFor(graphName) ;
+ modelDst.add(model) ;
+ }) ;
+ }
+
+ @Override
+ public void load(Model model) {
+ load(null, model) ;
+ }
+
+ /**
+ * There may be differences between local and remote behaviour. A local
+ * connection may return direct references to a dataset, so updates on
+ * returned models could change the dataset. To avoid this, returned
+ * models are isolated (by copy, by default).
+ */
+
+ @Override
+ public Model fetch(String graph) {
+ return Txn.calculateRead(dataset, ()-> {
+ Model model = modelFor(graph) ;
+ return isolate(model) ;
+ }) ;
+ }
+
+ @Override
+ public Model fetch() {
+ checkOpen() ;
+ return fetch(null) ;
+ }
+
+ @Override
+ public void put(String file) {
+ checkOpen() ;
+ doPutPost(null, file, true) ;
+ }
+
+ @Override
+ public void put(String graph, String file) {
+ checkOpen() ;
+ doPutPost(graph, file, true) ;
+ }
+
+ @Override
+ public void put(Model model) {
+ put(null, model) ;
+ }
+
+ @Override
+ public void put(String graphName, Model model) {
+ checkOpen() ;
+ Txn.executeWrite(dataset, ()-> {
+ Model modelDst = modelFor(graphName) ;
+ modelDst.removeAll();
+ modelDst.add(model) ;
+ }) ;
+ }
+
+ @Override
+ public void delete(String graph) {
+ checkOpen() ;
+ Txn.executeWrite(dataset,() ->{
+ if ( RDFConn.isDefault(graph) )
+ dataset.getDefaultModel().removeAll();
+ else
+ dataset.removeNamedModel(graph);
+ }) ;
+ }
+
+ @Override
+ public void delete() {
+ checkOpen() ;
+ delete(null) ;
+ }
+
+ private void doPutPost(String graph, String file, boolean replace) {
+ Objects.requireNonNull(file) ;
+ Lang lang = RDFLanguages.filenameToLang(file) ;
+
+ Txn.executeWrite(dataset,() ->{
+ if ( RDFLanguages.isTriples(lang) ) {
+ Model model = RDFConn.isDefault(graph) ? dataset.getDefaultModel() : dataset.getNamedModel(graph) ;
+ if ( replace )
+ model.removeAll() ;
+ RDFDataMgr.read(model, file);
+ }
+ else if ( RDFLanguages.isQuads(lang) ) {
+ if ( replace )
+ dataset.asDatasetGraph().clear();
+ // Load quads into the dataset.
+ RDFDataMgr.read(dataset, file);
+ }
+ else
+ throw new ARQException("Not an RDF format: "+file+" (lang="+lang+")") ;
+ }) ;
+ }
+
+ /**
+ * Called to isolate a model from its storage. Must be inside a
+ * transaction.
+ */
+ private Model isolate(Model model) {
+ if ( ! isolateByCopy ) {
+ // Half-way - read-only but dataset changes can be seen.
+ Graph g = new GraphReadOnly(model.getGraph()) ;
+ return ModelFactory.createModelForGraph(g) ;
+ }
+ // Copy.
+ Model m2 = ModelFactory.createDefaultModel() ;
+ m2.add(model) ;
+ return m2 ;
+ }
+
+ /**
+ * Called to isolate a dataset from its storage. Must be inside a
+ * transaction.
+ */
+ private Dataset isolate(Dataset dataset) {
+ if ( ! isolateByCopy ) {
+ DatasetGraph dsg = new DatasetGraphReadOnly(dataset.asDatasetGraph()) ;
+ return DatasetFactory.wrap(dsg) ;
+ }
+
+ // Copy.
+ DatasetGraph dsg2 = DatasetGraphFactory.create() ;
+ dataset.asDatasetGraph().find().forEachRemaining(q -> dsg2.add(q) );
+ return DatasetFactory.wrap(dsg2) ;
+ }
+
+ private Model modelFor(String graph) {
+ if ( RDFConn.isDefault(graph))
+ return dataset.getDefaultModel() ;
+ return dataset.getNamedModel(graph) ;
+ }
+
+ @Override
+ public Dataset fetchDataset() {
+ checkOpen() ;
+ return Txn.calculateRead(dataset,() -> isolate(dataset)) ;
+ }
+
+ @Override
+ public void loadDataset(String file) {
+ checkOpen() ;
+ Txn.executeWrite(dataset,() ->{
+ RDFDataMgr.read(dataset, file);
+ }) ;
+ }
+
+ @Override
+ public void loadDataset(Dataset dataset) {
+ checkOpen() ;
+ Txn.executeWrite(this.dataset,() ->{
+ dataset.asDatasetGraph().find().forEachRemaining((q)->this.dataset.asDatasetGraph().add(q)) ;
+ }) ;
+ }
+
+ @Override
+ public void putDataset(String file) {
+ checkOpen() ;
+ Txn.executeWrite(dataset,() ->{
+ dataset.asDatasetGraph().clear();
+ RDFDataMgr.read(dataset, file);
+ }) ;
+ }
+
+ @Override
+ public void putDataset(Dataset dataset) {
+ checkOpen() ;
+ Txn.executeWrite(dataset,() ->{
+ this.dataset = isolate(dataset) ;
+ }) ;
+ }
+
+ @Override
+ public void close() {
+ dataset = null ;
+ }
+
+ @Override
+ public boolean isClosed() {
+ return dataset == null ;
+ }
+
+ private void checkOpen() {
+ if ( dataset == null )
+ throw new ARQException("closed") ;
+ }
+
+ @Override
+ public void begin(ReadWrite readWrite) { dataset.begin(readWrite); }
+
+ @Override
+ public void commit() { dataset.commit(); }
+
+ @Override
+ public void abort() { dataset.abort(); }
+
+ @Override
+ public boolean isInTransaction() { return dataset.isInTransaction() ; }
+
+ @Override
+ public void end() { dataset.end() ; }
+
+
+}
+
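A sketch of how the copy-isolation in `RDFConnectionLocal` behaves (assuming an in-memory transactional dataset): `fetch()` returns a copy of the graph, so mutating the returned model does not touch the connection's dataset.

```java
import org.apache.jena.query.Dataset;
import org.apache.jena.query.DatasetFactory;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdfconnection.RDFConnection;
import org.apache.jena.rdfconnection.RDFConnectionFactory;

public class LocalIsolationSketch {
    public static void main(String[] args) {
        Dataset ds = DatasetFactory.createTxnMem();
        try (RDFConnection conn = RDFConnectionFactory.connect(ds)) {
            Model m = ModelFactory.createDefaultModel();
            m.add(m.createResource("http://example/s"),
                  m.createProperty("http://example/p"), "o");
            conn.put(m);                   // replace default graph contents
            Model copy = conn.fetch();     // isolated copy of the default graph
            copy.removeAll();              // clears the copy only
            System.out.println(conn.fetch().size());   // dataset is unchanged
        }
    }
}
```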
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionModular.java
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionModular.java b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionModular.java
new file mode 100644
index 0000000..8f36355
--- /dev/null
+++ b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionModular.java
@@ -0,0 +1,199 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.jena.rdfconnection;
+
+import org.apache.jena.query.Dataset ;
+import org.apache.jena.query.Query ;
+import org.apache.jena.query.QueryExecution ;
+import org.apache.jena.query.ReadWrite ;
+import org.apache.jena.rdf.model.Model ;
+import org.apache.jena.sparql.core.Transactional ;
+import org.apache.jena.update.UpdateRequest ;
+
+/**
+ *
+ */
+public class RDFConnectionModular implements RDFConnection {
+
+ private final SparqlQueryConnection queryConnection ;
+ private final SparqlUpdateConnection updateConnection ;
+ private final RDFDatasetConnection datasetConnection ;
+ private final Transactional transactional ;
+
+ @Override
+ public void begin(ReadWrite readWrite) { transactional.begin(readWrite); }
+ @Override
+ public void commit() { transactional.commit(); }
+ @Override
+ public void abort() { transactional.abort(); }
+ @Override
+ public void end() { transactional.end(); }
+ @Override
+ public boolean isInTransaction() {
+ return transactional.isInTransaction() ;
+ }
+
+ public RDFConnectionModular(SparqlQueryConnection queryConnection ,
+ SparqlUpdateConnection updateConnection ,
+ RDFDatasetConnection datasetConnection ) {
+ this.queryConnection = queryConnection ;
+ this.updateConnection = updateConnection ;
+ this.datasetConnection = datasetConnection ;
+ this.transactional =
+ updateConnection != null ? updateConnection :
+ datasetConnection != null ? datasetConnection :
+ queryConnection != null ? queryConnection :
+ null ;
+ }
+
+ public RDFConnectionModular(RDFConnection connection ) {
+ this.queryConnection = connection ;
+ this.updateConnection = connection ;
+ this.datasetConnection = connection ;
+ this.transactional = connection ;
+ }
+
+ private SparqlQueryConnection queryConnection() {
+ if ( queryConnection == null )
+ throw new UnsupportedOperationException("No SparqlQueryConnection") ;
+ return queryConnection ;
+ }
+
+ private SparqlUpdateConnection updateConnection() {
+ if ( updateConnection == null )
+ throw new UnsupportedOperationException("No SparqlUpdateConnection") ;
+ return updateConnection ;
+ }
+
+ private RDFDatasetConnection datasetConnection() {
+ if ( datasetConnection == null )
+ throw new UnsupportedOperationException("No RDFDatasetConnection") ;
+ return datasetConnection ;
+ }
+
+
+
+ @Override
+ public QueryExecution query(Query query) { return queryConnection().query(query) ; }
+
+ @Override
+ public void update(UpdateRequest update) {
+ updateConnection().update(update) ;
+ }
+
+ @Override
+ public void load(String graphName, String file) {
+ datasetConnection().load(graphName, file) ;
+ }
+
+ @Override
+ public void load(String file) {
+ datasetConnection().load(file) ;
+ }
+
+ @Override
+ public void load(String graphName, Model model) {
+ datasetConnection().load(graphName, model) ;
+ }
+
+ @Override
+ public void load(Model model) {
+ datasetConnection().load(model) ;
+ }
+
+ @Override
+ public void put(String graphName, String file) {
+ datasetConnection().put(graphName, file) ;
+ }
+
+ @Override
+ public void put(String file) {
+ datasetConnection().put(file) ;
+ }
+
+ @Override
+ public void put(String graphName, Model model) {
+ datasetConnection().put(graphName, model) ;
+ }
+
+ @Override
+ public void put(Model model) {
+ datasetConnection().put(model) ;
+ }
+
+ @Override
+ public void delete(String graphName) {
+ datasetConnection().delete(graphName) ;
+ }
+
+ @Override
+ public void delete() {
+ datasetConnection().delete() ;
+ }
+
+ @Override
+ public void loadDataset(String file) {
+ datasetConnection().loadDataset(file) ;
+ }
+
+ @Override
+ public void loadDataset(Dataset dataset) {
+ datasetConnection().loadDataset(dataset) ;
+ }
+
+ @Override
+ public void putDataset(String file) {
+ datasetConnection().putDataset(file) ;
+ }
+
+ @Override
+ public void putDataset(Dataset dataset) {
+ datasetConnection().putDataset(dataset) ;
+ }
+
+ // /** Clear the dataset - remove all named graphs, clear the default graph. */
+ // public void clearDataset() ;
+
+ @Override
+ public Model fetch(String graphName) {
+ return datasetConnection.fetch(graphName) ;
+ }
+ @Override
+ public Model fetch() {
+ return datasetConnection().fetch() ;
+ }
+ @Override
+ public Dataset fetchDataset() {
+ return datasetConnection().fetchDataset() ;
+ }
+ @Override
+ public boolean isClosed() { return false ; }
+
+ /** Close this connection. Use with try-with-resources. */
+ @Override
+ public void close() {
+ if ( queryConnection != null )
+ queryConnection.close();
+ if ( updateConnection != null )
+ updateConnection.close();
+ if ( datasetConnection != null )
+ datasetConnection.close() ;
+ }
+}
+
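`RDFConnectionModular` dispatches each operation to whichever component connection was supplied, so a composite can expose only a subset of the interface. A sketch of a query-only composite (names and setup are illustrative):

```java
import org.apache.jena.query.DatasetFactory;
import org.apache.jena.rdfconnection.RDFConnection;
import org.apache.jena.rdfconnection.RDFConnectionFactory;
import org.apache.jena.rdfconnection.RDFConnectionModular;

public class ModularSketch {
    public static void main(String[] args) {
        try (RDFConnection base = RDFConnectionFactory.connect(DatasetFactory.createTxnMem())) {
            // Query-only composite: no update or dataset component supplied.
            RDFConnection queryOnly = new RDFConnectionModular(base, null, null);
            System.out.println(queryOnly.queryAsk("ASK {}"));  // delegated to base
            try {
                queryOnly.update("INSERT DATA { <urn:ex:s> <urn:ex:p> <urn:ex:o> }");
            } catch (UnsupportedOperationException ex) {
                System.out.println("update not supported by this composite");
            }
        }
    }
}
```

The `Transactional` used by the composite is the first non-null component (update, then dataset, then query), as set in the constructor above.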
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionRemote.java
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionRemote.java b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionRemote.java
new file mode 100644
index 0000000..02d7bef
--- /dev/null
+++ b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFConnectionRemote.java
@@ -0,0 +1,478 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.jena.rdfconnection;
+
+import static java.util.Objects.requireNonNull ;
+
+import java.io.File ;
+import java.io.InputStream ;
+import java.util.concurrent.locks.ReentrantLock ;
+import java.util.function.Supplier ;
+
+import org.apache.http.HttpEntity ;
+import org.apache.http.client.HttpClient ;
+import org.apache.http.entity.EntityTemplate ;
+import org.apache.http.protocol.HttpContext ;
+import org.apache.jena.atlas.io.IO ;
+import org.apache.jena.atlas.web.HttpException ;
+import org.apache.jena.atlas.web.TypedInputStream ;
+import org.apache.jena.graph.Graph ;
+import org.apache.jena.query.* ;
+import org.apache.jena.rdf.model.Model ;
+import org.apache.jena.rdf.model.ModelFactory ;
+import org.apache.jena.riot.* ;
+import org.apache.jena.riot.web.HttpCaptureResponse ;
+import org.apache.jena.riot.web.HttpOp ;
+import org.apache.jena.riot.web.HttpResponseLib ;
+import org.apache.jena.sparql.ARQException ;
+import org.apache.jena.sparql.core.DatasetGraph ;
+import org.apache.jena.sparql.core.Transactional ;
+import org.apache.jena.system.Txn ;
+import org.apache.jena.update.UpdateExecutionFactory ;
+import org.apache.jena.update.UpdateProcessor ;
+import org.apache.jena.update.UpdateRequest ;
+import org.apache.jena.web.HttpSC ;
+
+/**
+ *
+ */
+public class RDFConnectionRemote implements RDFConnection {
+ private static final String fusekiDftSrvQuery = "sparql" ;
+ private static final String fusekiDftSrvUpdate = "update" ;
+ private static final String fusekiDftSrvGSP = "data" ;
+
+ private boolean isOpen = true ;
+ private final String destination ;
+ private final String svcQuery ;
+ private final String svcUpdate ;
+ private final String svcGraphStore ;
+ private HttpClient httpClient ;
+ private HttpContext httpContext = null ;
+
+ // Builder?
+ // HttpContext, HttpClient.
+ // Statics for "query", "query+update" : SparqlQueryConnectionRemote > SparqlUpdateConnectionRemote > RDFConnectionRemote
+ // XXX Very long "HttpOp.execHttpPost"
+
+ /** Create connection that will use the {@link HttpClient}, using the URL of the dataset and default service names */
+ public RDFConnectionRemote(HttpClient httpClient, String destination) {
+ this(httpClient,
+ requireNonNull(destination),
+ fusekiDftSrvQuery,
+ fusekiDftSrvUpdate,
+ fusekiDftSrvGSP) ;
+ }
+
+
+ /** Create connection, using URL of the dataset and default service names */
+ public RDFConnectionRemote(String destination) {
+ this(requireNonNull(destination),
+ fusekiDftSrvQuery,
+ fusekiDftSrvUpdate,
+ fusekiDftSrvGSP) ;
+ }
+
+ // ??
+ /** Create connection, using full URLs for services. Pass a null for "no service endpoint". */
+ public RDFConnectionRemote(String sQuery, String sUpdate, String sGSP) {
+ this(null, sQuery, sUpdate, sGSP) ;
+ }
+
+ /** Create connection, using URL of the dataset and short names for the services */
+ public RDFConnectionRemote(String destination, String sQuery, String sUpdate, String sGSP) {
+ this(null, destination, sQuery, sUpdate, sGSP) ;
+ }
+
+ /** Create connection, using URL of the dataset and short names for the services */
+ public RDFConnectionRemote(HttpClient httpClient, String destination, String sQuery, String sUpdate, String sGSP) {
+ this.destination = destination ;
+ this.svcQuery = formServiceURL(destination,sQuery) ;
+ this.svcUpdate = formServiceURL(destination,sUpdate) ;
+ this.svcGraphStore = formServiceURL(destination,sGSP) ;
+// if ( httpClient == null )
+// httpClient = HttpOp.getDefaultHttpClient() ;
+ this.httpClient = httpClient ;
+ }
+
+ public HttpClient getHttpClient() {
+ return httpClient ;
+ }
+
+ public void setHttpClient(HttpClient httpClient) {
+ this.httpClient = httpClient ;
+ }
+
+ public HttpContext getHttpContext() {
+ return httpContext ;
+ }
+
+ public void setHttpContext(HttpContext httpContext) {
+ this.httpContext = httpContext ;
+ }
+
+ private static String formServiceURL(String destination, String srvEndpoint) {
+ if ( destination == null )
+ return srvEndpoint ;
+ String dest = destination ;
+ if ( dest.endsWith("/") )
+ dest = dest.substring(0, dest.length()-1) ;
+ return dest+"/"+srvEndpoint ;
+ }
+
+ @Override
+ public QueryExecution query(Query query) {
+ checkQuery();
+ return exec(()->QueryExecutionFactory.createServiceRequest(svcQuery, query)) ;
+ }
+
+ @Override
+ public void update(UpdateRequest update) {
+ checkUpdate();
+ UpdateProcessor proc = UpdateExecutionFactory.createRemote(update, svcUpdate) ;
+ exec(()->proc.execute());
+ }
+
+ @Override
+ public Model fetch(String graphName) {
+ checkGSP() ;
+ String url = RDFConn.urlForGraph(svcGraphStore, graphName) ;
+ Graph graph = fetch$(url) ;
+ return ModelFactory.createModelForGraph(graph) ;
+ }
+
+ @Override
+ public Model fetch() {
+ checkGSP() ;
+ return fetch(null) ;
+ }
+
+ private Graph fetch$(String url) {
+ HttpCaptureResponse<Graph> graph = HttpResponseLib.graphHandler() ;
+ exec(()->HttpOp.execHttpGet(url, WebContent.defaultGraphAcceptHeader, graph, this.httpClient, this.httpContext)) ;
+ return graph.get() ;
+ }
+
+ @Override
+ public void load(String graph, String file) {
+ checkGSP() ;
+ upload(graph, file, false) ;
+ }
+
+ @Override
+ public void load(String file) {
+ checkGSP() ;
+ upload(null, file, false) ;
+ }
+
+ @Override
+ public void load(Model model) {
+ doPutPost(model, null, false) ;
+ }
+
+ @Override
+ public void load(String graphName, Model model) {
+ doPutPost(model, graphName, false) ;
+ }
+
+ @Override
+ public void put(String graph, String file) {
+ checkGSP() ;
+ upload(graph, file, true) ;
+ }
+
+ @Override
+ public void put(String file) {
+ checkGSP() ;
+ upload(null, file, true) ;
+ }
+
+ @Override
+ public void put(String graphName, Model model) {
+ checkGSP() ;
+ doPutPost(model, graphName, true) ;
+ }
+
+ @Override
+ public void put(Model model) {
+ checkGSP() ;
+ doPutPost(model, null, true) ;
+ }
+
+ private void upload(String graph, String file, boolean replace) {
+ // Only triple formats can be loaded into a single graph; reject quad formats.
+ Lang lang = RDFLanguages.filenameToLang(file) ;
+ if ( RDFLanguages.isQuads(lang) )
+ throw new ARQException("Can't load quads into a graph") ;
+ if ( ! RDFLanguages.isTriples(lang) )
+ throw new ARQException("Not an RDF format: "+file+" (lang="+lang+")") ;
+ String url = RDFConn.urlForGraph(svcGraphStore, graph) ;
+ doPutPost(url, file, lang, replace) ;
+ }
+
+ private void doPutPost(String url, String file, Lang lang, boolean replace) {
+ File f = new File(file) ;
+ long length = f.length() ;
+ InputStream source = IO.openFile(file) ;
+ exec(()->{
+ if ( replace )
+ HttpOp.execHttpPut(url, lang.getContentType().getContentType(), source, length, httpClient, this.httpContext) ;
+ else
+ HttpOp.execHttpPost(url, lang.getContentType().getContentType(), source, length, null, null, httpClient, this.httpContext) ;
+ }) ;
+ }
+
+ private void doPutPost(Model model, String name, boolean replace) {
+ String url = RDFConn.urlForGraph(svcGraphStore, name) ;
+ exec(()->{
+ Graph graph = model.getGraph() ;
+ if ( replace )
+ HttpOp.execHttpPut(url, graphToHttpEntity(graph), httpClient, this.httpContext) ;
+ else
+ HttpOp.execHttpPost(url, graphToHttpEntity(graph), null, null, httpClient, this.httpContext) ;
+ });
+ }
+
+ @Override
+ public void delete(String graph) {
+ checkGSP() ;
+ String url = RDFConn.urlForGraph(svcGraphStore, graph) ;
+ exec(()->HttpOp.execHttpDelete(url));
+ }
+
+ @Override
+ public void delete() {
+ checkGSP() ;
+ delete(null) ;
+ }
+
+ @Override
+ public Dataset fetchDataset() {
+ checkDataset() ;
+ Dataset ds = DatasetFactory.createTxnMem() ;
+ Txn.executeWrite(ds, ()->{
+ TypedInputStream s = exec(()->HttpOp.execHttpGet(destination, WebContent.defaultDatasetAcceptHeader)) ;
+ Lang lang = RDFLanguages.contentTypeToLang(s.getContentType()) ;
+ RDFDataMgr.read(ds, s, lang) ;
+ }) ;
+ return ds ;
+ }
+
+ @Override
+ public void loadDataset(String file) {
+ checkDataset() ;
+ doPutPostDataset(file, false) ;
+ }
+
+ @Override
+ public void loadDataset(Dataset dataset) {
+ checkDataset() ;
+ doPutPostDataset(dataset, false) ;
+ }
+
+ @Override
+ public void putDataset(String file) {
+ checkDataset() ;
+ doPutPostDataset(file, true) ;
+ }
+
+ @Override
+ public void putDataset(Dataset dataset) {
+ checkDataset() ;
+ doPutPostDataset(dataset, true) ;
+ }
+
+ private void doPutPostDataset(String file, boolean replace) {
+ Lang lang = RDFLanguages.filenameToLang(file) ;
+ File f = new File(file) ;
+ long length = f.length() ;
+ exec(()->{
+ InputStream source = IO.openFile(file) ;
+ if ( replace )
+ HttpOp.execHttpPut(destination, lang.getContentType().getContentType(), source, length, httpClient, httpContext) ;
+ else
+ HttpOp.execHttpPost(destination, lang.getContentType().getContentType(), source, length, null, null, httpClient, httpContext) ;
+ });
+ }
+
+ private void doPutPostDataset(Dataset dataset, boolean replace) {
+ exec(()->{
+ DatasetGraph dsg = dataset.asDatasetGraph() ;
+ if ( replace )
+ HttpOp.execHttpPut(destination, datasetToHttpEntity(dsg), httpClient, httpContext) ;
+ else
+ HttpOp.execHttpPost(destination, datasetToHttpEntity(dsg), httpClient, httpContext) ;
+ });
+ }
+
+ private void checkQuery() {
+ checkOpen() ;
+ if ( svcQuery == null )
+ throw new ARQException("No query service defined for this RDFConnection") ;
+ }
+
+ private void checkUpdate() {
+ checkOpen() ;
+ if ( svcUpdate == null )
+ throw new ARQException("No update service defined for this RDFConnection") ;
+ }
+
+ private void checkGSP() {
+ checkOpen() ;
+ if ( svcGraphStore == null )
+ throw new ARQException("No SPARQL Graph Store service defined for this RDFConnection") ;
+ }
+
+ private void checkDataset() {
+ checkOpen() ;
+ if ( destination == null )
+ throw new ARQException("Dataset operations not available - no dataset URL provided") ;
+ }
+
+ private void checkOpen() {
+ if ( ! isOpen )
+ throw new ARQException("This RDFConnection has been closed") ;
+ }
+
+ @Override
+ public void close() {
+ isOpen = false ;
+ }
+
+ @Override
+ public boolean isClosed() {
+ return ! isOpen ;
+ }
+
+ /** Create an HttpEntity for the graph */
+ protected HttpEntity graphToHttpEntity(Graph graph) {
+ return graphToHttpEntity(graph, RDFFormat.NTRIPLES) ;
+ }
+
+ /** Create an HttpEntity for the graph */
+ protected HttpEntity graphToHttpEntity(Graph graph, RDFFormat syntax) {
+ EntityTemplate entity = new EntityTemplate((out)->RDFDataMgr.write(out, graph, syntax)) ;
+ String ct = syntax.getLang().getContentType().getContentType() ;
+ entity.setContentType(ct) ;
+ return entity ;
+ }
+
+ /** Create an HttpEntity for the dataset */
+ protected HttpEntity datasetToHttpEntity(DatasetGraph dataset) {
+ return datasetToHttpEntity(dataset, RDFFormat.NQUADS) ;
+ }
+
+ /** Create an HttpEntity for the dataset */
+ protected HttpEntity datasetToHttpEntity(DatasetGraph dataset, RDFFormat syntax) {
+ EntityTemplate entity = new EntityTemplate((out)->RDFDataMgr.write(out, dataset, syntax)) ;
+ String ct = syntax.getLang().getContentType().getContentType() ;
+ entity.setContentType(ct) ;
+ return entity ;
+ }
+
+ /** Execute an HTTP operation; an HttpException is rethrown. */
+ static void exec(Runnable action) {
+ try { action.run() ; }
+ catch (HttpException ex) { handleHttpException(ex, false) ; }
+ }
+
+ /** Execute an HTTP operation; HTTP 404 yields null, any other HttpException is rethrown. */
+ static <X> X exec(Supplier<X> action) {
+ try { return action.get() ; }
+ catch (HttpException ex) { handleHttpException(ex, true) ; return null ;}
+ }
+
+ private static void handleHttpException(HttpException ex, boolean ignore404) {
+ if ( ex.getResponseCode() == HttpSC.NOT_FOUND_404 && ignore404 )
+ return ;
+ throw ex ;
+ }
+
+ /** Engine for the transaction lifecycle: multiple readers plus a single writer (MR+SW). */
+ static class TxnLifecycle implements Transactional {
+ // MR+SW policy.
+ private ReentrantLock lock = new ReentrantLock() ;
+ private ThreadLocal<ReadWrite> mode = ThreadLocal.withInitial(()->null) ;
+ @Override
+ public void begin(ReadWrite readWrite) {
+ if ( readWrite == ReadWrite.WRITE )
+ lock.lock();
+ mode.set(readWrite);
+ }
+
+ @Override
+ public void commit() {
+ if ( mode.get() == ReadWrite.WRITE)
+ lock.unlock();
+ mode.set(null);
+ }
+
+ @Override
+ public void abort() {
+ if ( mode.get() == ReadWrite.WRITE )
+ lock.unlock();
+ mode.set(null);
+ }
+
+ @Override
+ public boolean isInTransaction() {
+ return mode.get() != null ;
+ }
+
+ @Override
+ public void end() {
+ ReadWrite rw = mode.get() ;
+ if ( rw == null )
+ return ;
+ if ( rw == ReadWrite.WRITE ) {
+ abort() ;
+ return ;
+ }
+ mode.set(null) ;
+ }
+ }
+
+ private TxnLifecycle inner = new TxnLifecycle() ;
+
+ @Override
+ public void begin(ReadWrite readWrite) { inner.begin(readWrite); }
+
+ @Override
+ public void commit() { inner.commit(); }
+
+ @Override
+ public void abort() { inner.abort(); }
+
+ @Override
+ public boolean isInTransaction() { return inner.isInTransaction() ; }
+
+ @Override
+ public void end() { inner.end(); }
+
+}
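The exec() wrappers above run an HTTP action and let a 404 on a read surface as an absent (null) result while other HTTP errors propagate. A standalone sketch of that pattern follows; HttpStatusException is a hypothetical stand-in for Jena's HttpException, not Jena API:

```java
import java.util.function.Supplier;

public class ExecDemo {
    // Stand-in for an HTTP-layer exception carrying a status code.
    static class HttpStatusException extends RuntimeException {
        final int code;
        HttpStatusException(int code) { this.code = code; }
    }

    // Run the action; 404 means "not there" and becomes null, anything else rethrows.
    static <X> X exec(Supplier<X> action) {
        try { return action.get(); }
        catch (HttpStatusException ex) {
            if (ex.code == 404) return null;
            throw ex;
        }
    }

    public static void main(String[] args) {
        String ok = exec(() -> "data");
        String missing = exec(() -> { throw new HttpStatusException(404); });
        System.out.println("ok=" + ok + " missing=" + missing);
        try {
            exec(() -> { throw new HttpStatusException(500); });
        } catch (HttpStatusException ex) {
            System.out.println("rethrown=" + ex.code);
        }
    }
}
```

This matches the two-variant design above: the Supplier form can absorb a 404 into a null result, while the Runnable form has no result to absorb it into and rethrows.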
+
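TxnLifecycle above implements the MR+SW policy: only writers contend for the lock, readers proceed freely, and each thread's transaction mode lives in a ThreadLocal. A self-contained sketch of the same policy (plain Java, independent of Jena's Transactional interface) showing a second writer blocking until the first commits:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.locks.ReentrantLock;

public class MrswDemo {
    enum Mode { READ, WRITE }

    // Mirrors TxnLifecycle: writers serialize on the lock, readers do not touch it.
    static class Txn {
        private final ReentrantLock lock = new ReentrantLock();
        private final ThreadLocal<Mode> mode = ThreadLocal.withInitial(() -> null);

        void begin(Mode m)  { if (m == Mode.WRITE) lock.lock(); mode.set(m); }
        void commit()       { if (mode.get() == Mode.WRITE) lock.unlock(); mode.set(null); }
        boolean inTxn()     { return mode.get() != null; }
    }

    public static void main(String[] args) throws InterruptedException {
        Txn txn = new Txn();
        txn.begin(Mode.WRITE);                     // first writer holds the lock
        AtomicBoolean entered = new AtomicBoolean(false);
        CountDownLatch done = new CountDownLatch(1);
        Thread second = new Thread(() -> {
            txn.begin(Mode.WRITE);                 // blocks until the first commits
            entered.set(true);
            txn.commit();
            done.countDown();
        });
        second.start();
        Thread.sleep(200);                         // give the second writer a chance to run
        System.out.println("second writer entered early: " + entered.get());
        txn.commit();                              // release; second writer proceeds
        done.await();
        System.out.println("second writer entered after commit: " + entered.get());
    }
}
```

Because the lock is only taken for WRITE, any number of reader threads may be in a transaction concurrently with at most one writer thread.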
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFDatasetAccessConnection.java
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFDatasetAccessConnection.java b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFDatasetAccessConnection.java
new file mode 100644
index 0000000..a9bfbd1
--- /dev/null
+++ b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFDatasetAccessConnection.java
@@ -0,0 +1,57 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.jena.rdfconnection;
+
+import org.apache.jena.query.Dataset ;
+import org.apache.jena.rdf.model.Model ;
+import org.apache.jena.sparql.core.Transactional ;
+
+/**
+ * SPARQL Graph Store Protocol (read operations) and whole dataset access.
+ * {@link RDFDatasetConnection} adds the write operations.
+ *
+ * @see RDFDatasetConnection
+ * @see RDFConnection
+ * @see RDFConnectionFactory
+ */
+public interface RDFDatasetAccessConnection extends Transactional, AutoCloseable
+{
+ /** Fetch a named graph.
+ * This is SPARQL Graph Store Protocol HTTP GET or equivalent.
+ *
+ * @param graphName URI string for the graph name (null or "default" for the default graph)
+ * @return Model
+ */
+ public Model fetch(String graphName) ;
+
+ /** Fetch the default graph.
+ * This is SPARQL Graph Store Protocol HTTP GET or equivalent.
+ * @return Model
+ */
+ public Model fetch() ;
+
+ /** Fetch the contents of the dataset */
+ public Dataset fetchDataset() ;
+
+ /** Test whether this connection is closed or not */
+ public boolean isClosed() ;
+
+ /** Close this connection. Use with try-with-resources. */
+ @Override public void close() ;
+}
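fetch(graphName) corresponds to a Graph Store Protocol GET whose target URL encodes the graph name. A rough standalone sketch of that mapping (modelled on what RDFConn.urlForGraph appears to do; the helper below is hypothetical, not the exact Jena code): `?default` selects the default graph, otherwise `?graph=<encoded-uri>`:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class GspUrlDemo {
    // Hypothetical helper: build the GSP request URL for a graph name.
    // null or "default" addresses the default graph.
    static String urlForGraph(String service, String graphName) {
        if (graphName == null || graphName.equals("default"))
            return service + "?default";
        return service + "?graph=" + URLEncoder.encode(graphName, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String svc = "http://localhost:3030/ds/data";   // example service URL
        System.out.println(urlForGraph(svc, null));
        System.out.println(urlForGraph(svc, "http://example/g"));
    }
}
```

The graph name is a full URI, so it must be percent-encoded when placed in the query string.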
http://git-wip-us.apache.org/repos/asf/jena/blob/2e26b78e/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFDatasetConnection.java
----------------------------------------------------------------------
diff --git a/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFDatasetConnection.java b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFDatasetConnection.java
new file mode 100644
index 0000000..688e654
--- /dev/null
+++ b/jena-rdfconnection/src/main/java/org/apache/jena/rdfconnection/RDFDatasetConnection.java
@@ -0,0 +1,150 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.jena.rdfconnection;
+
+import org.apache.jena.query.Dataset ;
+import org.apache.jena.rdf.model.Model ;
+import org.apache.jena.sparql.core.Transactional ;
+
+/**
+ * SPARQL Graph Store Protocol and whole dataset access.
+ * This adds the write operations. The read operations are defined by {@link RDFDatasetAccessConnection}.
+ *
+ * @see RDFDatasetAccessConnection
+ * @see RDFConnection
+ * @see RDFConnectionFactory
+ */
+public interface RDFDatasetConnection extends RDFDatasetAccessConnection, Transactional, AutoCloseable
+{
+ /** Load (add, append) RDF into a named graph in a dataset.
+ * This is SPARQL Graph Store Protocol HTTP POST or equivalent.
+ *
+ * @param graphName Graph name (null or "default" for the default graph)
+ * @param file File of the data.
+ */
+ public void load(String graphName, String file) ;
+
+ /** Load (add, append) RDF into the default graph of a dataset.
+ * This is SPARQL Graph Store Protocol HTTP POST or equivalent.
+ *
+ * @param file File of the data.
+ */
+ public void load(String file) ;
+
+ /** Load (add, append) RDF into a named graph in a dataset.
+ * This is SPARQL Graph Store Protocol HTTP POST or equivalent.
+ *
+ * @param graphName Graph name (null or "default" for the default graph)
+ * @param model Data.
+ */
+ public void load(String graphName, Model model) ;
+
+ /** Load (add, append) RDF into the default graph of a dataset.
+ * This is SPARQL Graph Store Protocol HTTP POST or equivalent.
+ *
+ * @param model Data.
+ */
+ public void load(Model model) ;
+
+ /** Set the contents of a named graph of a dataset.
+ * Any existing data is lost.
+ * This is SPARQL Graph Store Protocol HTTP PUT or equivalent.
+ *
+ * @param graphName Graph name (null or "default" for the default graph)
+ * @param file File of the data.
+ */
+ public void put(String graphName, String file) ;
+
+ /** Set the contents of the default graph of a dataset.
+ * Any existing data is lost.
+ * This is SPARQL Graph Store Protocol HTTP PUT or equivalent.
+ *
+ * @param file File of the data.
+ */
+ public void put(String file) ;
+
+ /** Set the contents of a named graph of a dataset.
+ * Any existing data is lost.
+ * This is SPARQL Graph Store Protocol HTTP PUT or equivalent.
+ *
+ * @param graphName Graph name (null or "default" for the default graph)
+ * @param model Data.
+ */
+ public void put(String graphName, Model model) ;
+
+ /** Set the contents of the default graph of a dataset.
+ * Any existing data is lost.
+ * This is SPARQL Graph Store Protocol HTTP PUT or equivalent.
+ *
+ * @param model Data.
+ */
+ public void put(Model model) ;
+
+ /**
+ * Delete a graph from the dataset.
+ * Null or "default" means the default graph, which is cleared, not removed.
+ *
+ * @param graphName Graph name (null or "default" for the default graph)
+ */
+
+ /**
+ * Remove all data from the default graph.
+ */
+ public void delete() ;
+
+ /** Load (add, append) RDF triple or quad data into a dataset. Triples will go into the default graph.
+ * This is not a SPARQL Graph Store Protocol operation;
+ * it is the equivalent of an HTTP POST to the dataset.
+ */
+ public void loadDataset(String file) ;
+
+ /** Load (add, append) RDF triple or quad data into a dataset. Triples will go into the default graph.
+ * This is not a SPARQL Graph Store Protocol operation;
+ * it is the equivalent of an HTTP POST to the dataset.
+ */
+ public void loadDataset(Dataset dataset) ;
+
+ /** Set RDF triple or quad data as the dataset contents.
+ * Triples will go into the default graph, quads into named graphs.
+ * This is not a SPARQL Graph Store Protocol operation;
+ * it is the equivalent of an HTTP PUT to the dataset.
+ */
+ public void putDataset(String file) ;
+
+ /** Set RDF triple or quad data as the dataset contents.
+ * Triples will go into the default graph, quads into named graphs.
+ * This is not a SPARQL Graph Store Protocol operation;
+ * it is the equivalent of an HTTP PUT to the dataset.
+ */
+ public void putDataset(Dataset dataset) ;
+
+// /** Clear the dataset - remove all named graphs, clear the default graph. */
+// public void clearDataset() ;
+
+ /** Test whether this connection is closed or not */
+ @Override
+ public boolean isClosed() ;
+
+ /** Close this connection. Use with try-with-resources. */
+ @Override
+ public void close() ;
+}
+
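The contract above distinguishes load (GSP POST, append) from put (GSP PUT, replace). A minimal in-memory sketch of those semantics, independent of Jena, with graphs as sets of statement strings standing in for triples:

```java
import java.util.*;

public class GraphStoreDemo {
    // Toy graph store: graph name -> set of statements.
    private final Map<String, Set<String>> graphs = new HashMap<>();

    // load = GSP POST: append to whatever is already there.
    void load(String graphName, Collection<String> data) {
        graphs.computeIfAbsent(graphName, k -> new HashSet<>()).addAll(data);
    }

    // put = GSP PUT: any existing data is lost.
    void put(String graphName, Collection<String> data) {
        graphs.put(graphName, new HashSet<>(data));
    }

    int size(String graphName) {
        return graphs.getOrDefault(graphName, Set.of()).size();
    }

    public static void main(String[] args) {
        GraphStoreDemo store = new GraphStoreDemo();
        store.load("g", List.of("s1", "s2"));
        store.load("g", List.of("s3"));          // append: graph now has 3 statements
        System.out.println("after two loads: " + store.size("g"));
        store.put("g", List.of("s9"));           // replace: previous contents discarded
        System.out.println("after put: " + store.size("g"));
    }
}
```

The same POST-appends / PUT-replaces split applies to the whole-dataset operations loadDataset and putDataset.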