Posted to users@jena.apache.org by Marc Agate <ag...@gmail.com> on 2017/12/26 14:34:15 UTC

Custom ARQ function not working with fuseki endpoint

Hi !

I successfully implemented SPARQL queries using custom ARQ functions,
with the following custom function code:

****************
import org.apache.commons.text.similarity.LevenshteinDistance; // assuming Apache Commons Text is on the classpath
import org.apache.jena.sparql.expr.NodeValue;
import org.apache.jena.sparql.function.FunctionBase2;

// Two-argument ARQ function returning the Levenshtein edit distance between its arguments.
public class LevenshteinFilter extends FunctionBase2 {

    public LevenshteinFilter() { super() ; }

    public NodeValue exec(NodeValue value1, NodeValue value2){
        LevenshteinDistance LD = new LevenshteinDistance();
        int i = LD.apply(value1.asString(), value2.asString());
        return NodeValue.makeInteger(i);
    }
}
***************

It works fine when I query against a Model loaded from a Turtle file,
like this:

***************
InputStream input = QueryProcessor.class.getClassLoader().getResourceAsStream("full.ttl");
model = ModelFactory.createMemModelMaker().createModel("default");
model.read(input, null, "TURTLE"); // null base URI, since model URIs are absolute
input.close();
***************

with the query being sent like this :

***************
String functionUri = "http://www.example1.org/LevenshteinFunction";
FunctionRegistry.get().put(functionUri, LevenshteinFilter.class);

String s = "whatever you want";
String sparql = prefixes + " SELECT DISTINCT ?l WHERE { ?x rdfs:label ?l . "
        + "FILTER(fct:LevenshteinFunction(?l, \"" + s + "\") < 4) }";
Query query = QueryFactory.create(sparql);
QueryExecution qexec = QueryExecutionFactory.create(query, model);
ResultSet rs = qexec.execSelect();
***************
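
For the fct: prefix in that FILTER to resolve to the registered function URI, the prefixes string (not shown above) is assumed to bind fct: to the function's namespace, along these lines:

***************
// Hypothetical prefix declarations: fct: must expand so that
// fct:LevenshteinFunction == http://www.example1.org/LevenshteinFunction.
String prefixes = "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> "
        + "PREFIX fct: <http://www.example1.org/> ";
***************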

However, if I use a working Fuseki endpoint for the same dataset
(full.ttl), like this:

***************
fusekiUrl="http://localhost:3030/ds/query";
***************

sending the query like this (using
QueryExecutionFactory.sparqlService(fusekiUrl,query) instead of
QueryExecutionFactory.create(query,model) ):

***************
String functionUri = "http://www.example1.org/LevenshteinFunction";
FunctionRegistry.get().put(functionUri, LevenshteinFilter.class);

String s = "whatever you want";
String sparql = prefixes + " SELECT DISTINCT ?l WHERE { ?x rdfs:label ?l . "
        + "FILTER(fct:LevenshteinFunction(?l, \"" + s + "\") < 4) }";
Query query = QueryFactory.create(sparql);
QueryExecution qexec = QueryExecutionFactory.sparqlService(fusekiUrl, query);
ResultSet rs = qexec.execSelect();
***************

Then I don't get any results back. In both cases I printed out the
FunctionRegistry, and both contain exactly the same entries, in
particular:

key=http://www.example1.org/LevenshteinFunction value:
org.apache.jena.sparql.function.FunctionFactoryAuto@5a45133e

Any clue ?

Thanks

Re: Custom ARQ function not working with fuseki endpoint

Posted by ajs6f <aj...@apache.org>.
Good. Please do take the opportunity to close the Stack Exchange post where you asked the same question, with a link to this thread.

https://stackoverflow.com/questions/47979588/custom-arq-function-not-working-with-fuseki-endpoint

Adam Soroka

> On Dec 26, 2017, at 4:15 PM, Marc Agate <ag...@gmail.com> wrote:
> 
> Hi Adam,
> 
> I know about strlen: the part I shared with you guys was just for
> testing purposes, as our application is going to require quite a few
> very specific filters.
> I was just testing the setup (the mechanism) for further "real"
> custom function development.
> 
> Thanks
> 
> Marc
> 
> 
> On Tuesday, December 26, 2017 at 15:53 -0500, ajs6f wrote:
>> I'm glad you got your function working. Now that I look at it, it
>> seems possible that you could use a built-in SPARQL function:
>> 
>> https://www.w3.org/TR/sparql11-query/#func-strlen
>> 
>> Adam Soroka
>> 
>>> On Dec 26, 2017, at 3:50 PM, Marc Agate <ag...@gmail.com>
>>> wrote:
>>> 
>>> Hi,
>>> 
>>> I finally got the whole test working properly
>>> 
>>> To summarize:
>>> 
>>> 1) Deploy the custom function classes on the fuseki server
>>> 2) Modify the fuseki config :
>>> 
>>> add [] ja:loadClass "io.bdrc.ldsearch.query.functions.MyFilter" .
>>> 
>>> where MyFilter implementation is :
>>> 
>>> import org.apache.jena.sparql.expr.NodeValue;
>>> import org.apache.jena.sparql.function.FunctionBase1;
>>> 
>>> 
>>> public class MyFilter extends FunctionBase1 {
>>> 
>>>     public MyFilter() { super() ; }
>>> 
>>>     public NodeValue exec(NodeValue value1){
>>>         int d = value1.asString().length();
>>>         return NodeValue.makeInteger(d);
>>>     }
>>> }
>>> 
>>> 3) add the following prefix to the context:
>>> 
>>> PREFIX f: <java:io.bdrc.ldsearch.query.functions.>
>>> 
>>> Note that "io.bdrc.ldsearch.query.functions is the package of
>>> MyFilter
>>> class, not the class itself --> this means that you never call a
>>> class
>>> method in sparql queries but only a class that implements
>>> org.apache.jena.sparql.function.FunctionBaseX where X is the number
>>> of
>>> argument of your filter function
>>> 
>>> 4) Write (for instance) the query like this:
>>> 
>>> SELECT DISTINCT ?l
>>> WHERE { ?x skos:prefLabel ?l .  
>>>   FILTER (f:MyFilter(?l) < 20) 
>>> }
>>> 
>>> Thanks to Adam and Andy
>>> 
>>> Marc
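
Putting the client side together: a minimal sketch, assuming the setup summarized above and a Fuseki query endpoint at http://localhost:3030/bdrcrw/query (adjust host, port and dataset name to match your own config):

***************
import org.apache.jena.query.*;

public class RemoteFilterExample {
    public static void main(String[] args) {
        // Hypothetical endpoint URL; it must match the Fuseki config (dataset name + query service).
        String fusekiUrl = "http://localhost:3030/bdrcrw/query";
        // The java: prefix names the package; MyFilter is resolved and registered on the server,
        // so no FunctionRegistry call is needed on the client.
        String sparql =
            "PREFIX skos: <http://www.w3.org/2004/02/skos/core#>\n" +
            "PREFIX f: <java:io.bdrc.ldsearch.query.functions.>\n" +
            "SELECT DISTINCT ?l WHERE { ?x skos:prefLabel ?l . FILTER (f:MyFilter(?l) < 20) }";
        Query query = QueryFactory.create(sparql);
        try (QueryExecution qexec = QueryExecutionFactory.sparqlService(fusekiUrl, query)) {
            ResultSetFormatter.out(System.out, qexec.execSelect(), query);
        }
    }
}
***************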
>>> 
>>> On Tuesday, December 26, 2017 at 19:44 +0000, Andy Seaborne wrote:
>>>>> Exception in thread "main" HttpException: 404
>>>>> 	at org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.java:328)
>>>> 
>>>> 404 - HTTP not found - when trying to call Fuseki - i.e. the
>>>> query 
>>>> service URL is wrong. A server is running at the address
>>>> (host:port)
>>>> but 
>>>> the path does not name an HTTP resource.
>>>> 
>>>>      Andy
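
In practice the query URL is host:port plus the dataset name and the query service name from the Fuseki config (quoted further down in this thread). A minimal sketch of how that URL is composed, assuming those names and the default port:

***************
// Hypothetical values, taken from the config quoted later in this thread; adjust to your deployment.
String host        = "http://localhost:3030";
String datasetName = "bdrcrw";  // fuseki:name
String queryPath   = "query";   // fuseki:serviceQuery
String fusekiUrl   = host + "/" + datasetName + "/" + queryPath;  // http://localhost:3030/bdrcrw/query
***************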
>>>> 
>>>> 
>>>> On 26/12/17 19:00, ajs6f wrote:
>>>>> I don't understand how that config is getting parsed at all.
>>>>> It's
>>>>> not valid Turtle.
>>>>> 
>>>>>> ja:loadClass
>>>>>> "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
>>>>> 
>>>>> is not a triple at all. It should probably be:
>>>>> 
>>>>> [] ja:loadClass
>>>>> "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
>>>>> 
>>>>> Riot gives "Expected IRI for predicate: got:
>>>>> [STRING:io.bdrc.ldsearch.query.functions.CustomARQFunctions]"
>>>>> 
>>>>> It's not clear to me how you could be successfully loading your
>>>>> extension function with invalid config.
>>>>> 
>>>>> Adam Soroka
>>>>> 
>>>>> 
>>>>>> On Dec 26, 2017, at 1:42 PM, Marc Agate <agate.marc@gmail.com> wrote:
>>>>>> 
>>>>>> Well...
>>>>>> 
>>>>>> Here is the query
>>>>>> 
>>>>>> PREFIX : <http://purl.bdrc.io/ontology/core/>
>>>>>> PREFIX adm: <http://purl.bdrc.io/ontology/admin/>
>>>>>> PREFIX bdr: <http://purl.bdrc.io/resource/>
>>>>>> PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
>>>>>> PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
>>>>>> PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
>>>>>> PREFIX tbr: <http://purl.bdrc.io/ontology/toberemoved/>
>>>>>> PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
>>>>>> PREFIX f:
>>>>>> <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
>>>>>> 
>>>>>> SELECT DISTINCT ?l
>>>>>> WHERE {
>>>>>> ?x skos:prefLabel ?l .
>>>>>> FILTER (f:myFilter(?l) < 100)
>>>>>> }
>>>>>> 
>>>>>> Note that when I change FILTER (f:myFilter(?l) < 100) to FILTER
>>>>>> (STRLEN(?l) < 100), I don't get the 404 exception...
>>>>>> Therefore it's not a connection issue. I think it's more like an
>>>>>> unresolved URI or so.
>>>>>> 
>>>>>> Now, since you're asking for it, here is the full fuseki config:
>>>>>> 
>>>>>> ################################################################
>>>>>> # Fuseki configuration for BDRC, configures two endpoints:
>>>>>> #   - /bdrc is read-only
>>>>>> #   - /bdrcrw is read-write
>>>>>> #
>>>>>> # This was painful to come up with but the web interface basically allows no option
>>>>>> # and there is no subclass inference by default so such a configuration file is necessary.
>>>>>> #
>>>>>> # The main doc sources are:
>>>>>> #  - https://jena.apache.org/documentation/fuseki2/fuseki-configuration.html
>>>>>> #  - https://jena.apache.org/documentation/assembler/assembler-howto.html
>>>>>> #  - https://jena.apache.org/documentation/assembler/assembler.ttl
>>>>>> #
>>>>>> # See https://jena.apache.org/documentation/fuseki2/fuseki-layout.html for the destination of this file.
>>>>>> 
>>>>>> @prefix fuseki:  <http://jena.apache.org/fuseki#> .
>>>>>> @prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
>>>>>> @prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
>>>>>> @prefix tdb:     <http://jena.hpl.hp.com/2008/tdb#> .
>>>>>> # @prefix tdb2:    <http://jena.apache.org/2016/tdb#> .
>>>>>> @prefix ja:      <http://jena.hpl.hp.com/2005/11/Assembler#> .
>>>>>> @prefix :        <http://base/#> .
>>>>>> @prefix text:    <http://jena.apache.org/text#> .
>>>>>> @prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
>>>>>> @prefix adm:     <http://purl.bdrc.io/ontology/admin/> .
>>>>>> @prefix bdd:     <http://purl.bdrc.io/data/> .
>>>>>> @prefix bdo:     <http://purl.bdrc.io/ontology/core/> .
>>>>>> @prefix bdr:     <http://purl.bdrc.io/resource/> .
>>>>>> @prefix f:       <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.> .
>>>>>> 
>>>>>> ja:loadClass
>>>>>> "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
>>>>>> # [] ja:loadClass "org.seaborne.tdb2.TDB2" .
>>>>>> # tdb2:DatasetTDB2  rdfs:subClassOf  ja:RDFDataset .
>>>>>> # tdb2:GraphTDB2    rdfs:subClassOf  ja:Model .
>>>>>> 
>>>>>> [] rdf:type fuseki:Server ;
>>>>>>     fuseki:services (
>>>>>>       :bdrcrw
>>>>>> #      :bdrcro
>>>>>>     ) .
>>>>>> 
>>>>>> :bdrcrw rdf:type fuseki:Service ;
>>>>>>      fuseki:name                       "bdrcrw" ;   # name of the dataset in the url
>>>>>>      fuseki:serviceQuery               "query" ;    # SPARQL query service
>>>>>>      fuseki:serviceUpdate              "update" ;   # SPARQL update service
>>>>>>      fuseki:serviceUpload              "upload" ;   # Non-SPARQL upload service
>>>>>>      fuseki:serviceReadWriteGraphStore "data" ;     # SPARQL Graph store protocol (read and write)
>>>>>>      fuseki:dataset                    :bdrc_text_dataset ;
>>>>>>      .
>>>>>> 
>>>>>> # :bdrcro rdf:type fuseki:Service ;
>>>>>> #     fuseki:name                     "bdrc" ;
>>>>>> #     fuseki:serviceQuery             "query" ;
>>>>>> #     fuseki:serviceReadGraphStore    "data" ;
>>>>>> #     fuseki:dataset                  :bdrc_text_dataset ;
>>>>>> #     .
>>>>>> 
>>>>>> # using TDB
>>>>>> :dataset_bdrc rdf:type      tdb:DatasetTDB ;
>>>>>>       tdb:location "/etc/fuseki/databases/bdrc" ;
>>>>>>       tdb:unionDefaultGraph true ;
>>>>>>       .
>>>>>> 
>>>>>> # # try using TDB2
>>>>>> # :dataset_bdrc rdf:type      tdb2:DatasetTDB2 ;
>>>>>> #      tdb2:location "/etc/fuseki/databases/bdrc" ;
>>>>>> #      tdb2:unionDefaultGraph true ;
>>>>>> #   .
>>>>>> 
>>>>>> :bdrc_text_dataset rdf:type     text:TextDataset ;
>>>>>>      text:dataset   :dataset_bdrc ;
>>>>>>      text:index     :bdrc_lucene_index ;
>>>>>>      .
>>>>>> 
>>>>>> # Text index description
>>>>>> :bdrc_lucene_index a text:TextIndexLucene ;
>>>>>>      text:directory <file:/etc/fuseki/lucene-bdrc> ;
>>>>>>      text:storeValues true ;
>>>>>>      text:multilingualSupport true ;
>>>>>>      text:entityMap :bdrc_entmap ;
>>>>>>      text:defineAnalyzers (
>>>>>>          [ text:addLang "bo" ;
>>>>>>            text:analyzer [
>>>>>>              a text:GenericAnalyzer ;
>>>>>>              text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
>>>>>>              text:params (
>>>>>>                  [ text:paramName "segmentInWords" ;
>>>>>>                    text:paramType text:TypeBoolean ;
>>>>>>                    text:paramValue false ]
>>>>>>                  [ text:paramName "lemmatize" ;
>>>>>>                    text:paramType text:TypeBoolean ;
>>>>>>                    text:paramValue true ]
>>>>>>                  [ text:paramName "filterChars" ;
>>>>>>                    text:paramType text:TypeBoolean ;
>>>>>>                    text:paramValue false ]
>>>>>>                  [ text:paramName "fromEwts" ;
>>>>>>                    text:paramType text:TypeBoolean ;
>>>>>>                    text:paramValue false ]
>>>>>>                  )
>>>>>>              ] ;
>>>>>>            ]
>>>>>>          [ text:addLang "bo-x-ewts" ;
>>>>>>            text:analyzer [
>>>>>>              a text:GenericAnalyzer ;
>>>>>>              text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
>>>>>>              text:params (
>>>>>>                  [ text:paramName "segmentInWords" ;
>>>>>>                    text:paramType text:TypeBoolean ;
>>>>>>                    text:paramValue false ]
>>>>>>                  [ text:paramName "lemmatize" ;
>>>>>>                    text:paramType text:TypeBoolean ;
>>>>>>                    text:paramValue true ]
>>>>>>                  [ text:paramName "filterChars" ;
>>>>>>                    text:paramType text:TypeBoolean ;
>>>>>>                    text:paramValue false ]
>>>>>>                  [ text:paramName "fromEwts" ;
>>>>>>                    text:paramType text:TypeBoolean ;
>>>>>>                    text:paramValue true ]
>>>>>>                  )
>>>>>>              ] ;
>>>>>>            ]
>>>>>>        ) ;
>>>>>>      .
>>>>>> 
>>>>>> # Index mappings
>>>>>> :bdrc_entmap a text:EntityMap ;
>>>>>>      text:entityField      "uri" ;
>>>>>>      text:uidField         "uid" ;
>>>>>>      text:defaultField     "label" ;
>>>>>>      text:langField        "lang" ;
>>>>>>      text:graphField       "graph" ; ## enable graph-specific
>>>>>> indexing
>>>>>>      text:map (
>>>>>>           [ text:field "label" ;
>>>>>>             text:predicate skos:prefLabel ]
>>>>>>           [ text:field "altLabel" ;
>>>>>>             text:predicate skos:altLabel ; ]
>>>>>>           [ text:field "rdfsLabel" ;
>>>>>>             text:predicate rdfs:label ; ]
>>>>>>           [ text:field "chunkContents" ;
>>>>>>             text:predicate bdo:chunkContents ; ]
>>>>>>           [ text:field "eTextTitle" ;
>>>>>>             text:predicate bdo:eTextTitle ; ]
>>>>>>           [ text:field "logMessage" ;
>>>>>>             text:predicate adm:logMessage ; ]
>>>>>>           [ text:field "noteText" ;
>>>>>>             text:predicate bdo:noteText ; ]
>>>>>>           [ text:field "workAuthorshipStatement" ;
>>>>>>             text:predicate bdo:workAuthorshipStatement ; ]
>>>>>>           [ text:field "workColophon" ;
>>>>>>             text:predicate bdo:workColophon ; ]
>>>>>>           [ text:field "workEditionStatement" ;
>>>>>>             text:predicate bdo:workEditionStatement ; ]
>>>>>>           [ text:field "workPublisherLocation" ;
>>>>>>             text:predicate bdo:workPublisherLocation ; ]
>>>>>>           [ text:field "workPublisherName" ;
>>>>>>             text:predicate bdo:workPublisherName ; ]
>>>>>>           [ text:field "workSeriesName" ;
>>>>>>             text:predicate bdo:workSeriesName ; ]
>>>>>>           ) ;
>>>>>>      .
>>>>>> ###################################################################
>>>>>> 
>>>>>> It would be wonderful to have the java:URI scheme thing
>>>>>> working
>>>>>> (all
>>>>>> custom functions in a single class and method calls done
>>>>>> directly
>>>>>> in
>>>>>> the sparql query : sounds like a dream !)
>>>>>> 
>>>>>> Marc
>>>>>> 
>>>>>> On Tuesday, December 26, 2017 at 13:22 -0500, ajs6f wrote:
>>>>>>> That exception doesn't appear to have anything to do with
>>>>>>> extension
>>>>>>> functions. It indicates a problem between client and
>>>>>>> server.
>>>>>>> 
>>>>>>> Please show at _least_ your actual query execution code,
>>>>>>> your
>>>>>>> complete Fuseki config, and a complete stacktrace.
>>>>>>> 
>>>>>>> 
>>>>>>> ajs6f
>>>>>>> 
>>>>>>>> On Dec 26, 2017, at 1:17 PM, Marc Agate <agate.marc@gmail.com> wrote:
>>>>>>>> 
>>>>>>>> I forgot to mention that according to
>>>>>>>> https://jena.apache.org/documentation/query/java-uri.html
>>>>>>>> I tried for testing purpose to set a PREFIX f:
>>>>>>>> <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.> and added
>>>>>>>> the following to fuseki config :
>>>>>>>> 
>>>>>>>> ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
>>>>>>>> 
>>>>>>>> where CustomARQFunctions is :
>>>>>>>> 
>>>>>>>> public class CustomARQFunctions {
>>>>>>>>     public static NodeValue myFilter(NodeValue value1){
>>>>>>>>         int i = value1.asString().length();
>>>>>>>>         return NodeValue.makeInteger(i);
>>>>>>>>     }
>>>>>>>> }
>>>>>>>> 
>>>>>>>> since according to
>>>>>>>> https://jena.apache.org/documentation/query/writing_functions.html
>>>>>>>> using the java: URI scheme "dynamically loads the code, which must be on
>>>>>>>> the Java classpath. With this scheme, the function URI gives the class
>>>>>>>> name. There is automatic registration of a wrapper into the function
>>>>>>>> registry. This way, no explicit registration step is needed by the
>>>>>>>> application and queries issued with the command line tools can load
>>>>>>>> custom functions."
>>>>>>>> 
>>>>>>>> but no luck: I keep getting the following exception:
>>>>>>>> 
>>>>>>>> Exception in thread "main" HttpException: 404
>>>>>>>> 	at org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.java:328)
>>>>>>>> 	at org.apache.jena.sparql.engine.http.HttpQuery.exec(HttpQuery.java:288)
>>>>>>>> 	at org.apache.jena.sparql.engine.http.QueryEngineHTTP.execResultSetInner(QueryEngineHTTP.java:348)
>>>>>>>> 	at org.apache.jena.sparql.engine.http.QueryEngineHTTP.execSelect(QueryEngineHTTP.java:340)
>>>>>>>> I am stuck !
>>>>>>>> Marc
>>>>>>>> On Tuesday, December 26, 2017 at 18:56 +0100, Marc Agate wrote:
>>>>>>>>> Hi,
>>>>>>>>> Adam gave me the right direction.
>>>>>>>>> I managed to load my function class in the fuseki config using
>>>>>>>>> ja:loadClass,
>>>>>>>>> but now the following issue remains (the function is not
>>>>>>>>> registered), see fuseki logs :
>>>>>>>>> [2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdrc.io/functions#MyFilterFunction> has no registered function factory
>>>>>>>>> How can I register this function now that I have the code
>>>>>>>>> available on the endpoint side ?
>>>>>>>>> Thanks for helping
>>>>>>>>> Marc.
>>>>>>>>> 
>>>>>>>>> On Tuesday, December 26, 2017 at 17:43 +0000, Andy Seaborne wrote:
>>>>>>>>>> As well as Adam's point (and all the libraries your
>>>>>>>>>> function needs, transitively)
>>>>>>>>>> What is in the Fuseki log file? How was the data loaded
>>>>>>>>>> into Fuseki?
>>>>>>>>>>   >> I printed out the >> FunctionRegistry
>>>>>>>>>>       And
>>>>>>>>>> On 26/12/17 14:51, ajs6f wrote:
>>>>>>>>>>> I'm not as familiar with the extension points of ARQ as I
>>>>>>>>>>> would like to be, but as I understand what you are doing, you
>>>>>>>>>>> are registering a new function with your _local_ registry, then
>>>>>>>>>>> firing a query at a _remote_ endpoint (which has a completely
>>>>>>>>>>> independent registry in a different JVM in a different process,
>>>>>>>>>>> potentially in a different _system_).
>>>>>>>>>>> The query is getting interpreted and executed by that
>>>>>>>>>>> remote service, not locally. So you need to register the
>>>>>>>>>>> function _there_.
>>>>>>>>>>> Take a look at this thread:
>>>>>>>>>>> https://lists.apache.org/thread.html/1cda23332af4264883e88697d994605770edcde2f93ddea51240e4b8@%3Cusers.jena.apache.org%3E
>>>>>>>>>>> It should get you started as to how to register
>>>>>>>>>>> extension functionality in Fuseki.
>>>>>>>>>>> 
>>>>>>>>>>> Adam Soroka
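
One way to do that server-side registration is a small class whose static initializer registers the function, loaded from the Fuseki config with ja:loadClass. A minimal sketch, assuming the LevenshteinFilter class from the first message is deployed on the Fuseki classpath (the initializer class name here is illustrative):

***************
import org.apache.jena.sparql.function.FunctionRegistry;

// Hypothetical initializer class: loading it on the server, e.g. with
//   [] ja:loadClass "<fully.qualified.name.of.this.class>" .
// in the Fuseki config, runs this static block, which registers the function
// in the server's own FunctionRegistry.
public class LevenshteinFunctionInit {
    static {
        FunctionRegistry.get().put("http://www.example1.org/LevenshteinFunction",
                                   LevenshteinFilter.class);
    }
}
***************

The alternative, which is what the summary above ends up using, is the java: URI scheme, which avoids the explicit registration step altogether.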


Re: Custom ARQ function not working with fuseki endpoint

Posted by Marc Agate <ag...@gmail.com>.
Hi Adam,

I know about strlen: the part I shared with you guys was just for
testing purposes, as our application is going to require quite a few
very specific filters.
I was just testing the setup (the mechanism) for further "real" custom
function development.

Thanks

Marc



Re: Custom ARQ function not working with fuseki endpoint

Posted by ajs6f <aj...@apache.org>.
I'm glad you got your function working. Now that I look at it, it seems possible that you could use a built-in SPARQL function:

https://www.w3.org/TR/sparql11-query/#func-strlen

Adam Soroka

>>>>      text:langField        "lang" ;
>>>>      text:graphField       "graph" ; ## enable graph-specific
>>>> indexing
>>>>      text:map (
>>>>           [ text:field "label" ;
>>>>             text:predicate skos:prefLabel ]
>>>>           [ text:field "altLabel" ;
>>>>             text:predicate skos:altLabel ; ]
>>>>           [ text:field "rdfsLabel" ;
>>>>             text:predicate rdfs:label ; ]
>>>>           [ text:field "chunkContents" ;
>>>>             text:predicate bdo:chunkContents ; ]
>>>>           [ text:field "eTextTitle" ;
>>>>             text:predicate bdo:eTextTitle ; ]
>>>>           [ text:field "logMessage" ;
>>>>             text:predicate adm:logMessage ; ]
>>>>           [ text:field "noteText" ;
>>>>             text:predicate bdo:noteText ; ]
>>>>           [ text:field "workAuthorshipStatement" ;
>>>>             text:predicate bdo:workAuthorshipStatement ; ]
>>>>           [ text:field "workColophon" ;
>>>>             text:predicate bdo:workColophon ; ]
>>>>           [ text:field "workEditionStatement" ;
>>>>             text:predicate bdo:workEditionStatement ; ]
>>>>           [ text:field "workPublisherLocation" ;
>>>>             text:predicate bdo:workPublisherLocation ; ]
>>>>           [ text:field "workPublisherName" ;
>>>>             text:predicate bdo:workPublisherName ; ]
>>>>           [ text:field "workSeriesName" ;
>>>>             text:predicate bdo:workSeriesName ; ]
>>>>           ) ;
>>>>      .
>>>> #################################################################
>>>> ##
>>>> 
>>>> It would be wonderful to have the java:URI scheme thing working
>>>> (all
>>>> custom functions in a single class and method calls done directly
>>>> in
>>>> the sparql query : sounds like a dream !)
>>>> 
>>>> Marc
>>>> 
>>>> Le mardi 26 décembre 2017 à 13:22 -0500, ajs6f a écrit :
>>>>> That exception doesn't appear to have anything to do with
>>>>> extension
>>>>> functions. It indicates a problem between client and server.
>>>>> 
>>>>> Please show at _least_ your actual query execution code, your
>>>>> complete Fuseki config, and a complete stacktrace.
>>>>> 
>>>>> 
>>>>> ajs6f
>>>>> 
>>>>>> On Dec 26, 2017, at 1:17 PM, Marc Agate <agate.marc@gmail.com
>>>>>>> 
>>>>>> wrote:
>>>>>> 
>>>>>> I forgot to mention that according to
>>>>>> https://jena.apache.org/documentation/query/java-uri.html
>>>>>> I tried for testing purpose to set a PREFIX f:
>>>>>> <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>an
>>>>>> d
>>>>>> added
>>>>>> the following to fuseki config :
>>>>>> ja:loadClass
>>>>>> "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
>>>>>> .
>>>>>> where CustomARQFunctions is :
>>>>>> public class CustomARQFunctions {		public
>>>>>> static
>>>>>> NodeValue myFilter(NodeValue value1){		        
>>>>>> int i
>>>>>> =
>>>>>> value1.asString().length();         return
>>>>>> NodeValue.makeInteger(i);     }
>>>>>> }
>>>>>> since according to
>>>>>> https://jena.apache.org/documentation/query/writing_functions
>>>>>> .html
>>>>>> using the java:URI scheme "dynamically loads the code, which
>>>>>> must
>>>>>> be on
>>>>>> the Java classpath. With this scheme, the function URI gives
>>>>>> the
>>>>>> class
>>>>>> name. There is automatic registration of a wrapper into the
>>>>>> function
>>>>>> registry. This way, no explicit registration step is needed
>>>>>> by the
>>>>>> application and queries issues with the command line tools
>>>>>> can load
>>>>>> custom functions."
>>>>>> but no luck: I keep getting the following exception:
>>>>>> Exception in thread "main" HttpException: 404	at
>>>>>> org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuer
>>>>>> y.java
>>>>>> :328
>>>>>> )	at
>>>>>> org.apache.jena.sparql.engine.http.HttpQuery.exec(HttpQuery.j
>>>>>> ava:28
>>>>>> 8)	
>>>>>> at
>>>>>> org.apache.jena.sparql.engine.http.QueryEngineHTTP.execResult
>>>>>> SetInn
>>>>>> er(Q
>>>>>> ueryEngineHTTP.java:348)	at
>>>>>> org.apache.jena.sparql.engine.http.QueryEngineHTTP.execSelect
>>>>>> (Query
>>>>>> Engi
>>>>>> neHTTP.java:340)
>>>>>> I am stuck !
>>>>>> Marc
>>>>>> Le mardi 26 décembre 2017 à 18:56 +0100, Marc Agate a écrit :
>>>>>>> Hi,
>>>>>>> Adam's gave me the right direction.
>>>>>>> I managed to load my function class in fuseki config using
>>>>>>> ja:loadClass
>>>>>>> but now remains the following issue (the function is not
>>>>>>> registered)seefuseki logs :
>>>>>>> [2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdr
>>>>>>> c.io/f
>>>>>>> unct
>>>>>>> ions#MyFilterFunction> has no registered function factory
>>>>>>> How can I register this function now that I have the code
>>>>>>> available
>>>>>>> onthe endpoint side ?
>>>>>>> Thanks for helping
>>>>>>> Marc.
>>>>>>> 
>>>>>>> Le mardi 26 décembre 2017 à 17:43 +0000, Andy Seaborne a
>>>>>>> écrit :
>>>>>>>> As well s Adam's point (and all the libraries your
>>>>>>>> function
>>>>>>>> needs, transitively)
>>>>>>>> What is in the Fuseki log file?How was the data loaded
>>>>>>>> into
>>>>>>>> Fuseki?
>>>>>>>>   >> I printed out the >> FunctionRegistry
>>>>>>>>       And
>>>>>>>> On 26/12/17 14:51, ajs6f wrote:
>>>>>>>>> I'm not as familiar with the extension points of ARQ as
>>>>>>>>> I
>>>>>>>>> wouldlike to be, but as I understand what you are
>>>>>>>>> doing, you
>>>>>>>>> areregistering a new function with your _local_
>>>>>>>>> registry,
>>>>>>>>> then
>>>>>>>>> firinga query at a _remote_ endpoint (which has a
>>>>>>>>> completely
>>>>>>>>> independentregistry in a different JVM in a different
>>>>>>>>> process,
>>>>>>>>> potentially ina different _system_).
>>>>>>>>> The query is getting interpreted and executed by that
>>>>>>>>> remoteservice, not locally. So you need to register the
>>>>>>>>> function
>>>>>>>>> _there_.
>>>>>>>>> Take a look at this thread:
>>>>>>>>> https://lists.apache.org/thread.html/1cda23332af4264883
>>>>>>>>> e88697
>>>>>>>>> d994
>>>>>>>>> 605770edcde2f93ddea51240e4b8@%3Cusers.jena.apache.org%3
>>>>>>>>> E
>>>>>>>>> It should get you started as to how to register
>>>>>>>>> extensionfunctionality in Fuseki.
>>>>>>>>> 
>>>>>>>>> Adam Soroka
>>>>>>>>>> On Dec 26, 2017, at 9:34 AM, Marc Agate <agate.marc@g
>>>>>>>>>> mail.c
>>>>>>>>>> om>
>>>>>>>>>> wrote:
>>>>>>>>>> 
>>>>>>>>>> Hi !
>>>>>>>>>> 
>>>>>>>>>> I successfully implemented sparql queries using
>>>>>>>>>> custom ARQ
>>>>>>>>>> functions
>>>>>>>>>> using the following (custom function code):
>>>>>>>>>> 
>>>>>>>>>> ****************
>>>>>>>>>> public class LevenshteinFilter extends FunctionBase2
>>>>>>>>>> {
>>>>>>>>>> 
>>>>>>>>>>       public LevenshteinFilter() { super() ; }
>>>>>>>>>> 
>>>>>>>>>>       public NodeValue exec(NodeValue value1,
>>>>>>>>>> NodeValue
>>>>>>>>>> value2){
>>>>>>>>>>           LevenshteinDistance LD=new
>>>>>>>>>> LevenshteinDistance();
>>>>>>>>>>           int i = LD.apply(value1.asString(),
>>>>>>>>>> value2.asString());
>>>>>>>>>>           return NodeValue.makeInteger(i);
>>>>>>>>>>       }
>>>>>>>>>> }
>>>>>>>>>> ***************
>>>>>>>>>> 
>>>>>>>>>> it works fine when I query against a Model loaded
>>>>>>>>>> from a
>>>>>>>>>> turtle
>>>>>>>>>> file,
>>>>>>>>>> like this:
>>>>>>>>>> 
>>>>>>>>>> ***************
>>>>>>>>>> InputStream input =
>>>>>>>>>> QueryProcessor.class.getClassLoader().getResourceAsSt
>>>>>>>>>> ream("
>>>>>>>>>> full
>>>>>>>>>> .t
>>>>>>>>>> tl");
>>>>>>>>>>               model =
>>>>>>>>>> ModelFactory.createMemModelMaker().createModel("defau
>>>>>>>>>> lt");
>>>>>>>>>>               model.read(input,null,"TURTLE"); //
>>>>>>>>>> null base
>>>>>>>>>> URI,
>>>>>>>>>> since
>>>>>>>>>> model URIs are absolute
>>>>>>>>>>               input.close();
>>>>>>>>>> ***************
>>>>>>>>>> 
>>>>>>>>>> with the query being sent like this :
>>>>>>>>>> 
>>>>>>>>>> ***************
>>>>>>>>>> String functionUri = "http://www.example1.org/Levensh
>>>>>>>>>> teinFu
>>>>>>>>>> ncti
>>>>>>>>>> on
>>>>>>>>>> ";
>>>>>>>>>>           FunctionRegistry.get().put(functionUri ,
>>>>>>>>>> LevenshteinFilter.class);
>>>>>>>>>> 
>>>>>>>>>>           String s = "whatever you want";
>>>>>>>>>>           String sparql = prefixes+" SELECT DISTINCT
>>>>>>>>>> ?l
>>>>>>>>>> WHERE {
>>>>>>>>>> ?x
>>>>>>>>>> rdfs:label ?l . "
>>>>>>>>>> +  "FILTER(fct:LevenshteinFunction(?l,
>>>>>>>>>> \"" +
>>>>>>>>>> s
>>>>>>>>>> + "\")
>>>>>>>>>> < 4) }";
>>>>>>>>>>           Query query = QueryFactory.create(sparql);
>>>>>>>>>>           QueryExecution qexec =
>>>>>>>>>> QueryExecutionFactory.create(query,
>>>>>>>>>> model);
>>>>>>>>>>           ResultSet rs = qexec.execSelect();
>>>>>>>>>> ***************
>>>>>>>>>> 
>>>>>>>>>> However, if i use a working fuseki endpoint for the
>>>>>>>>>> same
>>>>>>>>>> dataset
>>>>>>>>>> (full.ttl) like this :
>>>>>>>>>> 
>>>>>>>>>> ***************
>>>>>>>>>> fusekiUrl="http://localhost:3030/ds/query";
>>>>>>>>>> ***************
>>>>>>>>>> 
>>>>>>>>>> sending the query like this (using
>>>>>>>>>> QueryExecutionFactory.sparqlService(fusekiUrl,query)
>>>>>>>>>> instead of
>>>>>>>>>> QueryExecutionFactory.create(query,model) ):
>>>>>>>>>> 
>>>>>>>>>> ***************
>>>>>>>>>> String functionUri = "http://www.example1.org/Levensh
>>>>>>>>>> teinFu
>>>>>>>>>> ncti
>>>>>>>>>> on
>>>>>>>>>> ";
>>>>>>>>>>           FunctionRegistry.get().put(functionUri ,
>>>>>>>>>> LevenshteinFilter.class);
>>>>>>>>>> 
>>>>>>>>>>           String s = "whatever you want";
>>>>>>>>>>           String sparql = prefixes+" SELECT DISTINCT
>>>>>>>>>> ?l
>>>>>>>>>> WHERE {
>>>>>>>>>> ?x
>>>>>>>>>> rdfs:label ?l . " +
>>>>>>>>>> "FILTER(fct:LevenshteinFunction(?l, \""
>>>>>>>>>> + s
>>>>>>>>>> +
>>>>>>>>>> "\")
>>>>>>>>>> < 4) }";
>>>>>>>>>>           Query query = QueryFactory.create(sparql);
>>>>>>>>>>           QueryExecution qexec =
>>>>>>>>>> QueryExecutionFactory.sparqlService(fusekiUrl,query);
>>>>>>>>>>           ResultSet rs = qexec.execSelect();
>>>>>>>>>> ***************
>>>>>>>>>> 
>>>>>>>>>> Then I don't get any results back. In both cases I
>>>>>>>>>> printed
>>>>>>>>>> out
>>>>>>>>>> the
>>>>>>>>>> FunctionRegistry and they contain exactly the same
>>>>>>>>>> entries,
>>>>>>>>>> especially
>>>>>>>>>> :
>>>>>>>>>> 
>>>>>>>>>> key=http://www.example1.org/LevenshteinFunction
>>>>>>>>>> value:
>>>>>>>>>> org.apache.jena.sparql.function.FunctionFactoryAuto@5
>>>>>>>>>> a45133
>>>>>>>>>> e
>>>>>>>>>> 
>>>>>>>>>> Any clue ?
>>>>>>>>>> 
>>>>>>>>>> Thanks
>>>>> 
>>>>> 


Re: Custom ARQ function not working with fuseki endpoint

Posted by Marc Agate <ag...@gmail.com>.
Hi,

I finally got the whole test working properly.

To summarize:

1) Deploy the custom function classes on the Fuseki server.
2) Modify the Fuseki config by adding:

[] ja:loadClass "io.bdrc.ldsearch.query.functions.MyFilter" .

where the MyFilter implementation is:

import org.apache.jena.sparql.expr.NodeValue;
import org.apache.jena.sparql.function.FunctionBase1;

// One-argument ARQ function: returns the length of the argument's lexical form.
public class MyFilter extends FunctionBase1 {

    public MyFilter() { super(); }

    @Override
    public NodeValue exec(NodeValue value1) {
        int d = value1.asString().length();
        return NodeValue.makeInteger(d);   // makeInteger(long) - no boxing needed
    }
}

3) Add the following prefix to the query context:

PREFIX f: <java:io.bdrc.ldsearch.query.functions.>

Note that "io.bdrc.ldsearch.query.functions" is the package of the MyFilter
class, not the class itself. This means you never call a class method from a
SPARQL query; you reference a class that extends
org.apache.jena.sparql.function.FunctionBaseX, where X is the number of
arguments of your filter function.

4) Write (for instance) the query like this:

SELECT DISTINCT ?l
WHERE { ?x skos:prefLabel ?l .  
  FILTER (f:MyFilter(?l) < 20) 
}
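
For completeness, here is a minimal client-side sketch of how such a query could be sent to the endpoint. This is only a sketch: the endpoint URL below is an assumption pieced together from the dataset name "bdrcrw", the "query" service and the /fuseki Tomcat context path mentioned in this thread (a standalone Fuseki would typically be http://localhost:3030/bdrcrw/query), and the MyFilterClient wrapper class exists only for the example.

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QuerySolution;
import org.apache.jena.query.ResultSet;

// Illustrative wrapper class for the sketch.
public class MyFilterClient {
    public static void main(String[] args) {
        // Assumed endpoint URL: adjust host, port and context path to your deployment.
        String fusekiUrl = "http://localhost:13180/fuseki/bdrcrw/query";

        // The f: prefix points at the package; MyFilter is resolved and executed on the server.
        String sparql =
            "PREFIX skos: <http://www.w3.org/2004/02/skos/core#> " +
            "PREFIX f: <java:io.bdrc.ldsearch.query.functions.> " +
            "SELECT DISTINCT ?l WHERE { ?x skos:prefLabel ?l . FILTER (f:MyFilter(?l) < 20) }";

        try (QueryExecution qexec = QueryExecutionFactory.sparqlService(fusekiUrl, sparql)) {
            ResultSet rs = qexec.execSelect();
            while (rs.hasNext()) {
                QuerySolution row = rs.next();
                System.out.println(row.getLiteral("l"));
            }
        }
    }
}

Since the filter is evaluated on the server, nothing has to be registered in the client JVM; the client only ships the query text.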

Thanks to Adam and Andy

Marc

On Tuesday, December 26, 2017 at 19:44 +0000, Andy Seaborne wrote:
> > Exception in thread "main" HttpException: 404	at
> > org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.java
> > :328
> > )	
> 
> 404 - HTTP not found - when trying to call Fuseki - i.e. the query 
> service URL is wrong. A server is running at the address (host:port)
> but 
> the path does not name an HTTP resource.
> 
>      Andy
> 
> 
> On 26/12/17 19:00, ajs6f wrote:
> > I don't understand how that config is getting parsed at all. It's
> > not valid Turtle.
> > 
> > > ja:loadClass
> > > "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
> > 
> > is not a triple at all. It should probably be:
> > 
> > [] ja:loadClass
> > "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
> > 
> > Riot gives "Expected IRI for predicate: got:
> > [STRING:io.bdrc.ldsearch.query.functions.CustomARQFunctions]"
> > 
> > It's not clear to me how you could be successfully loading your
> > extension function with invalid config.
> > 
> > Adam Soroka
> > 
> > 
> > > On Dec 26, 2017, at 1:42 PM, Marc Agate <ag...@gmail.com>
> > > wrote:
> > > 
> > > Well...
> > > 
> > > Here is the query
> > > 
> > > PREFIX : <http://purl.bdrc.io/ontology/core/>
> > > PREFIX adm: <http://purl.bdrc.io/ontology/admin/>
> > > PREFIX bdr: <http://purl.bdrc.io/resource/>
> > > PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
> > > PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
> > > PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
> > > PREFIX tbr: <http://purl.bdrc.io/ontology/toberemoved/>
> > > PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
> > > PREFIX f:
> > > <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
> > > 
> > > SELECT DISTINCT ?l
> > > WHERE {
> > > ?x skos:prefLabel ?l .
> > > FILTER (f:myFilter(?l) < 100)
> > > }
> > > 
> > > Note that whene I change FILTER (f:myFilter(?l) < 100)  to FILTER
> > > (STRLEN(?l) < 100), I don't get the 404 exception...
> > > Therefore it's not a connection issue. I thing it's more like a
> > > unresolved URI or so.
> > > 
> > > Now, since you'are asking for it, here is the full fuseki config:
> > > 
> > > ################################################################
> > > # Fuseki configuration for BDRC, configures two endpoints:
> > > #   - /bdrc is read-only
> > > #   - /bdrcrw is read-write
> > > #
> > > # This was painful to come up with but the web interface
> > > basically
> > > allows no option
> > > # and there is no subclass inference by default so such a
> > > configuration
> > > file is necessary.
> > > #
> > > # The main doc sources are:
> > > #  - https://jena.apache.org/documentation/fuseki2/fuseki-configu
> > > ration
> > > .html
> > > #  - https://jena.apache.org/documentation/assembler/assembler-ho
> > > wto.ht
> > > ml
> > > #  - https://jena.apache.org/documentation/assembler/assembler.tt
> > > l
> > > #
> > > # See https://jena.apache.org/documentation/fuseki2/fuseki-layout
> > > .html
> > > for the destination of this file.
> > > 
> > > @prefix fuseki:  <http://jena.apache.org/fuseki#> .
> > > @prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
> > > @prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
> > > @prefix tdb:     <http://jena.hpl.hp.com/2008/tdb#> .
> > > # @prefix tdb2:    <http://jena.apache.org/2016/tdb#> .
> > > @prefix ja:      <http://jena.hpl.hp.com/2005/11/Assembler#> .
> > > @prefix :        <http://base/#> .
> > > @prefix text:    <http://jena.apache.org/text#> .
> > > @prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
> > > @prefix adm:     <http://purl.bdrc.io/ontology/admin/> .
> > > @prefix bdd:     <http://purl.bdrc.io/data/> .
> > > @prefix bdo:     <http://purl.bdrc.io/ontology/core/> .
> > > @prefix bdr:     <http://purl.bdrc.io/resource/> .
> > > @prefix f:
> > > <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
> > > .
> > > 
> > > ja:loadClass
> > > "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
> > > # [] ja:loadClass "org.seaborne.tdb2.TDB2" .
> > > # tdb2:DatasetTDB2  rdfs:subClassOf  ja:RDFDataset .
> > > # tdb2:GraphTDB2    rdfs:subClassOf  ja:Model .
> > > 
> > > [] rdf:type fuseki:Server ;
> > >     fuseki:services (
> > >       :bdrcrw
> > > #      :bdrcro
> > >     ) .
> > > 
> > > :bdrcrw rdf:type fuseki:Service ;
> > >      fuseki:name                       "bdrcrw" ;     # name of
> > > the
> > > dataset in the url
> > >      fuseki:serviceQuery               "query" ;    # SPARQL
> > > query
> > > service
> > >      fuseki:serviceUpdate              "update" ;   # SPARQL
> > > update
> > > service
> > >      fuseki:serviceUpload              "upload" ;   # Non-SPARQL
> > > upload
> > > service
> > >      fuseki:serviceReadWriteGraphStore "data" ;     # SPARQL
> > > Graph store
> > > protocol (read and write)
> > >      fuseki:dataset                    :bdrc_text_dataset ;
> > >      .
> > > 
> > > # :bdrcro rdf:type fuseki:Service ;
> > > #     fuseki:name                     "bdrc" ;
> > > #     fuseki:serviceQuery             "query" ;
> > > #     fuseki:serviceReadGraphStore    "data" ;
> > > #     fuseki:dataset           		:bdrc_text_dataset
> > > ;
> > > #     .
> > > 
> > > # using TDB
> > > :dataset_bdrc rdf:type      tdb:DatasetTDB ;
> > >       tdb:location "/etc/fuseki/databases/bdrc" ;
> > >       tdb:unionDefaultGraph true ;
> > >       .
> > > 
> > > # # try using TDB2
> > > # :dataset_bdrc rdf:type      tdb2:DatasetTDB2 ;
> > > #      tdb2:location "/etc/fuseki/databases/bdrc" ;
> > > #      tdb2:unionDefaultGraph true ;
> > > #   .
> > > 
> > > :bdrc_text_dataset rdf:type     text:TextDataset ;
> > >      text:dataset   :dataset_bdrc ;
> > >      text:index     :bdrc_lucene_index ;
> > >      .
> > > 
> > > # Text index description
> > > :bdrc_lucene_index a text:TextIndexLucene ;
> > >      text:directory <file:/etc/fuseki/lucene-bdrc> ;
> > >      text:storeValues true ;
> > >      text:multilingualSupport true ;
> > >      text:entityMap :bdrc_entmap ;
> > >      text:defineAnalyzers (
> > >          [ text:addLang "bo" ;
> > >            text:analyzer [
> > >              a text:GenericAnalyzer ;
> > >              text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
> > >              text:params (
> > >                  [ text:paramName "segmentInWords" ;
> > >                    text:paramType text:TypeBoolean ;
> > >                    text:paramValue false ]
> > >                  [ text:paramName "lemmatize" ;
> > >                    text:paramType text:TypeBoolean ;
> > >                    text:paramValue true ]
> > >                  [ text:paramName "filterChars" ;
> > >                    text:paramType text:TypeBoolean ;
> > >                    text:paramValue false ]
> > >                  [ text:paramName "fromEwts" ;
> > >                    text:paramType text:TypeBoolean ;
> > >                    text:paramValue false ]
> > >                  )
> > >              ] ;
> > >            ]
> > >          [ text:addLang "bo-x-ewts" ;
> > >            text:analyzer [
> > >              a text:GenericAnalyzer ;
> > >              text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
> > >              text:params (
> > >                  [ text:paramName "segmentInWords" ;
> > >                    text:paramType text:TypeBoolean ;
> > >                    text:paramValue false ]
> > >                  [ text:paramName "lemmatize" ;
> > >                    text:paramType text:TypeBoolean ;
> > >                    text:paramValue true ]
> > >                  [ text:paramName "filterChars" ;
> > >                    text:paramType text:TypeBoolean ;
> > >                    text:paramValue false ]
> > >                  [ text:paramName "fromEwts" ;
> > >                    text:paramType text:TypeBoolean ;
> > >                    text:paramValue true ]
> > >                  )
> > >              ] ;
> > >            ]
> > >        ) ;
> > >      .
> > > 
> > > # Index mappings
> > > :bdrc_entmap a text:EntityMap ;
> > >      text:entityField      "uri" ;
> > >      text:uidField         "uid" ;
> > >      text:defaultField     "label" ;
> > >      text:langField        "lang" ;
> > >      text:graphField       "graph" ; ## enable graph-specific
> > > indexing
> > >      text:map (
> > >           [ text:field "label" ;
> > >             text:predicate skos:prefLabel ]
> > >           [ text:field "altLabel" ;
> > >             text:predicate skos:altLabel ; ]
> > >           [ text:field "rdfsLabel" ;
> > >             text:predicate rdfs:label ; ]
> > >           [ text:field "chunkContents" ;
> > >             text:predicate bdo:chunkContents ; ]
> > >           [ text:field "eTextTitle" ;
> > >             text:predicate bdo:eTextTitle ; ]
> > >           [ text:field "logMessage" ;
> > >             text:predicate adm:logMessage ; ]
> > >           [ text:field "noteText" ;
> > >             text:predicate bdo:noteText ; ]
> > >           [ text:field "workAuthorshipStatement" ;
> > >             text:predicate bdo:workAuthorshipStatement ; ]
> > >           [ text:field "workColophon" ;
> > >             text:predicate bdo:workColophon ; ]
> > >           [ text:field "workEditionStatement" ;
> > >             text:predicate bdo:workEditionStatement ; ]
> > >           [ text:field "workPublisherLocation" ;
> > >             text:predicate bdo:workPublisherLocation ; ]
> > >           [ text:field "workPublisherName" ;
> > >             text:predicate bdo:workPublisherName ; ]
> > >           [ text:field "workSeriesName" ;
> > >             text:predicate bdo:workSeriesName ; ]
> > >           ) ;
> > >      .
> > > #################################################################
> > > ##
> > > 
> > > It would be wonderful to have the java:URI scheme thing working
> > > (all
> > > custom functions in a single class and method calls done directly
> > > in
> > > the sparql query : sounds like a dream !)
> > > 
> > > Marc
> > > 
> > > Le mardi 26 décembre 2017 à 13:22 -0500, ajs6f a écrit :
> > > > That exception doesn't appear to have anything to do with
> > > > extension
> > > > functions. It indicates a problem between client and server.
> > > > 
> > > > Please show at _least_ your actual query execution code, your
> > > > complete Fuseki config, and a complete stacktrace.
> > > > 
> > > > 
> > > > ajs6f
> > > > 
> > > > > On Dec 26, 2017, at 1:17 PM, Marc Agate <agate.marc@gmail.com
> > > > > >
> > > > > wrote:
> > > > > 
> > > > > I forgot to mention that according to
> > > > > https://jena.apache.org/documentation/query/java-uri.html
> > > > > I tried for testing purpose to set a PREFIX f:
> > > > > <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>an
> > > > > d
> > > > > added
> > > > > the following to fuseki config :
> > > > > ja:loadClass
> > > > > "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
> > > > > .
> > > > > where CustomARQFunctions is :
> > > > > public class CustomARQFunctions {		public
> > > > > static
> > > > > NodeValue myFilter(NodeValue value1){		        
> > > > > int i
> > > > > =
> > > > > value1.asString().length();         return
> > > > > NodeValue.makeInteger(i);     }
> > > > > }
> > > > > since according to
> > > > > https://jena.apache.org/documentation/query/writing_functions
> > > > > .html
> > > > > using the java:URI scheme "dynamically loads the code, which
> > > > > must
> > > > > be on
> > > > > the Java classpath. With this scheme, the function URI gives
> > > > > the
> > > > > class
> > > > > name. There is automatic registration of a wrapper into the
> > > > > function
> > > > > registry. This way, no explicit registration step is needed
> > > > > by the
> > > > > application and queries issues with the command line tools
> > > > > can load
> > > > > custom functions."
> > > > > but no luck: I keep getting the following exception:
> > > > > Exception in thread "main" HttpException: 404	at
> > > > > org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuer
> > > > > y.java
> > > > > :328
> > > > > )	at
> > > > > org.apache.jena.sparql.engine.http.HttpQuery.exec(HttpQuery.j
> > > > > ava:28
> > > > > 8)	
> > > > > at
> > > > > org.apache.jena.sparql.engine.http.QueryEngineHTTP.execResult
> > > > > SetInn
> > > > > er(Q
> > > > > ueryEngineHTTP.java:348)	at
> > > > > org.apache.jena.sparql.engine.http.QueryEngineHTTP.execSelect
> > > > > (Query
> > > > > Engi
> > > > > neHTTP.java:340)
> > > > > I am stuck !
> > > > > Marc
> > > > > Le mardi 26 décembre 2017 à 18:56 +0100, Marc Agate a écrit :
> > > > > > Hi,
> > > > > > Adam's gave me the right direction.
> > > > > > I managed to load my function class in fuseki config using
> > > > > > ja:loadClass
> > > > > > but now remains the following issue (the function is not
> > > > > > registered)seefuseki logs :
> > > > > > [2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdr
> > > > > > c.io/f
> > > > > > unct
> > > > > > ions#MyFilterFunction> has no registered function factory
> > > > > > How can I register this function now that I have the code
> > > > > > available
> > > > > > onthe endpoint side ?
> > > > > > Thanks for helping
> > > > > > Marc.
> > > > > > 
> > > > > > Le mardi 26 décembre 2017 à 17:43 +0000, Andy Seaborne a
> > > > > > écrit :
> > > > > > > As well s Adam's point (and all the libraries your
> > > > > > > function
> > > > > > > needs, transitively)
> > > > > > > What is in the Fuseki log file?How was the data loaded
> > > > > > > into
> > > > > > > Fuseki?
> > > > > > >   >> I printed out the >> FunctionRegistry
> > > > > > >       And
> > > > > > > On 26/12/17 14:51, ajs6f wrote:
> > > > > > > > I'm not as familiar with the extension points of ARQ as
> > > > > > > > I
> > > > > > > > wouldlike to be, but as I understand what you are
> > > > > > > > doing, you
> > > > > > > > areregistering a new function with your _local_
> > > > > > > > registry,
> > > > > > > > then
> > > > > > > > firinga query at a _remote_ endpoint (which has a
> > > > > > > > completely
> > > > > > > > independentregistry in a different JVM in a different
> > > > > > > > process,
> > > > > > > > potentially ina different _system_).
> > > > > > > > The query is getting interpreted and executed by that
> > > > > > > > remoteservice, not locally. So you need to register the
> > > > > > > > function
> > > > > > > > _there_.
> > > > > > > > Take a look at this thread:
> > > > > > > > https://lists.apache.org/thread.html/1cda23332af4264883
> > > > > > > > e88697
> > > > > > > > d994
> > > > > > > > 605770edcde2f93ddea51240e4b8@%3Cusers.jena.apache.org%3
> > > > > > > > E
> > > > > > > > It should get you started as to how to register
> > > > > > > > extensionfunctionality in Fuseki.
> > > > > > > > 
> > > > > > > > Adam Soroka
> > > > > > > > > On Dec 26, 2017, at 9:34 AM, Marc Agate <agate.marc@g
> > > > > > > > > mail.c
> > > > > > > > > om>
> > > > > > > > > wrote:
> > > > > > > > > 
> > > > > > > > > Hi !
> > > > > > > > > 
> > > > > > > > > I successfully implemented sparql queries using
> > > > > > > > > custom ARQ
> > > > > > > > > functions
> > > > > > > > > using the following (custom function code):
> > > > > > > > > 
> > > > > > > > > ****************
> > > > > > > > > public class LevenshteinFilter extends FunctionBase2
> > > > > > > > > {
> > > > > > > > > 
> > > > > > > > >       public LevenshteinFilter() { super() ; }
> > > > > > > > > 
> > > > > > > > >       public NodeValue exec(NodeValue value1,
> > > > > > > > > NodeValue
> > > > > > > > > value2){
> > > > > > > > >           LevenshteinDistance LD=new
> > > > > > > > > LevenshteinDistance();
> > > > > > > > >           int i = LD.apply(value1.asString(),
> > > > > > > > > value2.asString());
> > > > > > > > >           return NodeValue.makeInteger(i);
> > > > > > > > >       }
> > > > > > > > > }
> > > > > > > > > ***************
> > > > > > > > > 
> > > > > > > > > it works fine when I query against a Model loaded
> > > > > > > > > from a
> > > > > > > > > turtle
> > > > > > > > > file,
> > > > > > > > > like this:
> > > > > > > > > 
> > > > > > > > > ***************
> > > > > > > > > InputStream input =
> > > > > > > > > QueryProcessor.class.getClassLoader().getResourceAsSt
> > > > > > > > > ream("
> > > > > > > > > full
> > > > > > > > > .t
> > > > > > > > > tl");
> > > > > > > > >               model =
> > > > > > > > > ModelFactory.createMemModelMaker().createModel("defau
> > > > > > > > > lt");
> > > > > > > > >               model.read(input,null,"TURTLE"); //
> > > > > > > > > null base
> > > > > > > > > URI,
> > > > > > > > > since
> > > > > > > > > model URIs are absolute
> > > > > > > > >               input.close();
> > > > > > > > > ***************
> > > > > > > > > 
> > > > > > > > > with the query being sent like this :
> > > > > > > > > 
> > > > > > > > > ***************
> > > > > > > > > String functionUri = "http://www.example1.org/Levensh
> > > > > > > > > teinFu
> > > > > > > > > ncti
> > > > > > > > > on
> > > > > > > > > ";
> > > > > > > > >           FunctionRegistry.get().put(functionUri ,
> > > > > > > > > LevenshteinFilter.class);
> > > > > > > > > 
> > > > > > > > >           String s = "whatever you want";
> > > > > > > > >           String sparql = prefixes+" SELECT DISTINCT
> > > > > > > > > ?l
> > > > > > > > > WHERE {
> > > > > > > > > ?x
> > > > > > > > > rdfs:label ?l . "
> > > > > > > > > +  "FILTER(fct:LevenshteinFunction(?l,
> > > > > > > > > \"" +
> > > > > > > > > s
> > > > > > > > > + "\")
> > > > > > > > > < 4) }";
> > > > > > > > >           Query query = QueryFactory.create(sparql);
> > > > > > > > >           QueryExecution qexec =
> > > > > > > > > QueryExecutionFactory.create(query,
> > > > > > > > > model);
> > > > > > > > >           ResultSet rs = qexec.execSelect();
> > > > > > > > > ***************
> > > > > > > > > 
> > > > > > > > > However, if i use a working fuseki endpoint for the
> > > > > > > > > same
> > > > > > > > > dataset
> > > > > > > > > (full.ttl) like this :
> > > > > > > > > 
> > > > > > > > > ***************
> > > > > > > > > fusekiUrl="http://localhost:3030/ds/query";
> > > > > > > > > ***************
> > > > > > > > > 
> > > > > > > > > sending the query like this (using
> > > > > > > > > QueryExecutionFactory.sparqlService(fusekiUrl,query)
> > > > > > > > > instead of
> > > > > > > > > QueryExecutionFactory.create(query,model) ):
> > > > > > > > > 
> > > > > > > > > ***************
> > > > > > > > > String functionUri = "http://www.example1.org/Levensh
> > > > > > > > > teinFu
> > > > > > > > > ncti
> > > > > > > > > on
> > > > > > > > > ";
> > > > > > > > >           FunctionRegistry.get().put(functionUri ,
> > > > > > > > > LevenshteinFilter.class);
> > > > > > > > > 
> > > > > > > > >           String s = "whatever you want";
> > > > > > > > >           String sparql = prefixes+" SELECT DISTINCT
> > > > > > > > > ?l
> > > > > > > > > WHERE {
> > > > > > > > > ?x
> > > > > > > > > rdfs:label ?l . " +
> > > > > > > > > "FILTER(fct:LevenshteinFunction(?l, \""
> > > > > > > > > + s
> > > > > > > > > +
> > > > > > > > > "\")
> > > > > > > > > < 4) }";
> > > > > > > > >           Query query = QueryFactory.create(sparql);
> > > > > > > > >           QueryExecution qexec =
> > > > > > > > > QueryExecutionFactory.sparqlService(fusekiUrl,query);
> > > > > > > > >           ResultSet rs = qexec.execSelect();
> > > > > > > > > ***************
> > > > > > > > > 
> > > > > > > > > Then I don't get any results back. In both cases I
> > > > > > > > > printed
> > > > > > > > > out
> > > > > > > > > the
> > > > > > > > > FunctionRegistry and they contain exactly the same
> > > > > > > > > entries,
> > > > > > > > > especially
> > > > > > > > > :
> > > > > > > > > 
> > > > > > > > > key=http://www.example1.org/LevenshteinFunction
> > > > > > > > > value:
> > > > > > > > > org.apache.jena.sparql.function.FunctionFactoryAuto@5
> > > > > > > > > a45133
> > > > > > > > > e
> > > > > > > > > 
> > > > > > > > > Any clue ?
> > > > > > > > > 
> > > > > > > > > Thanks
> > > > 
> > > > 

Re: Custom ARQ function not working with fuseki endpoint

Posted by Andy Seaborne <an...@apache.org>.
> Exception in thread "main" HttpException: 404	at
> org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.java:328
> )	

404 - HTTP not found - when trying to call Fuseki - i.e. the query 
service URL is wrong. A server is running at the address (host:port) but 
the path does not name an HTTP resource.
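
A quick way to rule the service URL in or out, independently of any custom function, is to send a trivial ASK query to it. The sketch below is only illustrative: the host, port and context path are guesses based on the config and Tomcat log elsewhere in this thread, and the EndpointCheck wrapper class exists only for the example.

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;

// Illustrative wrapper class for the sketch.
public class EndpointCheck {
    public static void main(String[] args) {
        // Assumed URL: Fuseki war under Tomcat context /fuseki, dataset "bdrcrw",
        // query service "query". Adjust to the actual deployment.
        String fusekiUrl = "http://localhost:13180/fuseki/bdrcrw/query";

        // "ASK {}" needs no data and no extension functions, so a 404 here
        // means the path is wrong, not the query.
        try (QueryExecution qexec = QueryExecutionFactory.sparqlService(fusekiUrl, "ASK {}")) {
            System.out.println("ASK returned: " + qexec.execAsk());
        }
    }
}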

     Andy


On 26/12/17 19:00, ajs6f wrote:
> I don't understand how that config is getting parsed at all. It's not valid Turtle.
> 
>> ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
> 
> is not a triple at all. It should probably be:
> 
> [] ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
> 
> Riot gives "Expected IRI for predicate: got: [STRING:io.bdrc.ldsearch.query.functions.CustomARQFunctions]"
> 
> It's not clear to me how you could be successfully loading your extension function with invalid config.
> 
> Adam Soroka
> 
> 
>> On Dec 26, 2017, at 1:42 PM, Marc Agate <ag...@gmail.com> wrote:
>>
>> Well...
>>
>> Here is the query
>>
>> PREFIX : <http://purl.bdrc.io/ontology/core/>
>> PREFIX adm: <http://purl.bdrc.io/ontology/admin/>
>> PREFIX bdr: <http://purl.bdrc.io/resource/>
>> PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
>> PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
>> PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
>> PREFIX tbr: <http://purl.bdrc.io/ontology/toberemoved/>
>> PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
>> PREFIX f: <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
>>
>> SELECT DISTINCT ?l
>> WHERE {
>> ?x skos:prefLabel ?l .
>> FILTER (f:myFilter(?l) < 100)
>> }
>>
>> Note that whene I change FILTER (f:myFilter(?l) < 100)  to FILTER
>> (STRLEN(?l) < 100), I don't get the 404 exception...
>> Therefore it's not a connection issue. I thing it's more like a
>> unresolved URI or so.
>>
>> Now, since you'are asking for it, here is the full fuseki config:
>>
>> ################################################################
>> # Fuseki configuration for BDRC, configures two endpoints:
>> #   - /bdrc is read-only
>> #   - /bdrcrw is read-write
>> #
>> # This was painful to come up with but the web interface basically
>> allows no option
>> # and there is no subclass inference by default so such a configuration
>> file is necessary.
>> #
>> # The main doc sources are:
>> #  - https://jena.apache.org/documentation/fuseki2/fuseki-configuration
>> .html
>> #  - https://jena.apache.org/documentation/assembler/assembler-howto.ht
>> ml
>> #  - https://jena.apache.org/documentation/assembler/assembler.ttl
>> #
>> # See https://jena.apache.org/documentation/fuseki2/fuseki-layout.html
>> for the destination of this file.
>>
>> @prefix fuseki:  <http://jena.apache.org/fuseki#> .
>> @prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
>> @prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
>> @prefix tdb:     <http://jena.hpl.hp.com/2008/tdb#> .
>> # @prefix tdb2:    <http://jena.apache.org/2016/tdb#> .
>> @prefix ja:      <http://jena.hpl.hp.com/2005/11/Assembler#> .
>> @prefix :        <http://base/#> .
>> @prefix text:    <http://jena.apache.org/text#> .
>> @prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
>> @prefix adm:     <http://purl.bdrc.io/ontology/admin/> .
>> @prefix bdd:     <http://purl.bdrc.io/data/> .
>> @prefix bdo:     <http://purl.bdrc.io/ontology/core/> .
>> @prefix bdr:     <http://purl.bdrc.io/resource/> .
>> @prefix f: <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
>> .
>>
>> ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
>> # [] ja:loadClass "org.seaborne.tdb2.TDB2" .
>> # tdb2:DatasetTDB2  rdfs:subClassOf  ja:RDFDataset .
>> # tdb2:GraphTDB2    rdfs:subClassOf  ja:Model .
>>
>> [] rdf:type fuseki:Server ;
>>     fuseki:services (
>>       :bdrcrw
>> #      :bdrcro
>>     ) .
>>
>> :bdrcrw rdf:type fuseki:Service ;
>>      fuseki:name                       "bdrcrw" ;     # name of the
>> dataset in the url
>>      fuseki:serviceQuery               "query" ;    # SPARQL query
>> service
>>      fuseki:serviceUpdate              "update" ;   # SPARQL update
>> service
>>      fuseki:serviceUpload              "upload" ;   # Non-SPARQL upload
>> service
>>      fuseki:serviceReadWriteGraphStore "data" ;     # SPARQL Graph store
>> protocol (read and write)
>>      fuseki:dataset                    :bdrc_text_dataset ;
>>      .
>>
>> # :bdrcro rdf:type fuseki:Service ;
>> #     fuseki:name                     "bdrc" ;
>> #     fuseki:serviceQuery             "query" ;
>> #     fuseki:serviceReadGraphStore    "data" ;
>> #     fuseki:dataset           		:bdrc_text_dataset ;
>> #     .
>>
>> # using TDB
>> :dataset_bdrc rdf:type      tdb:DatasetTDB ;
>>       tdb:location "/etc/fuseki/databases/bdrc" ;
>>       tdb:unionDefaultGraph true ;
>>       .
>>
>> # # try using TDB2
>> # :dataset_bdrc rdf:type      tdb2:DatasetTDB2 ;
>> #      tdb2:location "/etc/fuseki/databases/bdrc" ;
>> #      tdb2:unionDefaultGraph true ;
>> #   .
>>
>> :bdrc_text_dataset rdf:type     text:TextDataset ;
>>      text:dataset   :dataset_bdrc ;
>>      text:index     :bdrc_lucene_index ;
>>      .
>>
>> # Text index description
>> :bdrc_lucene_index a text:TextIndexLucene ;
>>      text:directory <file:/etc/fuseki/lucene-bdrc> ;
>>      text:storeValues true ;
>>      text:multilingualSupport true ;
>>      text:entityMap :bdrc_entmap ;
>>      text:defineAnalyzers (
>>          [ text:addLang "bo" ;
>>            text:analyzer [
>>              a text:GenericAnalyzer ;
>>              text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
>>              text:params (
>>                  [ text:paramName "segmentInWords" ;
>>                    text:paramType text:TypeBoolean ;
>>                    text:paramValue false ]
>>                  [ text:paramName "lemmatize" ;
>>                    text:paramType text:TypeBoolean ;
>>                    text:paramValue true ]
>>                  [ text:paramName "filterChars" ;
>>                    text:paramType text:TypeBoolean ;
>>                    text:paramValue false ]
>>                  [ text:paramName "fromEwts" ;
>>                    text:paramType text:TypeBoolean ;
>>                    text:paramValue false ]
>>                  )
>>              ] ;
>>            ]
>>          [ text:addLang "bo-x-ewts" ;
>>            text:analyzer [
>>              a text:GenericAnalyzer ;
>>              text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
>>              text:params (
>>                  [ text:paramName "segmentInWords" ;
>>                    text:paramType text:TypeBoolean ;
>>                    text:paramValue false ]
>>                  [ text:paramName "lemmatize" ;
>>                    text:paramType text:TypeBoolean ;
>>                    text:paramValue true ]
>>                  [ text:paramName "filterChars" ;
>>                    text:paramType text:TypeBoolean ;
>>                    text:paramValue false ]
>>                  [ text:paramName "fromEwts" ;
>>                    text:paramType text:TypeBoolean ;
>>                    text:paramValue true ]
>>                  )
>>              ] ;
>>            ]
>>        ) ;
>>      .
>>
>> # Index mappings
>> :bdrc_entmap a text:EntityMap ;
>>      text:entityField      "uri" ;
>>      text:uidField         "uid" ;
>>      text:defaultField     "label" ;
>>      text:langField        "lang" ;
>>      text:graphField       "graph" ; ## enable graph-specific indexing
>>      text:map (
>>           [ text:field "label" ;
>>             text:predicate skos:prefLabel ]
>>           [ text:field "altLabel" ;
>>             text:predicate skos:altLabel ; ]
>>           [ text:field "rdfsLabel" ;
>>             text:predicate rdfs:label ; ]
>>           [ text:field "chunkContents" ;
>>             text:predicate bdo:chunkContents ; ]
>>           [ text:field "eTextTitle" ;
>>             text:predicate bdo:eTextTitle ; ]
>>           [ text:field "logMessage" ;
>>             text:predicate adm:logMessage ; ]
>>           [ text:field "noteText" ;
>>             text:predicate bdo:noteText ; ]
>>           [ text:field "workAuthorshipStatement" ;
>>             text:predicate bdo:workAuthorshipStatement ; ]
>>           [ text:field "workColophon" ;
>>             text:predicate bdo:workColophon ; ]
>>           [ text:field "workEditionStatement" ;
>>             text:predicate bdo:workEditionStatement ; ]
>>           [ text:field "workPublisherLocation" ;
>>             text:predicate bdo:workPublisherLocation ; ]
>>           [ text:field "workPublisherName" ;
>>             text:predicate bdo:workPublisherName ; ]
>>           [ text:field "workSeriesName" ;
>>             text:predicate bdo:workSeriesName ; ]
>>           ) ;
>>      .
>> ###################################################################
>>
>> It would be wonderful to have the java:URI scheme thing working (all
>> custom functions in a single class and method calls done directly in
>> the sparql query : sounds like a dream !)
>>
>> Marc
>>
>> Le mardi 26 décembre 2017 à 13:22 -0500, ajs6f a écrit :
>>> That exception doesn't appear to have anything to do with extension
>>> functions. It indicates a problem between client and server.
>>>
>>> Please show at _least_ your actual query execution code, your
>>> complete Fuseki config, and a complete stacktrace.
>>>
>>>
>>> ajs6f
>>>
>>>> On Dec 26, 2017, at 1:17 PM, Marc Agate <ag...@gmail.com>
>>>> wrote:
>>>>
>>>> I forgot to mention that according to
>>>> https://jena.apache.org/documentation/query/java-uri.html
>>>> I tried for testing purpose to set a PREFIX f:
>>>> <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>and
>>>> added
>>>> the following to fuseki config :
>>>> ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
>>>> .
>>>> where CustomARQFunctions is :
>>>> public class CustomARQFunctions {		public static
>>>> NodeValue myFilter(NodeValue value1){		        int i
>>>> =
>>>> value1.asString().length();         return
>>>> NodeValue.makeInteger(i);     }
>>>> }
>>>> since according to
>>>> https://jena.apache.org/documentation/query/writing_functions.html
>>>> using the java:URI scheme "dynamically loads the code, which must
>>>> be on
>>>> the Java classpath. With this scheme, the function URI gives the
>>>> class
>>>> name. There is automatic registration of a wrapper into the
>>>> function
>>>> registry. This way, no explicit registration step is needed by the
>>>> application and queries issues with the command line tools can load
>>>> custom functions."
>>>> but no luck: I keep getting the following exception:
>>>> Exception in thread "main" HttpException: 404	at
>>>> org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.java
>>>> :328
>>>> )	at
>>>> org.apache.jena.sparql.engine.http.HttpQuery.exec(HttpQuery.java:28
>>>> 8)	
>>>> at
>>>> org.apache.jena.sparql.engine.http.QueryEngineHTTP.execResultSetInn
>>>> er(Q
>>>> ueryEngineHTTP.java:348)	at
>>>> org.apache.jena.sparql.engine.http.QueryEngineHTTP.execSelect(Query
>>>> Engi
>>>> neHTTP.java:340)
>>>> I am stuck !
>>>> Marc
>>>> Le mardi 26 décembre 2017 à 18:56 +0100, Marc Agate a écrit :
>>>>> Hi,
>>>>> Adam's gave me the right direction.
>>>>> I managed to load my function class in fuseki config using
>>>>> ja:loadClass
>>>>> but now remains the following issue (the function is not
>>>>> registered)seefuseki logs :
>>>>> [2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdrc.io/f
>>>>> unct
>>>>> ions#MyFilterFunction> has no registered function factory
>>>>> How can I register this function now that I have the code
>>>>> available
>>>>> onthe endpoint side ?
>>>>> Thanks for helping
>>>>> Marc.
>>>>>
>>>>> Le mardi 26 décembre 2017 à 17:43 +0000, Andy Seaborne a écrit :
>>>>>> As well s Adam's point (and all the libraries your function
>>>>>> needs, transitively)
>>>>>> What is in the Fuseki log file?How was the data loaded into
>>>>>> Fuseki?
>>>>>>   >> I printed out the >> FunctionRegistry
>>>>>>       And
>>>>>> On 26/12/17 14:51, ajs6f wrote:
>>>>>>> I'm not as familiar with the extension points of ARQ as I
>>>>>>> wouldlike to be, but as I understand what you are doing, you
>>>>>>> areregistering a new function with your _local_ registry,
>>>>>>> then
>>>>>>> firinga query at a _remote_ endpoint (which has a completely
>>>>>>> independentregistry in a different JVM in a different
>>>>>>> process,
>>>>>>> potentially ina different _system_).
>>>>>>> The query is getting interpreted and executed by that
>>>>>>> remoteservice, not locally. So you need to register the
>>>>>>> function
>>>>>>> _there_.
>>>>>>> Take a look at this thread:
>>>>>>> https://lists.apache.org/thread.html/1cda23332af4264883e88697
>>>>>>> d994
>>>>>>> 605770edcde2f93ddea51240e4b8@%3Cusers.jena.apache.org%3E
>>>>>>> It should get you started as to how to register
>>>>>>> extensionfunctionality in Fuseki.
>>>>>>>
>>>>>>> Adam Soroka
>>>>>>>> On Dec 26, 2017, at 9:34 AM, Marc Agate <agate.marc@gmail.c
>>>>>>>> om>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Hi !
>>>>>>>>
>>>>>>>> I successfully implemented sparql queries using custom ARQ
>>>>>>>> functions
>>>>>>>> using the following (custom function code):
>>>>>>>>
>>>>>>>> ****************
>>>>>>>> public class LevenshteinFilter extends FunctionBase2 {
>>>>>>>>
>>>>>>>>       public LevenshteinFilter() { super() ; }
>>>>>>>>
>>>>>>>>       public NodeValue exec(NodeValue value1, NodeValue
>>>>>>>> value2){
>>>>>>>>           LevenshteinDistance LD=new LevenshteinDistance();
>>>>>>>>           int i = LD.apply(value1.asString(),
>>>>>>>> value2.asString());
>>>>>>>>           return NodeValue.makeInteger(i);
>>>>>>>>       }
>>>>>>>> }
>>>>>>>> ***************
>>>>>>>>
>>>>>>>> it works fine when I query against a Model loaded from a
>>>>>>>> turtle
>>>>>>>> file,
>>>>>>>> like this:
>>>>>>>>
>>>>>>>> ***************
>>>>>>>> InputStream input =
>>>>>>>> QueryProcessor.class.getClassLoader().getResourceAsStream("
>>>>>>>> full
>>>>>>>> .t
>>>>>>>> tl");
>>>>>>>>               model =
>>>>>>>> ModelFactory.createMemModelMaker().createModel("default");
>>>>>>>>               model.read(input,null,"TURTLE"); // null base
>>>>>>>> URI,
>>>>>>>> since
>>>>>>>> model URIs are absolute
>>>>>>>>               input.close();
>>>>>>>> ***************
>>>>>>>>
>>>>>>>> with the query being sent like this :
>>>>>>>>
>>>>>>>> ***************
>>>>>>>> String functionUri = "http://www.example1.org/LevenshteinFu
>>>>>>>> ncti
>>>>>>>> on
>>>>>>>> ";
>>>>>>>>           FunctionRegistry.get().put(functionUri ,
>>>>>>>> LevenshteinFilter.class);
>>>>>>>>
>>>>>>>>           String s = "whatever you want";
>>>>>>>>           String sparql = prefixes+" SELECT DISTINCT ?l
>>>>>>>> WHERE {
>>>>>>>> ?x
>>>>>>>> rdfs:label ?l . " +  "FILTER(fct:LevenshteinFunction(?l,
>>>>>>>> \"" +
>>>>>>>> s
>>>>>>>> + "\")
>>>>>>>> < 4) }";
>>>>>>>>           Query query = QueryFactory.create(sparql);
>>>>>>>>           QueryExecution qexec =
>>>>>>>> QueryExecutionFactory.create(query,
>>>>>>>> model);
>>>>>>>>           ResultSet rs = qexec.execSelect();
>>>>>>>> ***************
>>>>>>>>
>>>>>>>> However, if i use a working fuseki endpoint for the same
>>>>>>>> dataset
>>>>>>>> (full.ttl) like this :
>>>>>>>>
>>>>>>>> ***************
>>>>>>>> fusekiUrl="http://localhost:3030/ds/query";
>>>>>>>> ***************
>>>>>>>>
>>>>>>>> sending the query like this (using
>>>>>>>> QueryExecutionFactory.sparqlService(fusekiUrl,query)
>>>>>>>> instead of
>>>>>>>> QueryExecutionFactory.create(query,model) ):
>>>>>>>>
>>>>>>>> ***************
>>>>>>>> String functionUri = "http://www.example1.org/LevenshteinFu
>>>>>>>> ncti
>>>>>>>> on
>>>>>>>> ";
>>>>>>>>           FunctionRegistry.get().put(functionUri ,
>>>>>>>> LevenshteinFilter.class);
>>>>>>>>
>>>>>>>>           String s = "whatever you want";
>>>>>>>>           String sparql = prefixes+" SELECT DISTINCT ?l
>>>>>>>> WHERE {
>>>>>>>> ?x
>>>>>>>> rdfs:label ?l . " + "FILTER(fct:LevenshteinFunction(?l, \""
>>>>>>>> + s
>>>>>>>> +
>>>>>>>> "\")
>>>>>>>> < 4) }";
>>>>>>>>           Query query = QueryFactory.create(sparql);
>>>>>>>>           QueryExecution qexec =
>>>>>>>> QueryExecutionFactory.sparqlService(fusekiUrl,query);
>>>>>>>>           ResultSet rs = qexec.execSelect();
>>>>>>>> ***************
>>>>>>>>
>>>>>>>> Then I don't get any results back. In both cases I printed
>>>>>>>> out
>>>>>>>> the
>>>>>>>> FunctionRegistry and they contain exactly the same entries,
>>>>>>>> especially
>>>>>>>> :
>>>>>>>>
>>>>>>>> key=http://www.example1.org/LevenshteinFunction value:
>>>>>>>> org.apache.jena.sparql.function.FunctionFactoryAuto@5a45133
>>>>>>>> e
>>>>>>>>
>>>>>>>> Any clue ?
>>>>>>>>
>>>>>>>> Thanks
>>>
>>>
> 

Re: Custom ARQ function not working with fuseki endpoint

Posted by Marc Agate <ag...@gmail.com>.
Hi Adam,

Thanks! That was it (my mistake), and I think that except for the
ja:loadClass line, everything else is valid Turtle.

I used
[] ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .

and everything loads correctly now:

#####################################################################
26-Dec-2017 20:40:36.783 INFOS [localhost-startStop-1]
org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was
scanned for TLDs yet contained no TLDs. Enable debug logging for this
logger for a complete list of JARs that were scanned but no TLDs were
found in them. Skipping unneeded JARs during scanning can improve
startup time and JSP compilation time.
[2017-12-26 20:40:37] Config     INFO  FUSEKI_HOME=unset
[2017-12-26 20:40:37] Config     INFO  FUSEKI_BASE=/etc/fuseki
[2017-12-26 20:40:37] Config     INFO  Shiro file:
file:///etc/fuseki/shiro.ini
[2017-12-26 20:40:37] Config     INFO  Context path = /fuseki
[2017-12-26 20:40:37] Config     INFO  Configuration file:
/etc/fuseki/config.ttl
[2017-12-26 20:40:37] Config     INFO  Load configuration:
file:///etc/fuseki/configuration/bdrc.ttl
[2017-12-26 20:40:38] Config     INFO  Register: /bdrcrw
26-Dec-2017 20:40:38.393 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployWAR Deployment of web
application archive /usr/local/fuseki/tomcat/webapps/fuseki.war has
finished in 3 288 ms
26-Dec-2017 20:40:38.394 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Déploiement du
répertoire /usr/local/fuseki/tomcat/webapps/host-manager de
l'application web
26-Dec-2017 20:40:38.413 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Deployment of
web application directory /usr/local/fuseki/tomcat/webapps/host-manager 
has finished in 19 ms
26-Dec-2017 20:40:38.413 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Déploiement du
répertoire /usr/local/fuseki/tomcat/webapps/ROOT de l'application web
26-Dec-2017 20:40:38.425 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Deployment of
web application directory /usr/local/fuseki/tomcat/webapps/ROOT has
finished in 12 ms
26-Dec-2017 20:40:38.425 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Déploiement du
répertoire /usr/local/fuseki/tomcat/webapps/examples de l'application
web
26-Dec-2017 20:40:38.612 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Deployment of
web application directory /usr/local/fuseki/tomcat/webapps/examples has
finished in 186 ms
26-Dec-2017 20:40:38.612 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Déploiement du
répertoire /usr/local/fuseki/tomcat/webapps/manager de l'application
web
26-Dec-2017 20:40:38.626 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Deployment of
web application directory /usr/local/fuseki/tomcat/webapps/manager has
finished in 14 ms
26-Dec-2017 20:40:38.626 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Déploiement du
répertoire /usr/local/fuseki/tomcat/webapps/docs de l'application web
26-Dec-2017 20:40:38.637 INFOS [localhost-startStop-1]
org.apache.catalina.startup.HostConfig.deployDirectory Deployment of
web application directory /usr/local/fuseki/tomcat/webapps/docs has
finished in 11 ms
26-Dec-2017 20:40:38.643 INFOS [main]
org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler
["http-nio-13180"]
26-Dec-2017 20:40:38.659 INFOS [main]
org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler
["ajp-nio-13109"]
26-Dec-2017 20:40:38.666 INFOS [main]
org.apache.catalina.startup.Catalina.start Server startup in 3616 ms
########################################################"""

All exceptions are gone, but the query still doesn't return anything,
and I am not sure the custom function is being applied.
I am going to look into that now...
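
One way to check whether the custom function is actually being applied (again only a sketch, reusing the prefix from the query above; the endpoint URL is the same guess as elsewhere in this thread, and the FunctionProbe wrapper class is made up for the example) is to project the function value instead of filtering on it. If the server cannot resolve the function, ?len should come back unbound, or the query may fail, and the Fuseki log should show the "has no registered function factory" warning quoted earlier in the thread.

import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QuerySolution;
import org.apache.jena.query.ResultSet;

// Illustrative wrapper class for the sketch.
public class FunctionProbe {
    public static void main(String[] args) {
        // Assumed endpoint URL; adjust to the actual deployment.
        String fusekiUrl = "http://localhost:13180/fuseki/bdrcrw/query";

        // Bind the function result to ?len so its value (or absence) is visible in the results.
        String sparql =
            "PREFIX skos: <http://www.w3.org/2004/02/skos/core#> " +
            "PREFIX f: <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.> " +
            "SELECT ?l (f:myFilter(?l) AS ?len) WHERE { ?x skos:prefLabel ?l } LIMIT 10";

        try (QueryExecution qexec = QueryExecutionFactory.sparqlService(fusekiUrl, sparql)) {
            ResultSet rs = qexec.execSelect();
            while (rs.hasNext()) {
                QuerySolution row = rs.next();
                System.out.println(row.get("l") + " -> " + row.get("len"));
            }
        }
    }
}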


Marc


On Tuesday, December 26, 2017 at 14:00 -0500, ajs6f wrote:
> I don't understand how that config is getting parsed at all. It's not
> valid Turtle.
> 
> > ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
> > .
> 
> is not a triple at all. It should probably be:
> 
> [] ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
> .
> 
> Riot gives "Expected IRI for predicate: got:
> [STRING:io.bdrc.ldsearch.query.functions.CustomARQFunctions]"
> 
> It's not clear to me how you could be successfully loading your
> extension function with invalid config.
> 
> Adam Soroka
> 
> 
> > On Dec 26, 2017, at 1:42 PM, Marc Agate <ag...@gmail.com>
> > wrote:
> > 
> > Well...
> > 
> > Here is the query
> > 
> > PREFIX : <http://purl.bdrc.io/ontology/core/>  
> > PREFIX adm: <http://purl.bdrc.io/ontology/admin/>  
> > PREFIX bdr: <http://purl.bdrc.io/resource/>  
> > PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> 
> > PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> 
> > PREFIX skos: <http://www.w3.org/2004/02/skos/core#>  
> > PREFIX tbr: <http://purl.bdrc.io/ontology/toberemoved/> 
> > PREFIX xsd: <http://www.w3.org/2001/XMLSchema#> 
> > PREFIX f:
> > <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
> > 
> > SELECT DISTINCT ?l 
> > WHERE { 
> > ?x skos:prefLabel ?l .  
> > FILTER (f:myFilter(?l) < 100) 
> > }
> > 
> > Note that whene I change FILTER (f:myFilter(?l) < 100)  to FILTER
> > (STRLEN(?l) < 100), I don't get the 404 exception...
> > Therefore it's not a connection issue. I thing it's more like a
> > unresolved URI or so.
> > 
> > Now, since you'are asking for it, here is the full fuseki config:
> > 
> > ################################################################
> > # Fuseki configuration for BDRC, configures two endpoints:
> > #   - /bdrc is read-only
> > #   - /bdrcrw is read-write
> > #
> > # This was painful to come up with but the web interface basically
> > allows no option
> > # and there is no subclass inference by default so such a
> > configuration
> > file is necessary.
> > #
> > # The main doc sources are:
> > #  - https://jena.apache.org/documentation/fuseki2/fuseki-configura
> > tion
> > .html
> > #  - https://jena.apache.org/documentation/assembler/assembler-howt
> > o.ht
> > ml
> > #  - https://jena.apache.org/documentation/assembler/assembler.ttl
> > #
> > # See https://jena.apache.org/documentation/fuseki2/fuseki-layout.h
> > tml
> > for the destination of this file.
> > 
> > @prefix fuseki:  <http://jena.apache.org/fuseki#> .
> > @prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
> > @prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
> > @prefix tdb:     <http://jena.hpl.hp.com/2008/tdb#> .
> > # @prefix tdb2:    <http://jena.apache.org/2016/tdb#> .
> > @prefix ja:      <http://jena.hpl.hp.com/2005/11/Assembler#> .
> > @prefix :        <http://base/#> .
> > @prefix text:    <http://jena.apache.org/text#> .
> > @prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
> > @prefix adm:     <http://purl.bdrc.io/ontology/admin/> .
> > @prefix bdd:     <http://purl.bdrc.io/data/> .
> > @prefix bdo:     <http://purl.bdrc.io/ontology/core/> .
> > @prefix bdr:     <http://purl.bdrc.io/resource/> .
> > @prefix f:
> > <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
> > .
> > 
> > ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
> > .
> > # [] ja:loadClass "org.seaborne.tdb2.TDB2" .
> > # tdb2:DatasetTDB2  rdfs:subClassOf  ja:RDFDataset .
> > # tdb2:GraphTDB2    rdfs:subClassOf  ja:Model .
> > 
> > [] rdf:type fuseki:Server ;
> >    fuseki:services (
> >      :bdrcrw
> > #      :bdrcro
> >    ) .
> > 
> > :bdrcrw rdf:type fuseki:Service ;
> >     fuseki:name                       "bdrcrw" ;     # name of the
> > dataset in the url
> >     fuseki:serviceQuery               "query" ;    # SPARQL query
> > service
> >     fuseki:serviceUpdate              "update" ;   # SPARQL update
> > service
> >     fuseki:serviceUpload              "upload" ;   # Non-SPARQL
> > upload
> > service
> >     fuseki:serviceReadWriteGraphStore "data" ;     # SPARQL Graph
> > store
> > protocol (read and write)
> >     fuseki:dataset                    :bdrc_text_dataset ;
> >     .
> > 
> > # :bdrcro rdf:type fuseki:Service ;
> > #     fuseki:name                     "bdrc" ;
> > #     fuseki:serviceQuery             "query" ;
> > #     fuseki:serviceReadGraphStore    "data" ;
> > #     fuseki:dataset           		:bdrc_text_dataset ;
> > #     .
> > 
> > # using TDB
> > :dataset_bdrc rdf:type      tdb:DatasetTDB ;
> >      tdb:location "/etc/fuseki/databases/bdrc" ;
> >      tdb:unionDefaultGraph true ;
> >      .
> > 
> > # # try using TDB2
> > # :dataset_bdrc rdf:type      tdb2:DatasetTDB2 ;
> > #      tdb2:location "/etc/fuseki/databases/bdrc" ;
> > #      tdb2:unionDefaultGraph true ;
> > #   .
> > 
> > :bdrc_text_dataset rdf:type     text:TextDataset ;
> >     text:dataset   :dataset_bdrc ;
> >     text:index     :bdrc_lucene_index ;
> >     .
> > 
> > # Text index description
> > :bdrc_lucene_index a text:TextIndexLucene ;
> >     text:directory <file:/etc/fuseki/lucene-bdrc> ;
> >     text:storeValues true ;
> >     text:multilingualSupport true ;
> >     text:entityMap :bdrc_entmap ;
> >     text:defineAnalyzers (
> >         [ text:addLang "bo" ; 
> >           text:analyzer [ 
> >             a text:GenericAnalyzer ;
> >             text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
> >             text:params (
> >                 [ text:paramName "segmentInWords" ;
> >                   text:paramType text:TypeBoolean ; 
> >                   text:paramValue false ]
> >                 [ text:paramName "lemmatize" ;
> >                   text:paramType text:TypeBoolean ;
> >                   text:paramValue true ]
> >                 [ text:paramName "filterChars" ;
> >                   text:paramType text:TypeBoolean ;
> >                   text:paramValue false ]
> >                 [ text:paramName "fromEwts" ;
> >                   text:paramType text:TypeBoolean ;
> >                   text:paramValue false ]
> >                 )
> >             ] ; 
> >           ]
> >         [ text:addLang "bo-x-ewts" ; 
> >           text:analyzer [ 
> >             a text:GenericAnalyzer ;
> >             text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
> >             text:params (
> >                 [ text:paramName "segmentInWords" ;
> >                   text:paramType text:TypeBoolean ; 
> >                   text:paramValue false ]
> >                 [ text:paramName "lemmatize" ;
> >                   text:paramType text:TypeBoolean ;
> >                   text:paramValue true ]
> >                 [ text:paramName "filterChars" ;
> >                   text:paramType text:TypeBoolean ;
> >                   text:paramValue false ]
> >                 [ text:paramName "fromEwts" ;
> >                   text:paramType text:TypeBoolean ;
> >                   text:paramValue true ]
> >                 )
> >             ] ; 
> >           ]
> >       ) ;
> >     .
> > 
> > # Index mappings
> > :bdrc_entmap a text:EntityMap ;
> >     text:entityField      "uri" ;
> >     text:uidField         "uid" ;
> >     text:defaultField     "label" ;
> >     text:langField        "lang" ;
> >     text:graphField       "graph" ; ## enable graph-specific
> > indexing
> >     text:map (
> >          [ text:field "label" ; 
> >            text:predicate skos:prefLabel ]
> >          [ text:field "altLabel" ; 
> >            text:predicate skos:altLabel ; ]
> >          [ text:field "rdfsLabel" ;
> >            text:predicate rdfs:label ; ]
> >          [ text:field "chunkContents" ;
> >            text:predicate bdo:chunkContents ; ]
> >          [ text:field "eTextTitle" ;
> >            text:predicate bdo:eTextTitle ; ]
> >          [ text:field "logMessage" ;
> >            text:predicate adm:logMessage ; ]
> >          [ text:field "noteText" ;
> >            text:predicate bdo:noteText ; ]
> >          [ text:field "workAuthorshipStatement" ;
> >            text:predicate bdo:workAuthorshipStatement ; ]
> >          [ text:field "workColophon" ; 
> >            text:predicate bdo:workColophon ; ]
> >          [ text:field "workEditionStatement" ;
> >            text:predicate bdo:workEditionStatement ; ]
> >          [ text:field "workPublisherLocation" ;
> >            text:predicate bdo:workPublisherLocation ; ]
> >          [ text:field "workPublisherName" ;
> >            text:predicate bdo:workPublisherName ; ]
> >          [ text:field "workSeriesName" ;
> >            text:predicate bdo:workSeriesName ; ]
> >          ) ;
> >     .
> > ###################################################################
> > 
> > It would be wonderful to have the java:URI scheme thing working
> > (all
> > custom functions in a single class and method calls done directly
> > in
> > the sparql query : sounds like a dream !)
> > 
> > Marc
> > 
> > Le mardi 26 décembre 2017 à 13:22 -0500, ajs6f a écrit :
> > > That exception doesn't appear to have anything to do with
> > > extension
> > > functions. It indicates a problem between client and server.
> > > 
> > > Please show at _least_ your actual query execution code, your
> > > complete Fuseki config, and a complete stacktrace.
> > > 
> > > 
> > > ajs6f
> > > 
> > > > On Dec 26, 2017, at 1:17 PM, Marc Agate <ag...@gmail.com>
> > > > wrote:
> > > > 
> > > > I forgot to mention that according to 
> > > > https://jena.apache.org/documentation/query/java-uri.html
> > > > I tried for testing purpose to set a PREFIX f:
> > > > <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>and
> > > > added
> > > > the following to fuseki config : 
> > > > ja:loadClass
> > > > "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
> > > > .
> > > > where CustomARQFunctions is :
> > > > public class CustomARQFunctions {		public static
> > > > NodeValue myFilter(NodeValue value1){		        in
> > > > t i
> > > > =
> > > > value1.asString().length();         return
> > > > NodeValue.makeInteger(i);     }
> > > > }
> > > > since according to 
> > > > https://jena.apache.org/documentation/query/writing_functions.h
> > > > tml
> > > > using the java:URI scheme "dynamically loads the code, which
> > > > must
> > > > be on
> > > > the Java classpath. With this scheme, the function URI gives
> > > > the
> > > > class
> > > > name. There is automatic registration of a wrapper into the
> > > > function
> > > > registry. This way, no explicit registration step is needed by
> > > > the
> > > > application and queries issues with the command line tools can
> > > > load
> > > > custom functions."
> > > > but no luck: I keep getting the following exception:
> > > > Exception in thread "main" HttpException: 404	at
> > > > org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.
> > > > java
> > > > :328
> > > > )	at
> > > > org.apache.jena.sparql.engine.http.HttpQuery.exec(HttpQuery.jav
> > > > a:28
> > > > 8)	
> > > > at
> > > > org.apache.jena.sparql.engine.http.QueryEngineHTTP.execResultSe
> > > > tInn
> > > > er(Q
> > > > ueryEngineHTTP.java:348)	at
> > > > org.apache.jena.sparql.engine.http.QueryEngineHTTP.execSelect(Q
> > > > uery
> > > > Engi
> > > > neHTTP.java:340)
> > > > I am stuck !
> > > > Marc
> > > > Le mardi 26 décembre 2017 à 18:56 +0100, Marc Agate a écrit :
> > > > > Hi,
> > > > > Adam's gave me the right direction.
> > > > > I managed to load my function class in fuseki config using
> > > > > ja:loadClass
> > > > > but now remains the following issue (the function is not
> > > > > registered)seefuseki logs :
> > > > > [2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdrc.
> > > > > io/f
> > > > > unct
> > > > > ions#MyFilterFunction> has no registered function factory
> > > > > How can I register this function now that I have the code
> > > > > available
> > > > > onthe endpoint side ?
> > > > > Thanks for helping
> > > > > Marc.
> > > > > 
> > > > > Le mardi 26 décembre 2017 à 17:43 +0000, Andy Seaborne a
> > > > > écrit :
> > > > > > As well s Adam's point (and all the libraries your function
> > > > > > needs, transitively)
> > > > > > What is in the Fuseki log file?How was the data loaded into
> > > > > > Fuseki?
> > > > > >  >> I printed out the >> FunctionRegistry
> > > > > >      And
> > > > > > On 26/12/17 14:51, ajs6f wrote:
> > > > > > > I'm not as familiar with the extension points of ARQ as I
> > > > > > > wouldlike to be, but as I understand what you are doing,
> > > > > > > you
> > > > > > > areregistering a new function with your _local_ registry,
> > > > > > > then
> > > > > > > firinga query at a _remote_ endpoint (which has a
> > > > > > > completely
> > > > > > > independentregistry in a different JVM in a different
> > > > > > > process,
> > > > > > > potentially ina different _system_).
> > > > > > > The query is getting interpreted and executed by that
> > > > > > > remoteservice, not locally. So you need to register the
> > > > > > > function
> > > > > > > _there_.
> > > > > > > Take a look at this thread:
> > > > > > > https://lists.apache.org/thread.html/1cda23332af4264883e8
> > > > > > > 8697
> > > > > > > d994
> > > > > > > 605770edcde2f93ddea51240e4b8@%3Cusers.jena.apache.org%3E
> > > > > > > It should get you started as to how to register
> > > > > > > extensionfunctionality in Fuseki.
> > > > > > > 
> > > > > > > Adam Soroka
> > > > > > > > On Dec 26, 2017, at 9:34 AM, Marc Agate <agate.marc@gma
> > > > > > > > il.c
> > > > > > > > om>
> > > > > > > > wrote:
> > > > > > > > 
> > > > > > > > Hi !
> > > > > > > > 
> > > > > > > > I successfully implemented sparql queries using custom
> > > > > > > > ARQ
> > > > > > > > functions
> > > > > > > > using the following (custom function code):
> > > > > > > > 
> > > > > > > > ****************
> > > > > > > > public class LevenshteinFilter extends FunctionBase2 {
> > > > > > > > 
> > > > > > > >      public LevenshteinFilter() { super() ; }
> > > > > > > > 
> > > > > > > >      public NodeValue exec(NodeValue value1, NodeValue
> > > > > > > > value2){
> > > > > > > >          LevenshteinDistance LD=new
> > > > > > > > LevenshteinDistance();
> > > > > > > >          int i = LD.apply(value1.asString(),
> > > > > > > > value2.asString());
> > > > > > > >          return NodeValue.makeInteger(i);
> > > > > > > >      }
> > > > > > > > }
> > > > > > > > ***************
> > > > > > > > 
> > > > > > > > it works fine when I query against a Model loaded from
> > > > > > > > a
> > > > > > > > turtle
> > > > > > > > file,
> > > > > > > > like this:
> > > > > > > > 
> > > > > > > > ***************
> > > > > > > > InputStream input =
> > > > > > > > QueryProcessor.class.getClassLoader().getResourceAsStre
> > > > > > > > am("
> > > > > > > > full
> > > > > > > > .t
> > > > > > > > tl");
> > > > > > > >              model =
> > > > > > > > ModelFactory.createMemModelMaker().createModel("default
> > > > > > > > ");
> > > > > > > >              model.read(input,null,"TURTLE"); // null
> > > > > > > > base
> > > > > > > > URI,
> > > > > > > > since
> > > > > > > > model URIs are absolute
> > > > > > > >              input.close();
> > > > > > > > ***************
> > > > > > > > 
> > > > > > > > with the query being sent like this :
> > > > > > > > 
> > > > > > > > ***************
> > > > > > > > String functionUri = "http://www.example1.org/Levenshte
> > > > > > > > inFu
> > > > > > > > ncti
> > > > > > > > on
> > > > > > > > ";
> > > > > > > >          FunctionRegistry.get().put(functionUri ,
> > > > > > > > LevenshteinFilter.class);
> > > > > > > > 
> > > > > > > >          String s = "whatever you want";
> > > > > > > >          String sparql = prefixes+" SELECT DISTINCT ?l
> > > > > > > > WHERE {
> > > > > > > > ?x
> > > > > > > > rdfs:label ?l . "
> > > > > > > > +  "FILTER(fct:LevenshteinFunction(?l,
> > > > > > > > \"" +
> > > > > > > > s
> > > > > > > > + "\")
> > > > > > > > < 4) }";
> > > > > > > >          Query query = QueryFactory.create(sparql);
> > > > > > > >          QueryExecution qexec =
> > > > > > > > QueryExecutionFactory.create(query,
> > > > > > > > model);
> > > > > > > >          ResultSet rs = qexec.execSelect();
> > > > > > > > ***************
> > > > > > > > 
> > > > > > > > However, if i use a working fuseki endpoint for the
> > > > > > > > same
> > > > > > > > dataset
> > > > > > > > (full.ttl) like this :
> > > > > > > > 
> > > > > > > > ***************
> > > > > > > > fusekiUrl="http://localhost:3030/ds/query";
> > > > > > > > ***************
> > > > > > > > 
> > > > > > > > sending the query like this (using
> > > > > > > > QueryExecutionFactory.sparqlService(fusekiUrl,query)
> > > > > > > > instead of
> > > > > > > > QueryExecutionFactory.create(query,model) ):
> > > > > > > > 
> > > > > > > > ***************
> > > > > > > > String functionUri = "http://www.example1.org/Levenshte
> > > > > > > > inFu
> > > > > > > > ncti
> > > > > > > > on
> > > > > > > > ";
> > > > > > > >          FunctionRegistry.get().put(functionUri ,
> > > > > > > > LevenshteinFilter.class);
> > > > > > > > 
> > > > > > > >          String s = "whatever you want";
> > > > > > > >          String sparql = prefixes+" SELECT DISTINCT ?l
> > > > > > > > WHERE {
> > > > > > > > ?x
> > > > > > > > rdfs:label ?l . " + "FILTER(fct:LevenshteinFunction(?l,
> > > > > > > > \""
> > > > > > > > + s
> > > > > > > > +
> > > > > > > > "\")
> > > > > > > > < 4) }";
> > > > > > > >          Query query = QueryFactory.create(sparql);
> > > > > > > >          QueryExecution qexec =
> > > > > > > > QueryExecutionFactory.sparqlService(fusekiUrl,query);
> > > > > > > >          ResultSet rs = qexec.execSelect();
> > > > > > > > ***************
> > > > > > > > 
> > > > > > > > Then I don't get any results back. In both cases I
> > > > > > > > printed
> > > > > > > > out
> > > > > > > > the
> > > > > > > > FunctionRegistry and they contain exactly the same
> > > > > > > > entries,
> > > > > > > > especially
> > > > > > > > :
> > > > > > > > 
> > > > > > > > key=http://www.example1.org/LevenshteinFunction value:
> > > > > > > > org.apache.jena.sparql.function.FunctionFactoryAuto@5a4
> > > > > > > > 5133
> > > > > > > > e
> > > > > > > > 
> > > > > > > > Any clue ?
> > > > > > > > 
> > > > > > > > Thanks
> > > 
> > > 
> 
> 

Re: Custom ARQ function not working with fuseki endpoint

Posted by ajs6f <aj...@apache.org>.
I don't understand how that config is getting parsed at all. It's not valid Turtle.

> ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .

is not a triple at all. It should probably be:

[] ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .

Riot gives "Expected IRI for predicate: got: [STRING:io.bdrc.ldsearch.query.functions.CustomARQFunctions]"

It's not clear to me how you could be successfully loading your extension function with invalid config.
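
You can also check the file yourself with riot, the command-line parser that
ships with Jena, before handing it to Fuseki (the path below is taken from the
log posted earlier in this thread):

    riot --validate /etc/fuseki/configuration/bdrc.ttl

It should report the same kind of parse error that Fuseki would hit.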

Adam Soroka


> On Dec 26, 2017, at 1:42 PM, Marc Agate <ag...@gmail.com> wrote:
> 
> Well...
> 
> Here is the query
> 
> PREFIX : <http://purl.bdrc.io/ontology/core/>  
> PREFIX adm: <http://purl.bdrc.io/ontology/admin/>  
> PREFIX bdr: <http://purl.bdrc.io/resource/>  
> PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> 
> PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> 
> PREFIX skos: <http://www.w3.org/2004/02/skos/core#>  
> PREFIX tbr: <http://purl.bdrc.io/ontology/toberemoved/> 
> PREFIX xsd: <http://www.w3.org/2001/XMLSchema#> 
> PREFIX f: <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
> 
> SELECT DISTINCT ?l 
> WHERE { 
> ?x skos:prefLabel ?l .  
> FILTER (f:myFilter(?l) < 100) 
> }
> 
> Note that whene I change FILTER (f:myFilter(?l) < 100)  to FILTER
> (STRLEN(?l) < 100), I don't get the 404 exception...
> Therefore it's not a connection issue. I thing it's more like a
> unresolved URI or so.
> 
> Now, since you'are asking for it, here is the full fuseki config:
> 
> ################################################################
> # Fuseki configuration for BDRC, configures two endpoints:
> #   - /bdrc is read-only
> #   - /bdrcrw is read-write
> #
> # This was painful to come up with but the web interface basically
> allows no option
> # and there is no subclass inference by default so such a configuration
> file is necessary.
> #
> # The main doc sources are:
> #  - https://jena.apache.org/documentation/fuseki2/fuseki-configuration
> .html
> #  - https://jena.apache.org/documentation/assembler/assembler-howto.ht
> ml
> #  - https://jena.apache.org/documentation/assembler/assembler.ttl
> #
> # See https://jena.apache.org/documentation/fuseki2/fuseki-layout.html
> for the destination of this file.
> 
> @prefix fuseki:  <http://jena.apache.org/fuseki#> .
> @prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
> @prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
> @prefix tdb:     <http://jena.hpl.hp.com/2008/tdb#> .
> # @prefix tdb2:    <http://jena.apache.org/2016/tdb#> .
> @prefix ja:      <http://jena.hpl.hp.com/2005/11/Assembler#> .
> @prefix :        <http://base/#> .
> @prefix text:    <http://jena.apache.org/text#> .
> @prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
> @prefix adm:     <http://purl.bdrc.io/ontology/admin/> .
> @prefix bdd:     <http://purl.bdrc.io/data/> .
> @prefix bdo:     <http://purl.bdrc.io/ontology/core/> .
> @prefix bdr:     <http://purl.bdrc.io/resource/> .
> @prefix f: <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
> .
> 
> ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
> # [] ja:loadClass "org.seaborne.tdb2.TDB2" .
> # tdb2:DatasetTDB2  rdfs:subClassOf  ja:RDFDataset .
> # tdb2:GraphTDB2    rdfs:subClassOf  ja:Model .
> 
> [] rdf:type fuseki:Server ;
>    fuseki:services (
>      :bdrcrw
> #      :bdrcro
>    ) .
> 
> :bdrcrw rdf:type fuseki:Service ;
>     fuseki:name                       "bdrcrw" ;     # name of the
> dataset in the url
>     fuseki:serviceQuery               "query" ;    # SPARQL query
> service
>     fuseki:serviceUpdate              "update" ;   # SPARQL update
> service
>     fuseki:serviceUpload              "upload" ;   # Non-SPARQL upload
> service
>     fuseki:serviceReadWriteGraphStore "data" ;     # SPARQL Graph store
> protocol (read and write)
>     fuseki:dataset                    :bdrc_text_dataset ;
>     .
> 
> # :bdrcro rdf:type fuseki:Service ;
> #     fuseki:name                     "bdrc" ;
> #     fuseki:serviceQuery             "query" ;
> #     fuseki:serviceReadGraphStore    "data" ;
> #     fuseki:dataset           		:bdrc_text_dataset ;
> #     .
> 
> # using TDB
> :dataset_bdrc rdf:type      tdb:DatasetTDB ;
>      tdb:location "/etc/fuseki/databases/bdrc" ;
>      tdb:unionDefaultGraph true ;
>      .
> 
> # # try using TDB2
> # :dataset_bdrc rdf:type      tdb2:DatasetTDB2 ;
> #      tdb2:location "/etc/fuseki/databases/bdrc" ;
> #      tdb2:unionDefaultGraph true ;
> #   .
> 
> :bdrc_text_dataset rdf:type     text:TextDataset ;
>     text:dataset   :dataset_bdrc ;
>     text:index     :bdrc_lucene_index ;
>     .
> 
> # Text index description
> :bdrc_lucene_index a text:TextIndexLucene ;
>     text:directory <file:/etc/fuseki/lucene-bdrc> ;
>     text:storeValues true ;
>     text:multilingualSupport true ;
>     text:entityMap :bdrc_entmap ;
>     text:defineAnalyzers (
>         [ text:addLang "bo" ; 
>           text:analyzer [ 
>             a text:GenericAnalyzer ;
>             text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
>             text:params (
>                 [ text:paramName "segmentInWords" ;
>                   text:paramType text:TypeBoolean ; 
>                   text:paramValue false ]
>                 [ text:paramName "lemmatize" ;
>                   text:paramType text:TypeBoolean ;
>                   text:paramValue true ]
>                 [ text:paramName "filterChars" ;
>                   text:paramType text:TypeBoolean ;
>                   text:paramValue false ]
>                 [ text:paramName "fromEwts" ;
>                   text:paramType text:TypeBoolean ;
>                   text:paramValue false ]
>                 )
>             ] ; 
>           ]
>         [ text:addLang "bo-x-ewts" ; 
>           text:analyzer [ 
>             a text:GenericAnalyzer ;
>             text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
>             text:params (
>                 [ text:paramName "segmentInWords" ;
>                   text:paramType text:TypeBoolean ; 
>                   text:paramValue false ]
>                 [ text:paramName "lemmatize" ;
>                   text:paramType text:TypeBoolean ;
>                   text:paramValue true ]
>                 [ text:paramName "filterChars" ;
>                   text:paramType text:TypeBoolean ;
>                   text:paramValue false ]
>                 [ text:paramName "fromEwts" ;
>                   text:paramType text:TypeBoolean ;
>                   text:paramValue true ]
>                 )
>             ] ; 
>           ]
>       ) ;
>     .
> 
> # Index mappings
> :bdrc_entmap a text:EntityMap ;
>     text:entityField      "uri" ;
>     text:uidField         "uid" ;
>     text:defaultField     "label" ;
>     text:langField        "lang" ;
>     text:graphField       "graph" ; ## enable graph-specific indexing
>     text:map (
>          [ text:field "label" ; 
>            text:predicate skos:prefLabel ]
>          [ text:field "altLabel" ; 
>            text:predicate skos:altLabel ; ]
>          [ text:field "rdfsLabel" ;
>            text:predicate rdfs:label ; ]
>          [ text:field "chunkContents" ;
>            text:predicate bdo:chunkContents ; ]
>          [ text:field "eTextTitle" ;
>            text:predicate bdo:eTextTitle ; ]
>          [ text:field "logMessage" ;
>            text:predicate adm:logMessage ; ]
>          [ text:field "noteText" ;
>            text:predicate bdo:noteText ; ]
>          [ text:field "workAuthorshipStatement" ;
>            text:predicate bdo:workAuthorshipStatement ; ]
>          [ text:field "workColophon" ; 
>            text:predicate bdo:workColophon ; ]
>          [ text:field "workEditionStatement" ;
>            text:predicate bdo:workEditionStatement ; ]
>          [ text:field "workPublisherLocation" ;
>            text:predicate bdo:workPublisherLocation ; ]
>          [ text:field "workPublisherName" ;
>            text:predicate bdo:workPublisherName ; ]
>          [ text:field "workSeriesName" ;
>            text:predicate bdo:workSeriesName ; ]
>          ) ;
>     .
> ###################################################################
> 
> It would be wonderful to have the java:URI scheme thing working (all
> custom functions in a single class and method calls done directly in
> the sparql query : sounds like a dream !)
> 
> Marc
> 
> Le mardi 26 décembre 2017 à 13:22 -0500, ajs6f a écrit :
>> That exception doesn't appear to have anything to do with extension
>> functions. It indicates a problem between client and server.
>> 
>> Please show at _least_ your actual query execution code, your
>> complete Fuseki config, and a complete stacktrace.
>> 
>> 
>> ajs6f
>> 
>>> On Dec 26, 2017, at 1:17 PM, Marc Agate <ag...@gmail.com>
>>> wrote:
>>> 
>>> I forgot to mention that according to 
>>> https://jena.apache.org/documentation/query/java-uri.html
>>> I tried for testing purpose to set a PREFIX f:
>>> <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>and
>>> added
>>> the following to fuseki config : 
>>> ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
>>> .
>>> where CustomARQFunctions is :
>>> public class CustomARQFunctions {		public static
>>> NodeValue myFilter(NodeValue value1){		        int i
>>> =
>>> value1.asString().length();         return
>>> NodeValue.makeInteger(i);     }
>>> }
>>> since according to 
>>> https://jena.apache.org/documentation/query/writing_functions.html
>>> using the java:URI scheme "dynamically loads the code, which must
>>> be on
>>> the Java classpath. With this scheme, the function URI gives the
>>> class
>>> name. There is automatic registration of a wrapper into the
>>> function
>>> registry. This way, no explicit registration step is needed by the
>>> application and queries issues with the command line tools can load
>>> custom functions."
>>> but no luck: I keep getting the following exception:
>>> Exception in thread "main" HttpException: 404	at
>>> org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.java
>>> :328
>>> )	at
>>> org.apache.jena.sparql.engine.http.HttpQuery.exec(HttpQuery.java:28
>>> 8)	
>>> at
>>> org.apache.jena.sparql.engine.http.QueryEngineHTTP.execResultSetInn
>>> er(Q
>>> ueryEngineHTTP.java:348)	at
>>> org.apache.jena.sparql.engine.http.QueryEngineHTTP.execSelect(Query
>>> Engi
>>> neHTTP.java:340)
>>> I am stuck !
>>> Marc
>>> Le mardi 26 décembre 2017 à 18:56 +0100, Marc Agate a écrit :
>>>> Hi,
>>>> Adam's gave me the right direction.
>>>> I managed to load my function class in fuseki config using
>>>> ja:loadClass
>>>> but now remains the following issue (the function is not
>>>> registered)seefuseki logs :
>>>> [2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdrc.io/f
>>>> unct
>>>> ions#MyFilterFunction> has no registered function factory
>>>> How can I register this function now that I have the code
>>>> available
>>>> onthe endpoint side ?
>>>> Thanks for helping
>>>> Marc.
>>>> 
>>>> Le mardi 26 décembre 2017 à 17:43 +0000, Andy Seaborne a écrit :
>>>>> As well s Adam's point (and all the libraries your function
>>>>> needs, transitively)
>>>>> What is in the Fuseki log file?How was the data loaded into
>>>>> Fuseki?
>>>>>  >> I printed out the >> FunctionRegistry
>>>>>      And
>>>>> On 26/12/17 14:51, ajs6f wrote:
>>>>>> I'm not as familiar with the extension points of ARQ as I
>>>>>> wouldlike to be, but as I understand what you are doing, you
>>>>>> areregistering a new function with your _local_ registry,
>>>>>> then
>>>>>> firinga query at a _remote_ endpoint (which has a completely
>>>>>> independentregistry in a different JVM in a different
>>>>>> process,
>>>>>> potentially ina different _system_).
>>>>>> The query is getting interpreted and executed by that
>>>>>> remoteservice, not locally. So you need to register the
>>>>>> function
>>>>>> _there_.
>>>>>> Take a look at this thread:
>>>>>> https://lists.apache.org/thread.html/1cda23332af4264883e88697
>>>>>> d994
>>>>>> 605770edcde2f93ddea51240e4b8@%3Cusers.jena.apache.org%3E
>>>>>> It should get you started as to how to register
>>>>>> extensionfunctionality in Fuseki.
>>>>>> 
>>>>>> Adam Soroka
>>>>>>> On Dec 26, 2017, at 9:34 AM, Marc Agate <agate.marc@gmail.c
>>>>>>> om>
>>>>>>> wrote:
>>>>>>> 
>>>>>>> Hi !
>>>>>>> 
>>>>>>> I successfully implemented sparql queries using custom ARQ
>>>>>>> functions
>>>>>>> using the following (custom function code):
>>>>>>> 
>>>>>>> ****************
>>>>>>> public class LevenshteinFilter extends FunctionBase2 {
>>>>>>> 
>>>>>>>      public LevenshteinFilter() { super() ; }
>>>>>>> 
>>>>>>>      public NodeValue exec(NodeValue value1, NodeValue
>>>>>>> value2){
>>>>>>>          LevenshteinDistance LD=new LevenshteinDistance();
>>>>>>>          int i = LD.apply(value1.asString(),
>>>>>>> value2.asString());
>>>>>>>          return NodeValue.makeInteger(i);
>>>>>>>      }
>>>>>>> }
>>>>>>> ***************
>>>>>>> 
>>>>>>> it works fine when I query against a Model loaded from a
>>>>>>> turtle
>>>>>>> file,
>>>>>>> like this:
>>>>>>> 
>>>>>>> ***************
>>>>>>> InputStream input =
>>>>>>> QueryProcessor.class.getClassLoader().getResourceAsStream("
>>>>>>> full
>>>>>>> .t
>>>>>>> tl");
>>>>>>>              model =
>>>>>>> ModelFactory.createMemModelMaker().createModel("default");
>>>>>>>              model.read(input,null,"TURTLE"); // null base
>>>>>>> URI,
>>>>>>> since
>>>>>>> model URIs are absolute
>>>>>>>              input.close();
>>>>>>> ***************
>>>>>>> 
>>>>>>> with the query being sent like this :
>>>>>>> 
>>>>>>> ***************
>>>>>>> String functionUri = "http://www.example1.org/LevenshteinFu
>>>>>>> ncti
>>>>>>> on
>>>>>>> ";
>>>>>>>          FunctionRegistry.get().put(functionUri ,
>>>>>>> LevenshteinFilter.class);
>>>>>>> 
>>>>>>>          String s = "whatever you want";
>>>>>>>          String sparql = prefixes+" SELECT DISTINCT ?l
>>>>>>> WHERE {
>>>>>>> ?x
>>>>>>> rdfs:label ?l . " +  "FILTER(fct:LevenshteinFunction(?l,
>>>>>>> \"" +
>>>>>>> s
>>>>>>> + "\")
>>>>>>> < 4) }";
>>>>>>>          Query query = QueryFactory.create(sparql);
>>>>>>>          QueryExecution qexec =
>>>>>>> QueryExecutionFactory.create(query,
>>>>>>> model);
>>>>>>>          ResultSet rs = qexec.execSelect();
>>>>>>> ***************
>>>>>>> 
>>>>>>> However, if i use a working fuseki endpoint for the same
>>>>>>> dataset
>>>>>>> (full.ttl) like this :
>>>>>>> 
>>>>>>> ***************
>>>>>>> fusekiUrl="http://localhost:3030/ds/query";
>>>>>>> ***************
>>>>>>> 
>>>>>>> sending the query like this (using
>>>>>>> QueryExecutionFactory.sparqlService(fusekiUrl,query)
>>>>>>> instead of
>>>>>>> QueryExecutionFactory.create(query,model) ):
>>>>>>> 
>>>>>>> ***************
>>>>>>> String functionUri = "http://www.example1.org/LevenshteinFu
>>>>>>> ncti
>>>>>>> on
>>>>>>> ";
>>>>>>>          FunctionRegistry.get().put(functionUri ,
>>>>>>> LevenshteinFilter.class);
>>>>>>> 
>>>>>>>          String s = "whatever you want";
>>>>>>>          String sparql = prefixes+" SELECT DISTINCT ?l
>>>>>>> WHERE {
>>>>>>> ?x
>>>>>>> rdfs:label ?l . " + "FILTER(fct:LevenshteinFunction(?l, \""
>>>>>>> + s
>>>>>>> +
>>>>>>> "\")
>>>>>>> < 4) }";
>>>>>>>          Query query = QueryFactory.create(sparql);
>>>>>>>          QueryExecution qexec =
>>>>>>> QueryExecutionFactory.sparqlService(fusekiUrl,query);
>>>>>>>          ResultSet rs = qexec.execSelect();
>>>>>>> ***************
>>>>>>> 
>>>>>>> Then I don't get any results back. In both cases I printed
>>>>>>> out
>>>>>>> the
>>>>>>> FunctionRegistry and they contain exactly the same entries,
>>>>>>> especially
>>>>>>> :
>>>>>>> 
>>>>>>> key=http://www.example1.org/LevenshteinFunction value:
>>>>>>> org.apache.jena.sparql.function.FunctionFactoryAuto@5a45133
>>>>>>> e
>>>>>>> 
>>>>>>> Any clue ?
>>>>>>> 
>>>>>>> Thanks
>> 
>> 


Re: Custom ARQ function not working with fuseki endpoint

Posted by Marc Agate <ag...@gmail.com>.
Well...

Here is the query

PREFIX : <http://purl.bdrc.io/ontology/core/>  
PREFIX adm: <http://purl.bdrc.io/ontology/admin/>  
PREFIX bdr: <http://purl.bdrc.io/resource/>  
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> 
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> 
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>  
PREFIX tbr: <http://purl.bdrc.io/ontology/toberemoved/> 
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#> 
PREFIX f: <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>

SELECT DISTINCT ?l 
WHERE { 
?x skos:prefLabel ?l .  
FILTER (f:myFilter(?l) < 100) 
}
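
For reference, with the f: prefix above that function call expands to the
absolute URI form below, which is what the server actually has to resolve:

FILTER (<java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.myFilter>(?l) < 100)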

Note that when I change FILTER (f:myFilter(?l) < 100) to FILTER
(STRLEN(?l) < 100), I don't get the 404 exception...
Therefore it's not a connection issue. I think it's more like an
unresolved URI or something like that.
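
One way to see what the server itself is answering, rather than just the
HttpException on the client, is to send the same query directly with curl.
The endpoint URL below is only a guess pieced together from the Tomcat
connector and dataset name earlier in this thread, so adjust it to whatever
the client is actually using:

curl -v http://localhost:13180/fuseki/bdrcrw/query \
     --data-urlencode 'query=PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
PREFIX f: <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>
SELECT DISTINCT ?l WHERE { ?x skos:prefLabel ?l . FILTER (f:myFilter(?l) < 100) }'

The -v output shows the actual status line and any error body returned with the 404.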

Now, since you're asking for it, here is the full Fuseki config:

################################################################
# Fuseki configuration for BDRC, configures two endpoints:
#   - /bdrc is read-only
#   - /bdrcrw is read-write
#
# This was painful to come up with but the web interface basically allows no option
# and there is no subclass inference by default so such a configuration file is necessary.
#
# The main doc sources are:
#  - https://jena.apache.org/documentation/fuseki2/fuseki-configuration.html
#  - https://jena.apache.org/documentation/assembler/assembler-howto.html
#  - https://jena.apache.org/documentation/assembler/assembler.ttl
#
# See https://jena.apache.org/documentation/fuseki2/fuseki-layout.html
# for the destination of this file.

@prefix fuseki:  <http://jena.apache.org/fuseki#> .
@prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
@prefix tdb:     <http://jena.hpl.hp.com/2008/tdb#> .
# @prefix tdb2:    <http://jena.apache.org/2016/tdb#> .
@prefix ja:      <http://jena.hpl.hp.com/2005/11/Assembler#> .
@prefix :        <http://base/#> .
@prefix text:    <http://jena.apache.org/text#> .
@prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
@prefix adm:     <http://purl.bdrc.io/ontology/admin/> .
@prefix bdd:     <http://purl.bdrc.io/data/> .
@prefix bdo:     <http://purl.bdrc.io/ontology/core/> .
@prefix bdr:     <http://purl.bdrc.io/resource/> .
@prefix f:       <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.> .

ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
# [] ja:loadClass "org.seaborne.tdb2.TDB2" .
# tdb2:DatasetTDB2  rdfs:subClassOf  ja:RDFDataset .
# tdb2:GraphTDB2    rdfs:subClassOf  ja:Model .

[] rdf:type fuseki:Server ;
   fuseki:services (
     :bdrcrw
#      :bdrcro
   ) .

:bdrcrw rdf:type fuseki:Service ;
    fuseki:name                       "bdrcrw" ;   # name of the dataset in the url
    fuseki:serviceQuery               "query" ;    # SPARQL query service
    fuseki:serviceUpdate              "update" ;   # SPARQL update service
    fuseki:serviceUpload              "upload" ;   # Non-SPARQL upload service
    fuseki:serviceReadWriteGraphStore "data" ;     # SPARQL Graph store protocol (read and write)
    fuseki:dataset                    :bdrc_text_dataset ;
    .

# :bdrcro rdf:type fuseki:Service ;
#     fuseki:name                     "bdrc" ;
#     fuseki:serviceQuery             "query" ;
#     fuseki:serviceReadGraphStore    "data" ;
#     fuseki:dataset           		:bdrc_text_dataset ;
#     .

# using TDB
:dataset_bdrc rdf:type      tdb:DatasetTDB ;
     tdb:location "/etc/fuseki/databases/bdrc" ;
     tdb:unionDefaultGraph true ;
     .

# # try using TDB2
# :dataset_bdrc rdf:type      tdb2:DatasetTDB2 ;
#      tdb2:location "/etc/fuseki/databases/bdrc" ;
#      tdb2:unionDefaultGraph true ;
#   .

:bdrc_text_dataset rdf:type     text:TextDataset ;
    text:dataset   :dataset_bdrc ;
    text:index     :bdrc_lucene_index ;
    .

# Text index description
:bdrc_lucene_index a text:TextIndexLucene ;
    text:directory <file:/etc/fuseki/lucene-bdrc> ;
    text:storeValues true ;
    text:multilingualSupport true ;
    text:entityMap :bdrc_entmap ;
    text:defineAnalyzers (
        [ text:addLang "bo" ; 
          text:analyzer [ 
            a text:GenericAnalyzer ;
            text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
            text:params (
                [ text:paramName "segmentInWords" ;
                  text:paramType text:TypeBoolean ; 
                  text:paramValue false ]
                [ text:paramName "lemmatize" ;
                  text:paramType text:TypeBoolean ;
                  text:paramValue true ]
                [ text:paramName "filterChars" ;
                  text:paramType text:TypeBoolean ;
                  text:paramValue false ]
                [ text:paramName "fromEwts" ;
                  text:paramType text:TypeBoolean ;
                  text:paramValue false ]
                )
            ] ; 
          ]
        [ text:addLang "bo-x-ewts" ; 
          text:analyzer [ 
            a text:GenericAnalyzer ;
            text:class "io.bdrc.lucene.bo.TibetanAnalyzer" ;
            text:params (
                [ text:paramName "segmentInWords" ;
                  text:paramType text:TypeBoolean ; 
                  text:paramValue false ]
                [ text:paramName "lemmatize" ;
                  text:paramType text:TypeBoolean ;
                  text:paramValue true ]
                [ text:paramName "filterChars" ;
                  text:paramType text:TypeBoolean ;
                  text:paramValue false ]
                [ text:paramName "fromEwts" ;
                  text:paramType text:TypeBoolean ;
                  text:paramValue true ]
                )
            ] ; 
          ]
      ) ;
    .

# Index mappings
:bdrc_entmap a text:EntityMap ;
    text:entityField      "uri" ;
    text:uidField         "uid" ;
    text:defaultField     "label" ;
    text:langField        "lang" ;
    text:graphField       "graph" ; ## enable graph-specific indexing
    text:map (
         [ text:field "label" ; 
           text:predicate skos:prefLabel ]
         [ text:field "altLabel" ; 
           text:predicate skos:altLabel ; ]
         [ text:field "rdfsLabel" ;
           text:predicate rdfs:label ; ]
         [ text:field "chunkContents" ;
           text:predicate bdo:chunkContents ; ]
         [ text:field "eTextTitle" ;
           text:predicate bdo:eTextTitle ; ]
         [ text:field "logMessage" ;
           text:predicate adm:logMessage ; ]
         [ text:field "noteText" ;
           text:predicate bdo:noteText ; ]
         [ text:field "workAuthorshipStatement" ;
           text:predicate bdo:workAuthorshipStatement ; ]
         [ text:field "workColophon" ; 
           text:predicate bdo:workColophon ; ]
         [ text:field "workEditionStatement" ;
           text:predicate bdo:workEditionStatement ; ]
         [ text:field "workPublisherLocation" ;
           text:predicate bdo:workPublisherLocation ; ]
         [ text:field "workPublisherName" ;
           text:predicate bdo:workPublisherName ; ]
         [ text:field "workSeriesName" ;
           text:predicate bdo:workSeriesName ; ]
         ) ;
    .
###################################################################

It would be wonderful to have the java: URI scheme working (all
custom functions in a single class and method calls made directly in
the SPARQL query: sounds like a dream!)
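
Though, re-reading https://jena.apache.org/documentation/query/java-uri.html,
it may be that the java: URI scheme maps each local name to a class rather than
to a static method of one class, so f:myFilter above would be looked up as a
class named io.bdrc.ldsearch.query.functions.CustomARQFunctions.myFilter. If
that reading is right, the layout would look more like the sketch below, with
the prefix shortened to just the package and one FunctionBase subclass per
function:

package io.bdrc.ldsearch.query.functions;

import org.apache.jena.sparql.expr.NodeValue;
import org.apache.jena.sparql.function.FunctionBase1;

// Intended to be called as f:myFilter with
// PREFIX f: <java:io.bdrc.ldsearch.query.functions.>
public class myFilter extends FunctionBase1 {
    @Override
    public NodeValue exec(NodeValue value1) {
        return NodeValue.makeInteger(value1.asString().length());
    }
}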

Marc

On Tuesday, 26 December 2017 at 13:22 -0500, ajs6f wrote:
> That exception doesn't appear to have anything to do with extension
> functions. It indicates a problem between client and server.
> 
> Please show at _least_ your actual query execution code, your
> complete Fuseki config, and a complete stacktrace.
> 
> 
> ajs6f
> 
> > On Dec 26, 2017, at 1:17 PM, Marc Agate <ag...@gmail.com>
> > wrote:
> > 
> > I forgot to mention that according to 
> > https://jena.apache.org/documentation/query/java-uri.html
> > I tried for testing purpose to set a PREFIX f:
> > <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>and
> > added
> > the following to fuseki config : 
> > ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
> > .
> > where CustomARQFunctions is :
> > public class CustomARQFunctions {		public static
> > NodeValue myFilter(NodeValue value1){		        int i
> > =
> > value1.asString().length();         return
> > NodeValue.makeInteger(i);     }
> > }
> > since according to 
> > https://jena.apache.org/documentation/query/writing_functions.html
> > using the java:URI scheme "dynamically loads the code, which must
> > be on
> > the Java classpath. With this scheme, the function URI gives the
> > class
> > name. There is automatic registration of a wrapper into the
> > function
> > registry. This way, no explicit registration step is needed by the
> > application and queries issues with the command line tools can load
> > custom functions."
> > but no luck: I keep getting the following exception:
> > Exception in thread "main" HttpException: 404	at
> > org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.java
> > :328
> > )	at
> > org.apache.jena.sparql.engine.http.HttpQuery.exec(HttpQuery.java:28
> > 8)	
> > at
> > org.apache.jena.sparql.engine.http.QueryEngineHTTP.execResultSetInn
> > er(Q
> > ueryEngineHTTP.java:348)	at
> > org.apache.jena.sparql.engine.http.QueryEngineHTTP.execSelect(Query
> > Engi
> > neHTTP.java:340)
> > I am stuck !
> > Marc
> > Le mardi 26 décembre 2017 à 18:56 +0100, Marc Agate a écrit :
> > > Hi,
> > > Adam's gave me the right direction.
> > > I managed to load my function class in fuseki config using
> > > ja:loadClass
> > > but now remains the following issue (the function is not
> > > registered)seefuseki logs :
> > > [2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdrc.io/f
> > > unct
> > > ions#MyFilterFunction> has no registered function factory
> > > How can I register this function now that I have the code
> > > available
> > > onthe endpoint side ?
> > > Thanks for helping
> > > Marc.
> > > 
> > > Le mardi 26 décembre 2017 à 17:43 +0000, Andy Seaborne a écrit :
> > > > As well s Adam's point (and all the libraries your function
> > > > needs, transitively)
> > > > What is in the Fuseki log file?How was the data loaded into
> > > > Fuseki?
> > > >  >> I printed out the >> FunctionRegistry
> > > >      And
> > > > On 26/12/17 14:51, ajs6f wrote:
> > > > > I'm not as familiar with the extension points of ARQ as I
> > > > > wouldlike to be, but as I understand what you are doing, you
> > > > > areregistering a new function with your _local_ registry,
> > > > > then
> > > > > firinga query at a _remote_ endpoint (which has a completely
> > > > > independentregistry in a different JVM in a different
> > > > > process,
> > > > > potentially ina different _system_).
> > > > > The query is getting interpreted and executed by that
> > > > > remoteservice, not locally. So you need to register the
> > > > > function
> > > > > _there_.
> > > > > Take a look at this thread:
> > > > > https://lists.apache.org/thread.html/1cda23332af4264883e88697
> > > > > d994
> > > > > 605770edcde2f93ddea51240e4b8@%3Cusers.jena.apache.org%3E
> > > > > It should get you started as to how to register
> > > > > extensionfunctionality in Fuseki.
> > > > > 
> > > > > Adam Soroka
> > > > > > On Dec 26, 2017, at 9:34 AM, Marc Agate <agate.marc@gmail.c
> > > > > > om>
> > > > > > wrote:
> > > > > > 
> > > > > > Hi !
> > > > > > 
> > > > > > I successfully implemented sparql queries using custom ARQ
> > > > > > functions
> > > > > > using the following (custom function code):
> > > > > > 
> > > > > > ****************
> > > > > > public class LevenshteinFilter extends FunctionBase2 {
> > > > > > 
> > > > > >      public LevenshteinFilter() { super() ; }
> > > > > > 
> > > > > >      public NodeValue exec(NodeValue value1, NodeValue
> > > > > > value2){
> > > > > >          LevenshteinDistance LD=new LevenshteinDistance();
> > > > > >          int i = LD.apply(value1.asString(),
> > > > > > value2.asString());
> > > > > >          return NodeValue.makeInteger(i);
> > > > > >      }
> > > > > > }
> > > > > > ***************
> > > > > > 
> > > > > > it works fine when I query against a Model loaded from a
> > > > > > turtle
> > > > > > file,
> > > > > > like this:
> > > > > > 
> > > > > > ***************
> > > > > > InputStream input =
> > > > > > QueryProcessor.class.getClassLoader().getResourceAsStream("
> > > > > > full
> > > > > > .t
> > > > > > tl");
> > > > > >              model =
> > > > > > ModelFactory.createMemModelMaker().createModel("default");
> > > > > >              model.read(input,null,"TURTLE"); // null base
> > > > > > URI,
> > > > > > since
> > > > > > model URIs are absolute
> > > > > >              input.close();
> > > > > > ***************
> > > > > > 
> > > > > > with the query being sent like this :
> > > > > > 
> > > > > > ***************
> > > > > > String functionUri = "http://www.example1.org/LevenshteinFu
> > > > > > ncti
> > > > > > on
> > > > > > ";
> > > > > >          FunctionRegistry.get().put(functionUri ,
> > > > > > LevenshteinFilter.class);
> > > > > > 
> > > > > >          String s = "whatever you want";
> > > > > >          String sparql = prefixes+" SELECT DISTINCT ?l
> > > > > > WHERE {
> > > > > > ?x
> > > > > > rdfs:label ?l . " +  "FILTER(fct:LevenshteinFunction(?l,
> > > > > > \"" +
> > > > > > s
> > > > > > + "\")
> > > > > > < 4) }";
> > > > > >          Query query = QueryFactory.create(sparql);
> > > > > >          QueryExecution qexec =
> > > > > > QueryExecutionFactory.create(query,
> > > > > > model);
> > > > > >          ResultSet rs = qexec.execSelect();
> > > > > > ***************
> > > > > > 
> > > > > > However, if i use a working fuseki endpoint for the same
> > > > > > dataset
> > > > > > (full.ttl) like this :
> > > > > > 
> > > > > > ***************
> > > > > > fusekiUrl="http://localhost:3030/ds/query";
> > > > > > ***************
> > > > > > 
> > > > > > sending the query like this (using
> > > > > > QueryExecutionFactory.sparqlService(fusekiUrl,query)
> > > > > > instead of
> > > > > > QueryExecutionFactory.create(query,model) ):
> > > > > > 
> > > > > > ***************
> > > > > > String functionUri = "http://www.example1.org/LevenshteinFu
> > > > > > ncti
> > > > > > on
> > > > > > ";
> > > > > >          FunctionRegistry.get().put(functionUri ,
> > > > > > LevenshteinFilter.class);
> > > > > > 
> > > > > >          String s = "whatever you want";
> > > > > >          String sparql = prefixes+" SELECT DISTINCT ?l
> > > > > > WHERE {
> > > > > > ?x
> > > > > > rdfs:label ?l . " + "FILTER(fct:LevenshteinFunction(?l, \""
> > > > > > + s
> > > > > > +
> > > > > > "\")
> > > > > > < 4) }";
> > > > > >          Query query = QueryFactory.create(sparql);
> > > > > >          QueryExecution qexec =
> > > > > > QueryExecutionFactory.sparqlService(fusekiUrl,query);
> > > > > >          ResultSet rs = qexec.execSelect();
> > > > > > ***************
> > > > > > 
> > > > > > Then I don't get any results back. In both cases I printed
> > > > > > out
> > > > > > the
> > > > > > FunctionRegistry and they contain exactly the same entries,
> > > > > > especially
> > > > > > :
> > > > > > 
> > > > > > key=http://www.example1.org/LevenshteinFunction value:
> > > > > > org.apache.jena.sparql.function.FunctionFactoryAuto@5a45133
> > > > > > e
> > > > > > 
> > > > > > Any clue ?
> > > > > > 
> > > > > > Thanks
> 
> 

Re: Custom ARQ function not working with fuseki endpoint

Posted by ajs6f <aj...@apache.org>.
That exception doesn't appear to have anything to do with extension functions. It indicates a problem between client and server.

Please show at _least_ your actual query execution code, your complete Fuseki config, and a complete stacktrace.
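
A minimal, self-contained harness along the lines of the sketch below is
usually enough; the endpoint URL and query are placeholders to be replaced
with yours, and the catch block makes the HTTP failure visible:

import org.apache.jena.atlas.web.HttpException;
import org.apache.jena.query.*;

public class MinimalRepro {
    public static void main(String[] args) {
        String service = "http://localhost:3030/ds/query";   // replace with your endpoint
        Query query = QueryFactory.create("SELECT * WHERE { ?s ?p ?o } LIMIT 1");
        try (QueryExecution qexec = QueryExecutionFactory.sparqlService(service, query)) {
            ResultSetFormatter.out(qexec.execSelect());
        } catch (HttpException e) {
            // The message normally carries the HTTP status (e.g. 404) reported by the server.
            System.err.println("HTTP error from endpoint: " + e.getMessage());
            throw e;
        }
    }
}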


ajs6f

> On Dec 26, 2017, at 1:17 PM, Marc Agate <ag...@gmail.com> wrote:
> 
> I forgot to mention that according to 
> https://jena.apache.org/documentation/query/java-uri.html
> I tried for testing purpose to set a PREFIX f:
> <java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.>and added
> the following to fuseki config : 
> ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
> where CustomARQFunctions is :
> public class CustomARQFunctions {		public static
> NodeValue myFilter(NodeValue value1){		        int i =
> value1.asString().length();         return
> NodeValue.makeInteger(i);     }
> }
> since according to 
> https://jena.apache.org/documentation/query/writing_functions.html
> using the java:URI scheme "dynamically loads the code, which must be on
> the Java classpath. With this scheme, the function URI gives the class
> name. There is automatic registration of a wrapper into the function
> registry. This way, no explicit registration step is needed by the
> application and queries issues with the command line tools can load
> custom functions."
> but no luck: I keep getting the following exception:
> Exception in thread "main" HttpException: 404	at
> org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.java:328
> )	at
> org.apache.jena.sparql.engine.http.HttpQuery.exec(HttpQuery.java:288)	
> at
> org.apache.jena.sparql.engine.http.QueryEngineHTTP.execResultSetInner(Q
> ueryEngineHTTP.java:348)	at
> org.apache.jena.sparql.engine.http.QueryEngineHTTP.execSelect(QueryEngi
> neHTTP.java:340)
> I am stuck !
> Marc
> Le mardi 26 décembre 2017 à 18:56 +0100, Marc Agate a écrit :
>> Hi,
>> Adam's gave me the right direction.
>> I managed to load my function class in fuseki config using
>> ja:loadClass
>> but now remains the following issue (the function is not
>> registered)seefuseki logs :
>> [2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdrc.io/funct
>> ions#MyFilterFunction> has no registered function factory
>> How can I register this function now that I have the code available
>> onthe endpoint side ?
>> Thanks for helping
>> Marc.
>> 
>> Le mardi 26 décembre 2017 à 17:43 +0000, Andy Seaborne a écrit :
>>> As well s Adam's point (and all the libraries your function
>>> needs, transitively)
>>> What is in the Fuseki log file?How was the data loaded into Fuseki?
>>>  >> I printed out the >> FunctionRegistry
>>>      And
>>> On 26/12/17 14:51, ajs6f wrote:
>>>> I'm not as familiar with the extension points of ARQ as I
>>>> wouldlike to be, but as I understand what you are doing, you
>>>> areregistering a new function with your _local_ registry, then
>>>> firinga query at a _remote_ endpoint (which has a completely
>>>> independentregistry in a different JVM in a different process,
>>>> potentially ina different _system_).
>>>> The query is getting interpreted and executed by that
>>>> remoteservice, not locally. So you need to register the function
>>>> _there_.
>>>> Take a look at this thread:
>>>> https://lists.apache.org/thread.html/1cda23332af4264883e88697d994
>>>> 605770edcde2f93ddea51240e4b8@%3Cusers.jena.apache.org%3E
>>>> It should get you started as to how to register
>>>> extensionfunctionality in Fuseki.
>>>> 
>>>> Adam Soroka
>>>>> On Dec 26, 2017, at 9:34 AM, Marc Agate <ag...@gmail.com>
>>>>> wrote:
>>>>> 
>>>>> Hi !
>>>>> 
>>>>> I successfully implemented sparql queries using custom ARQ
>>>>> functions
>>>>> using the following (custom function code):
>>>>> 
>>>>> ****************
>>>>> public class LevenshteinFilter extends FunctionBase2 {
>>>>> 
>>>>>      public LevenshteinFilter() { super() ; }
>>>>> 
>>>>>      public NodeValue exec(NodeValue value1, NodeValue value2){
>>>>>          LevenshteinDistance LD=new LevenshteinDistance();
>>>>>          int i = LD.apply(value1.asString(),
>>>>> value2.asString());
>>>>>          return NodeValue.makeInteger(i);
>>>>>      }
>>>>> }
>>>>> ***************
>>>>> 
>>>>> it works fine when I query against a Model loaded from a turtle
>>>>> file,
>>>>> like this:
>>>>> 
>>>>> ***************
>>>>> InputStream input =
>>>>> QueryProcessor.class.getClassLoader().getResourceAsStream("full
>>>>> .t
>>>>> tl");
>>>>>              model =
>>>>> ModelFactory.createMemModelMaker().createModel("default");
>>>>>              model.read(input,null,"TURTLE"); // null base URI,
>>>>> since
>>>>> model URIs are absolute
>>>>>              input.close();
>>>>> ***************
>>>>> 
>>>>> with the query being sent like this :
>>>>> 
>>>>> ***************
>>>>> String functionUri = "http://www.example1.org/LevenshteinFuncti
>>>>> on
>>>>> ";
>>>>>          FunctionRegistry.get().put(functionUri ,
>>>>> LevenshteinFilter.class);
>>>>> 
>>>>>          String s = "whatever you want";
>>>>>          String sparql = prefixes+" SELECT DISTINCT ?l WHERE {
>>>>> ?x
>>>>> rdfs:label ?l . " +  "FILTER(fct:LevenshteinFunction(?l, \"" +
>>>>> s
>>>>> + "\")
>>>>> < 4) }";
>>>>>          Query query = QueryFactory.create(sparql);
>>>>>          QueryExecution qexec =
>>>>> QueryExecutionFactory.create(query,
>>>>> model);
>>>>>          ResultSet rs = qexec.execSelect();
>>>>> ***************
>>>>> 
>>>>> However, if i use a working fuseki endpoint for the same
>>>>> dataset
>>>>> (full.ttl) like this :
>>>>> 
>>>>> ***************
>>>>> fusekiUrl="http://localhost:3030/ds/query";
>>>>> ***************
>>>>> 
>>>>> sending the query like this (using
>>>>> QueryExecutionFactory.sparqlService(fusekiUrl,query) instead of
>>>>> QueryExecutionFactory.create(query,model) ):
>>>>> 
>>>>> ***************
>>>>> String functionUri = "http://www.example1.org/LevenshteinFuncti
>>>>> on
>>>>> ";
>>>>>          FunctionRegistry.get().put(functionUri ,
>>>>> LevenshteinFilter.class);
>>>>> 
>>>>>          String s = "whatever you want";
>>>>>          String sparql = prefixes+" SELECT DISTINCT ?l WHERE {
>>>>> ?x
>>>>> rdfs:label ?l . " + "FILTER(fct:LevenshteinFunction(?l, \"" + s
>>>>> +
>>>>> "\")
>>>>> < 4) }";
>>>>>          Query query = QueryFactory.create(sparql);
>>>>>          QueryExecution qexec =
>>>>> QueryExecutionFactory.sparqlService(fusekiUrl,query);
>>>>>          ResultSet rs = qexec.execSelect();
>>>>> ***************
>>>>> 
>>>>> Then I don't get any results back. In both cases I printed out
>>>>> the
>>>>> FunctionRegistry and they contain exactly the same entries,
>>>>> especially
>>>>> :
>>>>> 
>>>>> key=http://www.example1.org/LevenshteinFunction value:
>>>>> org.apache.jena.sparql.function.FunctionFactoryAuto@5a45133e
>>>>> 
>>>>> Any clue ?
>>>>> 
>>>>> Thanks


Re: Custom ARQ function not working with fuseki endpoint

Posted by Marc Agate <ag...@gmail.com>.
I forgot to mention that, according to 
https://jena.apache.org/documentation/query/java-uri.html
I tried, for testing purposes, to set a PREFIX f:
<java:io.bdrc.ldsearch.query.functions.CustomARQFunctions.> and added
the following to the Fuseki config:
ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions" .
where CustomARQFunctions is :
public class CustomARQFunctions {
    public static NodeValue myFilter(NodeValue value1) {
        int i = value1.asString().length();
        return NodeValue.makeInteger(i);
    }
}
since, according to 
https://jena.apache.org/documentation/query/writing_functions.html
using the java: URI scheme "dynamically loads the code, which must be on
the Java classpath. With this scheme, the function URI gives the class
name. There is automatic registration of a wrapper into the function
registry. This way, no explicit registration step is needed by the
application and queries issued with the command line tools can load
custom functions."
but no luck: I keep getting the following exception:

Exception in thread "main" HttpException: 404
    at org.apache.jena.sparql.engine.http.HttpQuery.execGet(HttpQuery.java:328)
    at org.apache.jena.sparql.engine.http.HttpQuery.exec(HttpQuery.java:288)
    at org.apache.jena.sparql.engine.http.QueryEngineHTTP.execResultSetInner(QueryEngineHTTP.java:348)
    at org.apache.jena.sparql.engine.http.QueryEngineHTTP.execSelect(QueryEngineHTTP.java:340)

I am stuck!
Marc
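
For reference, here is a minimal sketch of the shape the java: URI scheme expects. As described on the writing_functions page, the URI after java: names a class implementing the ARQ function interface (for example, one extending FunctionBase1), not a bare static method. The package and class names below are illustrative only, chosen to match the prefix in the post; with this layout the PREFIX would end at the package, i.e. <java:io.bdrc.ldsearch.query.functions.>.

***************
package io.bdrc.ldsearch.query.functions;   // illustrative package, matching the prefix above

import org.apache.jena.sparql.expr.NodeValue;
import org.apache.jena.sparql.function.FunctionBase1;

// With PREFIX f: <java:io.bdrc.ldsearch.query.functions.> the expression
// f:MyFilter in a query resolves to this class by name.
public class MyFilter extends FunctionBase1 {
    @Override
    public NodeValue exec(NodeValue value1) {
        // Same behaviour as the myFilter() static method above: string length.
        int i = value1.asString().length();
        return NodeValue.makeInteger(i);
    }
}
***************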

Re: Custom ARQ function not working with fuseki endpoint

Posted by Andy Seaborne <an...@apache.org>.

On 26/12/17 17:56, Marc Agate wrote:
> Hi,
> 
> Adam's reply gave me the right direction.
> 
> I managed to load my function class in the Fuseki config using ja:loadClass
> 
> but now the following issue remains (the function is not registered); see
> the Fuseki logs:
> 
> [2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdrc.io/functions#MyFilterFunction> has no registered function factory


Your original email had:

 >> String functionUri = "http://www.example1.org/LevenshteinFunction";

This is a different function -
   <http://purl.bdrc.io/functions#MyFilterFunction>

When you load code with ja:loadClass, you need to do the registration 
for http://purl.bdrc.io/functions#MyFilterFunction

     Andy
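
One way to do that registration is sketched below, assuming that ja:loadClass triggers the class's static initializer when Fuseki loads it: let the class named in ja:loadClass register the function itself, so that loading it on the server side is enough to put the URI into the server-side FunctionRegistry. The package and nested class names are taken from this thread or invented for illustration.

***************
package io.bdrc.ldsearch.query.functions;   // package name as used earlier in the thread

import org.apache.jena.sparql.expr.NodeValue;
import org.apache.jena.sparql.function.FunctionBase1;
import org.apache.jena.sparql.function.FunctionRegistry;

public class CustomARQFunctions {

    // Runs when the class is loaded via
    //   ja:loadClass "io.bdrc.ldsearch.query.functions.CustomARQFunctions"
    // and registers the URI in the server-side FunctionRegistry.
    static {
        FunctionRegistry.get().put(
            "http://purl.bdrc.io/functions#MyFilterFunction",
            MyFilterFunction.class);
    }

    // The function itself is an ARQ Function implementation
    // (here via FunctionBase1), not a bare static method.
    public static class MyFilterFunction extends FunctionBase1 {
        @Override
        public NodeValue exec(NodeValue value1) {
            int i = value1.asString().length();
            return NodeValue.makeInteger(i);
        }
    }
}
***************

The other option is to call FunctionRegistry.get().put(...) from whatever server-side initialisation code already runs; the key point is that the registration has to happen in the Fuseki JVM.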



Re: Custom ARQ function not working with fuseki endpoint

Posted by Marc Agate <ag...@gmail.com>.
Hi,

Adam's reply gave me the right direction.

I managed to load my function class in the Fuseki config using ja:loadClass

but now the following issue remains (the function is not registered); see
the Fuseki logs:

[2017-12-26 16:10:13] exec       WARN  URI <http://purl.bdrc.io/functions#MyFilterFunction> has no registered function factory

How can I register this function now that I have the code available on
the endpoint side?

Thanks for helping

Marc.



Re: Custom ARQ function not working with fuseki endpoint

Posted by Andy Seaborne <an...@apache.org>.
As well as Adam's point (and all the libraries your function needs,
transitively):

What is in the Fuseki log file?
How was the data loaded into Fuseki?

 >> I printed out the
 >> FunctionRegistry

     And


Re: Custom ARQ function not working with fuseki endpoint

Posted by ajs6f <aj...@apache.org>.
I'm not as familiar with the extension points of ARQ as I would like to be, but as I understand what you are doing, you are registering a new function with your _local_ registry, then firing a query at a _remote_ endpoint (which has a completely independent registry in a different JVM in a different process, potentially in a different _system_).

The query is getting interpreted and executed by that remote service, not locally. So you need to register the function _there_.

Take a look at this thread:

https://lists.apache.org/thread.html/1cda23332af4264883e88697d994605770edcde2f93ddea51240e4b8@%3Cusers.jena.apache.org%3E

It should get you started as to how to register extension functionality in Fuseki.


Adam Soroka
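
To make the client/server split concrete: once the server side has the function registered, the client needs no FunctionRegistry call at all; it only sends the query text. A minimal sketch, reusing the endpoint URL and function URI from this thread and assuming the fct: prefix maps to http://www.example1.org/ :

***************
import org.apache.jena.query.Query;
import org.apache.jena.query.QueryExecution;
import org.apache.jena.query.QueryExecutionFactory;
import org.apache.jena.query.QueryFactory;
import org.apache.jena.query.ResultSet;
import org.apache.jena.query.ResultSetFormatter;

public class RemoteQueryExample {
    public static void main(String[] args) {
        String fusekiUrl = "http://localhost:3030/ds/query";
        // No FunctionRegistry.get().put(...) here: the remote endpoint
        // resolves the function URI against its own registry.
        String sparql =
            "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> " +
            "PREFIX fct: <http://www.example1.org/> " +
            "SELECT DISTINCT ?l WHERE { ?x rdfs:label ?l . " +
            "  FILTER(fct:LevenshteinFunction(?l, \"whatever you want\") < 4) }";
        Query query = QueryFactory.create(sparql);
        try (QueryExecution qexec = QueryExecutionFactory.sparqlService(fusekiUrl, query)) {
            ResultSet rs = qexec.execSelect();
            ResultSetFormatter.out(System.out, rs, query);
        }
    }
}
***************

If the server has not registered that URI, the symptom is what is described earlier in the thread: a warning in the Fuseki log and no useful results, regardless of what the client JVM has registered.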
