Posted to user@uima.apache.org by Andreas Niekler <an...@informatik.uni-leipzig.de> on 2012/08/22 08:47:46 UTC

Re: Problems instantiating an Annotator

Hello,

There is an aggregate annotator among the descriptors, named 
OpenNlpTextAnalyzer.xml. In there, under Resources, the model files 
for the OpenNLP tools are set. The Chunker.xml descriptor only declares 
that this resource needs to be provided.
The OpenNLP UIMA components are meant to be assembled into aggregate 
annotators that define the individual model file locations themselves.
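
For illustration, a minimal sketch of what such a resource definition and 
binding could look like in the aggregate's resourceManagerConfiguration 
section. The resource name, the "Chunker" delegate key, the model path and 
the ChunkerModelResourceImpl class name are placeholders here; check the 
actual OpenNlpTextAnalyzer.xml shipped with opennlp-uima for the real values:

<resourceManagerConfiguration>
    <externalResources>
        <externalResource>
            <name>ChunkerModel</name>
            <fileResourceSpecifier>
                <fileUrl>file:models/en-chunker.bin</fileUrl>
            </fileResourceSpecifier>
            <implementationName>opennlp.uima.chunker.ChunkerModelResourceImpl</implementationName>
        </externalResource>
    </externalResources>
    <externalResourceBindings>
        <externalResourceBinding>
            <!-- binds the aggregate's resource to the Chunker delegate's dependency key -->
            <key>Chunker/opennlp.uima.ModelName</key>
            <resourceName>ChunkerModel</resourceName>
        </externalResourceBinding>
    </externalResourceBindings>
</resourceManagerConfiguration>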

All the best

Andreas

On 21.08.2012 23:01, yair even-zohar wrote:
> Hi
>   I was trying to execute an annotator from the opennlp-uima project.
>
> I have imported the code into Eclipse and it compiles just fine, but when I try to use an annotator with something like:
>
> I get:
> org.apache.uima.resource.ResourceInitializationException: There is no resource satisfying the required resource dependency with key "opennlp.uima.ModelName". (Descriptor: file:/C:/Temp/opennlp.uima.OpenNlpTextAnalyzer/desc/Chunker.xml)
>      at com.fetch5.util.UimaHelper.getAE(UimaHelper.java:45)
>      at com.fetch5.tgni.uima.annotators.nlp.NounPhraseAnnotatorTest.testNounPhraseAnnotation(NounPhraseAnnotatorTest.java:44)
>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>      at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>      at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>      at java.lang.reflect.Method.invoke(Unknown Source)
>      at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
>      at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>      at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>      at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
>      at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
>      at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
>      at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:49)
>      at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
>      at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
>      at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
>      at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
>      at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
>      at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
>      at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
>      at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>      at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
>      at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
>      at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
>      at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
> Caused by: org.apache.uima.resource.ResourceInitializationException: There is no resource satisfying the required resource dependency with key "opennlp.uima.ModelName". (Descriptor: file:/C:/Temp/opennlp.uima.OpenNlpTextAnalyzer/desc/Chunker.xml)
>      at org.apache.uima.resource.impl.ResourceManager_impl.resolveAndValidateResourceDependencies(ResourceManager_impl.java:513)
>      at org.apache.uima.resource.Resource_ImplBase.initialize(Resource_ImplBase.java:161)
>      at org.apache.uima.analysis_engine.impl.AnalysisEngineImplBase.initialize(AnalysisEngineImplBase.java:157)
>      at org.apache.uima.analysis_engine.impl.PrimitiveAnalysisEngine_impl.initialize(PrimitiveAnalysisEngine_impl.java:123)
>      at org.apache.uima.impl.AnalysisEngineFactory_impl.produceResource(AnalysisEngineFactory_impl.java:94)
>      at org.apache.uima.impl.CompositeResourceFactory_impl.produceResource(CompositeResourceFactory_impl.java:62)
>      at org.apache.uima.UIMAFramework.produceResource(UIMAFramework.java:269)
>      at org.apache.uima.UIMAFramework.produceAnalysisEngine(UIMAFramework.java:354)
>      at com.fetch5.util.UimaHelper.getAE(UimaHelper.java:43)
>      ... 24 more
>
>
> Here is the relevant /C:/Temp/opennlp.uima.OpenNlpTextAnalyzer/desc/Chunker.xml. It looks like it does have an "opennlp.uima.ModelName" key:
>
>
> <?xml version="1.0" encoding="UTF-8"?>
>
> <!--
>     Licensed to the Apache Software Foundation (ASF) under one
>     or more contributor license agreements.  See the NOTICE file
>     distributed with this work for additional information
>     regarding copyright ownership.  The ASF licenses this file
>     to you under the Apache License, Version 2.0 (the
>     "License"); you may not use this file except in compliance
>     with the License.  You may obtain a copy of the License at
>
>       http://www.apache.org/licenses/LICENSE-2.0
>
>     Unless required by applicable law or agreed to in writing,
>     software distributed under the License is distributed on an
>     "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
>     KIND, either express or implied.  See the License for the
>     specific language governing permissions and limitations
>     under the License.
> -->
>
> <analysisEngineDescription xmlns="http://uima.apache.org/resourceSpecifier">
>      <frameworkImplementation>org.apache.uima.java</frameworkImplementation>
>      <primitive>true</primitive>
>      <annotatorImplementationName>opennlp.uima.chunker.Chunker</annotatorImplementationName>
>      <analysisEngineMetaData>
>          <name>Chunker</name>
>          <description></description>
>          <version>1.5.2-incubating</version>
>          <vendor>Apache Software Foundation</vendor>
>          <configurationParameters>
>
>              <configurationParameter>
>                  <name>opennlp.uima.SentenceType</name>
>                  <type>String</type>
>                  <multiValued>false</multiValued>
>                  <mandatory>true</mandatory>
>              </configurationParameter>
>
>              <configurationParameter>
>                  <name>opennlp.uima.TokenType</name>
>                  <type>String</type>
>                  <multiValued>false</multiValued>
>                  <mandatory>true</mandatory>
>              </configurationParameter>
>
>              <configurationParameter>
>                  <name>opennlp.uima.POSFeature</name>
>                  <type>String</type>
>                  <multiValued>false</multiValued>
>                  <mandatory>true</mandatory>
>              </configurationParameter>
>
>              <configurationParameter>
>                  <name>opennlp.uima.ChunkType</name>
>                  <type>String</type>
>                  <multiValued>false</multiValued>
>                  <mandatory>true</mandatory>
>              </configurationParameter>
>
>              <configurationParameter>
>                  <name>opennlp.uima.ChunkTagFeature</name>
>                  <type>String</type>
>                  <multiValued>false</multiValued>
>                  <mandatory>true</mandatory>
>              </configurationParameter>
>          </configurationParameters>
>          <configurationParameterSettings>
>
>              <nameValuePair>
>                  <name>opennlp.uima.SentenceType</name>
>                  <value>
>                      <string>opennlp.uima.Sentence</string>
>                  </value>
>              </nameValuePair>
>
>              <nameValuePair>
>                  <name>opennlp.uima.TokenType</name>
>                  <value>
>                      <string>opennlp.uima.Token</string>
>                  </value>
>              </nameValuePair>
>
>              <nameValuePair>
>                  <name>opennlp.uima.POSFeature</name>
>                  <value>
>                      <string>pos</string>
>                  </value>
>              </nameValuePair>
>
>              <nameValuePair>
>                  <name>opennlp.uima.ChunkType</name>
>                  <value>
>                      <string>opennlp.uima.Chunk</string>
>                  </value>
>              </nameValuePair>
>
>              <nameValuePair>
>                  <name>opennlp.uima.ChunkTagFeature</name>
>                  <value>
>                      <string>chunkType</string>
>                  </value>
>              </nameValuePair>
>          </configurationParameterSettings>
>
>          <typeSystemDescription>
>              <imports>
>                  <import location="TypeSystem.xml" />
>              </imports>
>          </typeSystemDescription>
>
>          <capabilities>
>              <capability>
>                  <inputs />
>                  <outputs />
>                  <languagesSupported>
>                      <language>en</language>
>                  </languagesSupported>
>              </capability>
>          </capabilities>
>      </analysisEngineMetaData>
>
>      <externalResourceDependencies>
>          <externalResourceDependency>
>              <key>opennlp.uima.ModelName</key>
>              <interfaceName>opennlp.uima.chunker.ChunkerModelResource</interfaceName>
>          </externalResourceDependency>
>      </externalResourceDependencies>
>
>      <resourceManagerConfiguration/>
> </analysisEngineDescription>
>
>
>
> Thanks
> -Yair
>

-- 
Andreas Niekler, Dipl. Ing. (FH)
NLP Group | Department of Computer Science
University of Leipzig
Johannisgasse 26 | 04103 Leipzig

mail: aniekler@informatik.uni-leipzig.de

Re: Problems instantiating an Annotator

Posted by Jörn Kottmann <ko...@gmail.com>.
On 08/22/2012 08:47 AM, Andreas Niekler wrote:
> There is an aggregate annotator among the descriptors, named 
> OpenNlpTextAnalyzer.xml. In there, under Resources, the model files 
> for the OpenNLP tools are set. The Chunker.xml descriptor only declares 
> that this resource needs to be provided.
> The OpenNLP UIMA components are meant to be assembled into aggregate 
> annotators that define the individual model file locations themselves. 


We host our OpenNLP models on an Apache server and use http URLs to refer 
to them; this way you can avoid copying model files around manually. This 
also works very nicely with UIMA-AS.
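
For example, the externalResource entry in the aggregate descriptor could 
then point at an http URL. A minimal sketch; the URL and the implementation 
class name below are placeholders only:

<externalResource>
    <name>ChunkerModel</name>
    <fileResourceSpecifier>
        <!-- the model is fetched over http instead of from a local copy -->
        <fileUrl>http://models.example.org/en-chunker.bin</fileUrl>
    </fileResourceSpecifier>
    <implementationName>opennlp.uima.chunker.ChunkerModelResourceImpl</implementationName>
</externalResource>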

Jörn