Posted to dev@directory.apache.org by Dan Armbrust <da...@gmail.com> on 2005/05/20 18:07:57 UTC

trying to launch with custom schema

I can't seem to figure out how to launch the server with my custom schema.

I used the maven plugin to generate Java files for my schema.  I 
compiled these classes and added them to apacheds-main-0.9.jar.  I 
added the following line to the properties file that I am launching the 
server with:


server.schemas=org.apache.ldap.server.schema.bootstrap.NamingAuthoritySchema 
org.apache.ldap.server.schema.bootstrap.TerminologyServiceObjectsSchema 
org.apache.ldap.server.schema.bootstrap.TerminologyServiceAttributesSchema 
org.apache.ldap.server.schema.bootstrap.TerminologyAssociationsMasterSchema 
org.apache.ldap.server.schema.bootstrap.TerminologyAssociationsCommonSchema


Now, when I try to launch the server I get:

C:\apacheds-0.9>java -jar apacheds-main-0.9.jar test.properties
server: loading properties from test.properties
Exception in thread "main" java.lang.NullPointerException
         at org.apache.ldap.server.schema.bootstrap.BootstrapSchemaLoader.getProducer(BootstrapSchemaLoader.java:317)
         at org.apache.ldap.server.schema.bootstrap.BootstrapSchemaLoader.load(BootstrapSchemaLoader.java:192)
         at org.apache.ldap.server.schema.bootstrap.BootstrapSchemaLoader.load(BootstrapSchemaLoader.java:100)
         at org.apache.ldap.server.jndi.CoreContextFactory.initialize(CoreContextFactory.java:515)
         at org.apache.ldap.server.jndi.CoreContextFactory.getInitialContext(CoreContextFactory.java:212)
         at org.apache.ldap.server.jndi.ServerContextFactory.getInitialContext(ServerContextFactory.java:153)
         at javax.naming.spi.NamingManager.getInitialContext(Unknown Source)
         at javax.naming.InitialContext.getDefaultInitCtx(Unknown Source)
         at javax.naming.InitialContext.init(Unknown Source)
         at javax.naming.InitialContext.<init>(Unknown Source)
         at javax.naming.directory.InitialDirContext.<init>(Unknown Source)
         at org.apache.ldap.server.ServerMain.main(ServerMain.java:76)


Is there a step I missed?  The one thing the documentation mentions 
that I didn't do was define the dependencies for my schemas - could that 
be the problem?

Re: trying to launch with custom schema

Posted by Dan Armbrust <da...@gmail.com>.
I just tested out the new jars with my schema; everything appears to be 
working fine.

Thanks for making the changes,

Dan


Alex Karasulu wrote:
> I may also need to deploy the other jars like main if you are using that 
> as well.
> 
> Doing that now.  Should be done in 10 minutes.
> 
> Alex
> 
> 

Re: trying to launch with custom schema

Posted by Alex Karasulu <ao...@bellsouth.net>.
I may also need to deploy the other jars like main if you are using that 
as well.

Doing that now.  Should be done in 10 minutes.

Alex


Re: trying to launch with custom schema

Posted by Alex Karasulu <ao...@bellsouth.net>.
<snip/>

These are matching rules that were defined in RFC 3698 but were not 
added.  I have to add them in code because at this point there is no 
other mechanism to define how to match other than using normalizers and 
comparators.  I checked in changes and deployed the jars out there for 
the core jar snapshots on 0.9.1-SNAPSHOT.
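
(As a rough illustration only - this is plain java.util.Comparator code, 
not the actual ApacheDS producer classes - an ordering matching rule such 
as integerOrderingMatch ultimately boils down to a comparator over the 
attribute's values, paired with a normalizer for the value form:)

    import java.util.Comparator;

    // Hypothetical sketch: integer ordering compares numeric values,
    // so "9" sorts before "10" even though it comes later lexicographically.
    public class IntegerOrderingComparator implements Comparator<String>
    {
        public int compare( String o1, String o2 )
        {
            return Long.compare( Long.parseLong( o1.trim() ),
                                 Long.parseLong( o2.trim() ) );
        }
    }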

http://cvs.apache.org/repository/directory/jars/apacheds-core-0.9.1-SNAPSHOT.jar 
should be about 2 minutes old; try that.

>>>>
>>>> caseExactOrderingMatch
>>>> integerOrderingMatch
>>>
Checked in changes.

Alex


Re: trying to launch with custom schema

Posted by Alex Karasulu <ao...@bellsouth.net>.
Dan Armbrust wrote:

> I think that the maven schema generating tooling caught all of my RFC 
> ordering errors in the attributeTypes.  I had to fix a few :)
>
> A search of all of the files in the source tree didn't turn up any 
> hits on these two items.
>
> Dan
>
>
> Alex Karasulu wrote:
>
>> Dan Armbrust wrote:
>>
>>> After I added the schemas I was missing, it comes down to just these 
>>> two that don't appear to be supported:
>>>
>>> caseExactOrderingMatch
>>> integerOrderingMatch
>>
Gimme a couple minutes to figure out which RFC they are defined in and 
add them appropriately as matching rules to the server.  Note that 
without the addition of certain components, and with schema checking 
turned on, the matching rules will only allow the schema to work, and 
you can create attributes with these matching rules.

We have a long way to go for schema management to get up to speed.

Alex
 

Re: trying to launch with custom schema

Posted by Dan Armbrust <da...@gmail.com>.
I think that the maven schema generating tooling caught all of my RFC 
ordering errors in the attributeTypes.  I had to fix a few :)

A search of all of the files in the source tree didn't turn up any hits 
on these two items.

Dan


Alex Karasulu wrote:
> Dan Armbrust wrote:
> 
>> After I added the schemas I was missing, it comes down to just these 
>> two that don't appear to be supported:
>>
>> caseExactOrderingMatch
>> integerOrderingMatch

Re: trying to launch with custom schema

Posted by Alex Karasulu <ao...@bellsouth.net>.
Dan Armbrust wrote:

> After I added the schemas I was missing, it comes down to just these 
> two that don't appear to be supported:
>
> caseExactOrderingMatch
> integerOrderingMatch
>
>
> When I comment out the appropriate ORDERING lines from my schema 
> (snippets below) then it works fine.
>
> Should I file a bug report on these?
> I think they are valid schema rules (though I don't know schema very 
> well - I wasn't the author of the schema files that I'm using)
>
> Thanks,
>
> Dan
>
>
> Dan Armbrust wrote:
>
>> Now it is telling me that
>>
>> javax.naming.NamingException: OID for name 'caseExactOrderingMatch' 
>> was not found within the OID registry
>>
>> for the following names:
>>
>> caseExactOrderingMatch
>> integerOrderingMatch
>>
>>
>> Are these not supported, or am I still missing some schemas?
>>
Can you check if they are in any of the schemas under the src/schemas 
directory?  It might also be the order of terms in the attributeType 
syntax.  The parser follows the strict order in the RFC.
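
(For reference, the attribute type description grammar in RFC 2252 lists 
the optional terms in a fixed order, and a strict parser expects them to 
appear in that order - the placeholders below are just a template:)

    attributetype ( <numericoid>
        NAME <names>
        DESC <description>
        OBSOLETE
        SUP <oid>
        EQUALITY <oid>
        ORDERING <oid>
        SUBSTR <oid>
        SYNTAX <oid>
        SINGLE-VALUE
        COLLECTIVE
        NO-USER-MODIFICATION
        USAGE <usage> )

So, for example, EQUALITY, ORDERING, and SUBSTR must all come before SYNTAX.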


Re: trying to launch with custom schema

Posted by Dan Armbrust <da...@gmail.com>.
After I added the schemas I was missing, it comes down to just these two 
that don't appear to be supported:

caseExactOrderingMatch
integerOrderingMatch


When I comment out the appropriate ORDERING lines from my schema 
(snippets below) then it works fine.

Should I file a bug report on these?
I think they are valid schema rules (though I don't know schema very 
well - I wasn't the author of the schema files that I'm using)

Thanks,

Dan


Dan Armbrust wrote:
> Now it is telling me that
> 
> javax.naming.NamingException: OID for name 'caseExactOrderingMatch' was 
> not found within the OID registry
> 
> for the following names:
> 
> caseExactOrderingMatch
> integerOrderingMatch
> 
> 
> Are these not supported, or am I still missing some schemas?
> 
> These stem from the following schema entries that I have:
> 
> # Integer -
> attributetype (1.3.6.1.4.1.2114.108.1.4.6
>     NAME 'tsInteger'
>     EQUALITY integerMatch
>     ORDERING integerOrderingMatch
>     SYNTAX 1.3.6.1.4.1.1466.115.121.1.27)
> 
> # Directory String - UTF 8 / Unicode - case sensitive
> attributetype (1.3.6.1.4.1.2114.108.1.4.4
>     NAME 'tsCaseSensitiveDirectoryString'
>     EQUALITY caseExactMatch
>     ORDERING caseExactOrderingMatch
>     SUBSTR   caseExactSubstringsMatch
>     SYNTAX 1.3.6.1.4.1.1466.115.121.1.15)
> 
> 

Re: trying to launch with custom schema

Posted by Dan Armbrust <da...@gmail.com>.
Now it is telling me that

javax.naming.NamingException: OID for name 'caseExactOrderingMatch' was 
not found within the OID registry

for the following names:

caseExactOrderingMatch
integerOrderingMatch
caseExactIA5SubstringsMatch


Are these not supported, or am I still missing some schemas?

These stem from the following schema entries that I have:

# IA5 (International Alphabet 5 - AKA ASCII) - case sensitive
attributetype (1.3.6.1.4.1.2114.108.1.4.2
	NAME 'tsCaseSensitiveIA5String'
	EQUALITY caseExactIA5Match
	SUBSTR   caseExactIA5SubstringsMatch
	SYNTAX 1.3.6.1.4.1.1466.115.121.1.26)

# Integer -
attributetype (1.3.6.1.4.1.2114.108.1.4.6
	NAME 'tsInteger'
	EQUALITY integerMatch
	ORDERING integerOrderingMatch
	SYNTAX 1.3.6.1.4.1.1466.115.121.1.27)

# Directory String - UTF 8 / Unicode - case sensitive
attributetype (1.3.6.1.4.1.2114.108.1.4.4
	NAME 'tsCaseSensitiveDirectoryString'
	EQUALITY caseExactMatch
	ORDERING caseExactOrderingMatch
	SUBSTR   caseExactSubstringsMatch
	SYNTAX 1.3.6.1.4.1.1466.115.121.1.15)


Re: trying to launch with custom schema

Posted by Dan Armbrust <da...@gmail.com>.
Thanks, I had not found that example server.properties file before. 
Just what I needed.

Dan

Alex Karasulu wrote:

> Take a look at the example server.properties file in the 
> apacheds/trunk/main and look at how the project.properties are used to 
> configure schemas.
> 

Re: trying to launch with custom schema

Posted by Alex Karasulu <ao...@bellsouth.net>.
Dan Armbrust wrote:

> Hmm, it appears that the null pointer is happening when it tries to 
> load the "system" schema - and makes the assumption that it will 
> always exist.

Yes, it must always exist as-is, without being touched.

>
> When I define a server.schemas line in the properties file, does it 
> override all of the default schema definitions, rather than appending to them?
>
Take a look at the example server.properties file in the 
apacheds/trunk/main and look at how the project.properties are used to 
configure schemas. 

> Where can I find the list of all of the default schemas that I need to 
> put on the line?  I'm going to add the ones that are listed in the 
> example on the jndi properties page, but I'm not sure if that is all 
> of them.

So you only had your schema and it barfed, I guess.  You should have at 
least the default schemas that are loaded normally, though you obviously 
might not need them all.  Here's the list ...

    protected static final String[] DEFAULT_SCHEMAS = new String[]
    {
        "org.apache.ldap.server.schema.bootstrap.CoreSchema",
        "org.apache.ldap.server.schema.bootstrap.CosineSchema",
        "org.apache.ldap.server.schema.bootstrap.ApacheSchema",
        "org.apache.ldap.server.schema.bootstrap.InetorgpersonSchema",
        "org.apache.ldap.server.schema.bootstrap.JavaSchema",
        "org.apache.ldap.server.schema.bootstrap.SystemSchema"
    };


You obviously don't need InetorgpersonSchema, but why not :).

Just add this list to the server.schemas env property, making sure your 
schema is appended to it.
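
For example, using the custom schema classes from the first message in 
this thread (substitute your own), the property would be one 
space-separated list on a single line - wrapped here only for readability:

    server.schemas=org.apache.ldap.server.schema.bootstrap.CoreSchema
        org.apache.ldap.server.schema.bootstrap.CosineSchema
        org.apache.ldap.server.schema.bootstrap.ApacheSchema
        org.apache.ldap.server.schema.bootstrap.InetorgpersonSchema
        org.apache.ldap.server.schema.bootstrap.JavaSchema
        org.apache.ldap.server.schema.bootstrap.SystemSchema
        org.apache.ldap.server.schema.bootstrap.NamingAuthoritySchema
        org.apache.ldap.server.schema.bootstrap.TerminologyServiceObjectsSchema
        org.apache.ldap.server.schema.bootstrap.TerminologyServiceAttributesSchema
        org.apache.ldap.server.schema.bootstrap.TerminologyAssociationsMasterSchema
        org.apache.ldap.server.schema.bootstrap.TerminologyAssociationsCommonSchema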

Good luck,
Alex



Re: trying to launch with custom schema

Posted by Dan Armbrust <da...@gmail.com>.
Hmm, it appears that the null pointer is happening when it tries to load 
the "system" schema - and makes the assumption that it will always exist.

When I define a server.schemas line in the properties file, does it 
override all of the default schema definitions, rather than appending to them?

Where can I find the list of all of the default schemas that I need to 
put on the line?  I'm going to add the ones that are listed in the 
example on the jndi properties page, but I'm not sure if that is all of 
them.

Dan