Posted to dev@tuscany.apache.org by Fuhwei Lwo <fu...@bricemedia.com> on 2007/04/13 01:43:31 UTC

How to access a composite's data model scope in an application?

Hi,

In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl"> in the default.scdl file that would prompt the SCA container to register my data types using SDO databinding. The question I have is what API I should use in my service implementation code to obtain the registered data types. If I have two composites that are using two different data type definitions but with the same namespace URI, I definitely don't want to obtain the wrong data type definition. Thanks for your help.

Below is the previous message from Raymond Feng about associating databinding type system context/scope with a composite. I think this is related to my question but from a Tuscany SCA development perspective.

How to associate some context with a composite?
http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
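The same-URI collision worried about above is easiest to see with per-scope type registries. The sketch below is a toy stand-in, not the commonj.sdo API (ToyHelperContext, the demo class, and the example URI are all invented for illustration); it only shows why a per-contribution HelperContext avoids the clobbering that a single global registry would suffer:

```java
import java.util.HashMap;
import java.util.Map;

// Toy stand-in for an SDO HelperContext: each scope keeps its own
// (namespace URI, type name) -> definition registry.
class ToyHelperContext {
    private final Map<String, String> types = new HashMap<>();

    void define(String uri, String name, String definition) {
        types.put(uri + "#" + name, definition);
    }

    String getType(String uri, String name) {
        return types.get(uri + "#" + name);
    }
}

public class ScopedTypesDemo {
    public static void main(String[] args) {
        // Two composites register *different* definitions under the
        // same namespace URI and type name.
        ToyHelperContext compositeA = new ToyHelperContext();
        ToyHelperContext compositeB = new ToyHelperContext();
        String uri = "http://example.com/mydatamodel"; // hypothetical URI

        compositeA.define(uri, "Customer", "Customer-v1 (A's WSDL)");
        compositeB.define(uri, "Customer", "Customer-v2 (B's WSDL)");

        // Each composite sees its own definition; with a single global
        // registry the second define() would have clobbered the first.
        System.out.println(compositeA.getType(uri, "Customer"));
        System.out.println(compositeB.getType(uri, "Customer"));
    }
}
```

In real SDO the analogous lookup is `context.getTypeHelper().getType(uri, name)`; the thread below is about how application code should obtain the right context in the first place.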

Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Frank Budinsky <fr...@ca.ibm.com>.
Hi Dave,

The default context binds to the same helpers as all the INSTANCE fields - 
e.g., TypeHelper.INSTANCE == 
HelperProvider.getDefaultContext().getTypeHelper(). HelperContext was a 
last-minute addition. At that late stage (and in a point release) we 
couldn't get agreement on how to support multiple scopes - but we knew it 
was a future goal, so we felt it was important to at least set the 
groundwork for people to move away from coding with INSTANCEs and instead 
access their helpers from the context. The remaining unspecified API is 
how to get a HelperContext in a standard way. I think that if we design 
something clean for this in Tuscany, we can later feed it into SDO 3.

Frank

"scabooz" <sc...@gmail.com> wrote on 04/23/2007 07:10:34 PM:

> Hi Frank,
> 
> Can you help me understand why defaultContext is a singleton? I don't
> see it described that way in the specs, so I'm hoping you can help me
> understand this.
> 
> Dave


---------------------------------------------------------------------
To unsubscribe, e-mail: tuscany-dev-unsubscribe@ws.apache.org
For additional commands, e-mail: tuscany-dev-help@ws.apache.org


Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by scabooz <sc...@gmail.com>.
Hi Frank,

Can you help me understand why defaultContext is a singleton? I don't see
it described that way in the specs, so I'm hoping you can help me
understand this.

Dave


----- Original Message ----- 
From: "Frank Budinsky" <fr...@ca.ibm.com>
To: <tu...@ws.apache.org>
Sent: Monday, April 23, 2007 2:40 PM
Subject: Re: Scoping SDO metadata, was: How to access a composite's data 
model scope in an application?


> Hi Guys,
>
> I finally managed to catch up on this. I think you're heading in the wrong
> direction by using HelperProvider.getDefaultContext() because:
>
> 1) it's a singleton, and we're trying to get away from singletons:
>
>    static HelperContext defaultContext = new DefaultHelperContext();
>
> The fact that the default context in Tuscany has some special TCCL-based
> management (under the covers - thanks to EMF), is a Tuscany-specific
> behavior - but the default context is itself a single object (singleton).
>
> 2) HelperProvider is an impl class (in package commonj.sdo.impl) ... It's
> bad to make that the application API
>
> The reason we added HelperContext to the spec was to get away from the
> singleton problems. The default context was needed as an interim step,
> until the spec comes up with other ways to access scoped HelperContexts.
> Many of the problems we had with EMF were exactly because we were trying
> to replace the global EMF INSTANCEs - e.g., EcoreFactory.eINSTANCE - but
> then we ran into problems because there were multiple apps trying to
> control the value. We don't want a repeat of that with the default
> context.
>
> That said, I think that a better way to look at this is that we want a
> pluggable mechanism for managing HelperContexts in Tuscany. Since the
> ability to manage scope-specific contexts is a general issue, not just
> for SCA, it does seem to make sense to put this into Tuscany/SDO. Maybe we
> can feed back our (hopefully successful) result to the spec, so in the
> future there will be proper SDO APIs for this.
>
> For now, however, we're talking about some new SDOUtil methods for this.
> For example, what if we provide a HelperContextManager, where you can
> register keyed HelperContexts. The key could be a ClassLoader, Thread, or
> anything else. Given such a manager, we could provide some SDOUtil methods
> to register and access the HelperContext. Maybe something like this:
>
> HelperContext SDOUtil.getHelperContext(Object key);
>
> and maybe even some convenience methods like this:
>
> HelperContext SDOUtil.getHelperContext(Thread thread);
> HelperContext SDOUtil.getLoaderHelperContext(ClassLoader classLoader);
>
> These helper methods would be used by clients to access the HelperContext.
> The implementation would just use the HelperContextManager to get the
> keyed HelperContext. Maybe we could also provide ways to set up chained
> (according to parent CLs, for example) HelperContexts, etc.
>
> Thoughts?
>
> Frank
>
> Fuhwei Lwo <fu...@bricemedia.com> wrote on 04/20/2007 01:36:37 PM:
>
>> Raymond,
>>
>> I agree with your suggestion below. In addition, I think SCA still
>> needs to provide an option (injection or API) for the applications
>> to explicitly retrieve the data model scope from the Contribution.
>> Other databinding technology APIs besides SDO may not have a default
>> context helper concept.
>>
>> 1) SDO defines the pluggability to supply the default HelperContext.
>> 2) SCA plugs its own scoping scheme to the SDO default HelperContext.
>> The HelperContext will be populated based on the Contribution.
>> 3) Application code will use HelperProvider.getDefaultContext() to
>> retrieve the default HelperContext.
>>
>>
>> Raymond Feng <en...@gmail.com> wrote: Hi,
>>
>> Please see my comments inline.
>>
>> Thanks,
>> Raymond
>>
>> ----- Original Message ----- 
>> From: "Jean-Sebastien Delfino"
>> To:
>> Sent: Friday, April 20, 2007 9:23 AM
>> Subject: Re: Scoping SDO metadata, was: How to access a composite's data
>> model scope in an application?
>>
>>
>> > Fuhwei Lwo wrote:
>> >> Hi Sebastien,
>> >>
>> >> Here is my understanding of requirements about getting rid of
>> >> import.sdo and switching to contribution -
>> >>
>> >> 1) A contribution will be created by contribution processor for each
>> >> application. - Contribution processor has been done for Jar and file
>> >> system.
>> >>
>> >>
>> >
>> > Yes
>> >
>> >> 2) The contribution processor will create a SDO scope (HelperContext
>> >> instance) to associate with the contribution. Currently calling
>> >> SDOUtil.createHelperContext() is enough.
>> >>
>> >
>> > That's what I was poking at in my previous email. Creating our own
>> > context, different from the default SDO context forces SCA to
>> > introduce a new API to get to that context, and forces all SDO users
>> > to use that new API. So I'm wondering if it wouldn't be better to play
>> > more nicely with SDO, and have the SCA runtime just populate the
>> > default SDO context in use in a particular application in the server
>> > environment.
>> >
>>
>> I have a slightly different view here. IMHO, the SDO should provide the
>> scoping mechanism and the pluggability of scoping schemes. I assume the
>> HelperContext is provided by SDO for scoping metadata. What's missing
>> from SDO is the pluggability of the scoping schemes. Currently, the
>> default HelperContext is based on TCCL and it's not replaceable. I agree
>> SDO cannot define scoping schemes for all environments so the
>> pluggability is desirable.
>>
>> >> 3) Tuscany SCA needs to provide a way for the application to get hold
>> >> of the HelperContext in association with the contribution in step 2
>> >> above. Currently the application is forced to use SDO API -
>> >> HelperProvider.getDefaultContext() which is using TCCL.
>> >>
>> >
>> > I'm not getting this one :) Is it bad for an SDO user to be "forced
>> > to" use an SDO API to get an SDO context? It seems better to me than
>> > forcing an SDO user to use an SCA API, simply because his code may be
>> > used at some point in an SCA environment... and then his code wouldn't
>> > work in a JSP, a servlet, or any other non-SCA environment...
>> >
>> > If the fact that HelperProvider.getDefaultContext() is using the TCCL
>> > to find the correct SDO context is a problem, then we just need to fix
>> > that. We went through the same discussion with SCA CompositeContext
>> > about a year ago. Associating context with the TCCL is not always
>> > convenient in a server environment, and it may be better to associate
>> > context with the current Thread (using a threadlocal or an inheritable
>> > thread local for example). This is what we did for SCA
>> > CompositeContext. Maybe SDO could provide a way to associate an SDO
>> > context with the current thread instead of or in addition to
>> > associating the SDO context with the TCCL?
>>
>> I agree that we should try to use the SDO API to retrieve the current
>> context. But I think in the SCA application, the default context should
>> be associated with the Contribution. Then it would be a win-win
>> situation if we can do the following:
>>
>> 1) SDO defines the pluggability to supply the default HelperContext.
>> 2) SCA plugs its own scoping scheme to the SDO default HelperContext.
>> The HelperContext will be populated based on the Contribution.
>> 3) Application code will use HelperProvider.getDefaultContext() to
>> retrieve the default HelperContext.
>>
>> >
>> > This would seem a good thing to have anyway since these contexts are
>> > not thread safe as far as I know :)
>> >
>> > Thoughts?
>> >> I am not sure my understanding above is correct so please bear with
>> >> me. Based on my understanding above, currently there is no additional
>> >> requirement from SDO.
>> >
>> > I wouldn't reach that conclusion so fast :) I think that there is a
>> > requirement to provide a way to get to an SDO context independent of
>> > TCCL if people don't like that association with TCCL.
>> >
>> >> In the future, if we decided to support contribution import/export
>> >> that may require SDO scoping hierarchy support. But I think we should
>> >> start using contribution and getting rid of import.sdo as the first
>> >> step.
>> >>
>> >>
>> >
>> > Yes I'd like to get rid of import.sdo, as I indicated earlier in this
>> > discussion thread.
>> >
>> > I would like to support contribution import/export at some point. I'm
>> > not sure that we'll be able to use SDO scope hierarchy support as an
>> > SCA contribution import does not necessarily import the whole scope of
>> > another SCA contribution, but I guess we'll know more when we start to
>> > look at the details.
>>
>> I'm thinking of the following approach to discover SDO metadata from a
>> SCA contribution.
>>
>> When the Contribution is processed, the generated SDO factories (the
>> class name and the namespace) are recognized. Other models such as
>> WSDL/XSD are handled as well. We don't have to convert all of them into
>> SDO model upfront as the conversion can be performed on-demand upon the
>> query of a particular namespace.
>>
>> >
>> >> What do you think? Thanks for your reply.
>> >>
>> >> Fuhwei Lwo
>> >>
>> >> Jean-Sebastien Delfino wrote: Fuhwei Lwo wrote:
>> >>
>> >>> Hi,
>> >>>
>> >>> In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl">
>> >>> in the default.scdl file that would prompt the SCA container to
>> >>> register my data types using SDO databinding. The question I have is
>> >>> what API I should use in my service implementation code to obtain
>> >>> the registered data types. If I have two composites that are using
>> >>> two different data type definitions but with the same namespace URI,
>> >>> I definitely don't want to obtain the wrong data type definition.
>> >>> Thanks for your help.
>> >>>
>> >>> Below is the previous message from Raymond Feng about associating
>> >>> databinding type system context/scope with a composite. I think this
>> >>> is related to my question but from a Tuscany SCA development
>> >>> perspective.
>> >>>
>> >>> How to associate some context with a composite?
>> >>> http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
>> >>>
>> >>
>> >> Hi,
>> >>
>> >> The short (and not perfect) answer to your question is: with the
>> >> current code in trunk, use:
>> >> commonj.sdo.impl.HelperProvider.getDefaultContext()
>> >>
>> >> But I thought about this a bit and your question triggered some
>> >> comments, and more questions :)
>> >>
>> >> Import.sdo extension:
>> >> I think we should be able to remove that Tuscany extension to SCA
>> >> assembly XML, now that we have the SCA contribution service in place.
>> >> We know which WSDLs and XSDs are available in a given SCA
>> >> contribution and, with sca-contribution.xml import elements, we also
>> >> know which XML namespaces are imported from other SCA contributions
>> >> or other locations outside of an SCA domain. So we probably don't
>> >> need another element duplicating part of this information in
>> >> .composite files.
>> >>
>> >> Scope of XML metadata:
>> >> My understanding of the SCA assembly spec is that the scope of XML
>> >> metadata is an SCA contribution (plus what it imports from outside)
>> >> and not an individual Composite.
>> >>
>> >> Scope of metadata contributed by Java classes:
>> >> Our runtime currently supports SCA contributions packaged as JARs or
>> >> file system folders. With these packaging schemes an SCA contribution
>> >> is self contained and cannot reference application classes in other
>> >> SCA contributions. At some point we'll probably want to support
>> >> packaging of SCA contributions as OSGI bundles and then leverage OSGI
>> >> to allow an OSGI bundle to see classes in another bundle, but we
>> >> don't support that OSGI packaging scheme yet. As a side comment I'd
>> >> like to see if we could reactivate some work on the OSGI extensions
>> >> that we have under java/sca/contrib/ and are not integrated in our
>> >> build at the moment. So, the scope of Java metadata is an SCA
>> >> contribution as well, with no external import mechanism.
>> >>
>> >> So the bottom line is:
>> >> References to types in SCA artifacts are resolved at the SCA
>> >> contribution level. There is no relationship between an SCA composite
>> >> and a metadata scope.
>> >>
>> >> More comments, on databinding specific handling of metadata:
>> >> We need to support multiple databindings. Each databinding comes with
>> >> its own form of metadata and different APIs to get to that metadata
>> >> and define metadata scopes. I guess it's important for a databinding
>> >> technology to define a way to scope metadata if it wants to be
>> >> successfully used in a server environment, and isolate the metadata
>> >> for the different applications running on the server.
>> >>
>> >> In such an environment, our SCA runtime should play nicely with the
>> >> other pieces of runtime and application code (not necessarily running
>> >> as SCA components), and use the metadata scoping mechanism defined by
>> >> each databinding in such a way that non-SCA code and SCA component
>> >> code running together in the server environment are able to see the
>> >> same metadata for a given application.
>> >>
>> >> I'd like to start a discussion to cover this aspect for our various
>> >> databindings and make sure that the metadata story for each
>> >> databinding holds together.
>> >>
>> >> To help feed this discussion with concrete data, could the SDO folks
>> >> jump in here, and describe the various ways of maintaining SDO
>> >> metadata scopes in a server environment, running with multiple
>> >> classloaders and threads?
>> >>
>> >> Thanks,
>> >>
>> >>
>> >
>> >
>> > -- 
>> > Jean-Sebastien
>> >
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe, e-mail: tuscany-dev-unsubscribe@ws.apache.org
>> > For additional commands, e-mail: tuscany-dev-help@ws.apache.org
>> >
>>
>>
>>
>>
>
>
> 




Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Fuhwei Lwo <fu...@bricemedia.com>.
Frank,

The approach I am thinking of is to make DefaultHelperContext a public interface whose implementation can be provided by others. This is similar to the way o.a.t.s.h.HelperProviderImpl was discovered in the META-INF/services/commonj.sdo.impl.HelperProvider file and bootstrapped. Of course, the default impl will still be using the TCCL.
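To make the idea concrete, here is a minimal sketch of that provider-discovery pattern. All the type names (HelperContextProvider, TcclDefaultProvider) are illustrative stand-ins, not real commonj.sdo or Tuscany classes:

```java
import java.util.Iterator;
import java.util.ServiceLoader;

public class HelperContextBootstrap {

    // Hypothetical provider interface; an implementation would be named in a
    // META-INF/services provider-configuration file, as with HelperProvider.
    public interface HelperContextProvider {
        Object getDefaultContext();
    }

    // Fallback mirroring the current TCCL-keyed default behavior.
    static class TcclDefaultProvider implements HelperContextProvider {
        public Object getDefaultContext() {
            // A real impl would look up a HelperContext keyed by the TCCL;
            // the loader itself stands in for that context here.
            return Thread.currentThread().getContextClassLoader();
        }
    }

    public static HelperContextProvider discoverProvider() {
        Iterator<HelperContextProvider> it =
                ServiceLoader.load(HelperContextProvider.class).iterator();
        // First registered provider wins; otherwise use the TCCL default.
        return it.hasNext() ? it.next() : new TcclDefaultProvider();
    }

    public static void main(String[] args) {
        System.out.println(discoverProvider().getClass().getSimpleName());
    }
}
```

A provider registered in META-INF/services would then replace the TCCL-based fallback transparently, without application code changing.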

Fuhwei

Frank Budinsky <fr...@ca.ibm.com> wrote: Hi Guys,

I finally managed to catch up on this. I think you're heading in the wrong 
direction by using HelperProvider.getDefaultContext() because:

1) it's a singleton, and we're trying to get away from singletons:

    static HelperContext defaultContext = new DefaultHelperContext();

The fact that the default context in Tuscany has some special TCCL-based 
management (under the covers - thanks to EMF) is a Tuscany-specific 
behavior - but the default context is itself a single object (singleton).

2) HelperProvider is an impl class (in package commonj.sdo.impl) ... it's 
bad to make that the application API.

The reason we added HelperContext to the spec was to get away from the 
singleton problems. The default context was needed as an interim step, 
until the spec comes up with other ways to access scoped HelperContexts. 
Many of the problems we had with EMF were exactly because we were trying 
to replace the global EMF INSTANCEs - e.g., EcoreFactory.eINSTANCE - but 
then we ran into problems because there were multiple apps trying to 
control the value. We don't want a repeat of that with the default 
context.

That said, I think that a better way to look at this is that we want a 
pluggable mechanism for managing HelperContexts in Tuscany. Since the 
ability to manage scope-specific contexts is a general issue, not just 
for SCA, it does seem to make sense to put this into Tuscany/SDO. Maybe we 
can feed back our (hopefully successful) result to the spec, so in the 
future there will be proper SDO APIs for this.

For now, however, we're talking about some new SDOUtil methods for this. 
For example, what if we provide a HelperContextManager, where you can 
register keyed HelperContexts. The key could be a ClassLoader, Thread, or 
anything else. Given such a manager, we could provide some SDOUtil methods 
to register and access the HelperContext. Maybe something like this:

HelperContext SDOUtil.getHelperContext(Object key);

and maybe even some convenience methods like this:

HelperContext SDOUtil.getHelperContext(Thread thread);
HelperContext SDOUtil.getLoaderHelperContext(ClassLoader classLoader);

These helper methods would be used by clients to access the HelperContext. 
The implementation would just use the HelperContextManager to get the 
keyed HelperContext. Maybe we could also provide ways to set up chained 
(according to parent CLs, for example) HelperContexts, etc.
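A rough sketch of what such a keyed manager could look like (the HelperContext stub and these method names are assumptions for illustration; the real SDOUtil signatures are still open):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class HelperContextManager {

    /** Local stand-in for commonj.sdo.helper.HelperContext. */
    public static class HelperContext {}

    private static final Map<Object, HelperContext> contexts =
            new ConcurrentHashMap<Object, HelperContext>();

    /** Return the context registered for the key, creating one on demand. */
    public static HelperContext getHelperContext(Object key) {
        HelperContext ctx = contexts.get(key);
        if (ctx == null) {
            // putIfAbsent keeps concurrent callers from clobbering each other.
            contexts.putIfAbsent(key, new HelperContext());
            ctx = contexts.get(key);
        }
        return ctx;
    }

    // Convenience methods keyed by Thread or ClassLoader, as proposed above.
    public static HelperContext getHelperContext(Thread thread) {
        return getHelperContext((Object) thread);
    }

    public static HelperContext getLoaderHelperContext(ClassLoader loader) {
        return getHelperContext((Object) loader);
    }

    public static void main(String[] args) {
        // The same key always yields the same scoped context.
        System.out.println(getHelperContext(Thread.currentThread())
                == getHelperContext(Thread.currentThread()));
    }
}
```

The putIfAbsent-then-get idiom keeps registration thread safe without locking on every lookup.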

Thoughts?

Frank

Fuhwei Lwo  wrote on 04/20/2007 01:36:37 PM:

> Raymond,
> 
> I agree with your suggestion below. In addition, I think SCA still 
> needs to provide an option (injection or API) for the applications 
> to explicitly retrieve the data model scope from the Contribution. 
> Other databinding technology APIs besides SDO may not have a default 
> context helper concept.
> 
> 1) SDO defines the pluggability to supply the default HelperContext.
> 2) SCA plugs its own scoping scheme to the SDO default HelperContext. The 
> HelperContext will be populated based on the Contribution.
> 3) Application code will use HelperProvider.getDefaultContext() to retrieve 
> the default HelperContext.
> 
> 
> Raymond Feng  wrote: Hi,
> 
> Please see my comments inline.
> 
> Thanks,
> Raymond
> 
> ----- Original Message ----- 
> From: "Jean-Sebastien Delfino" 
> To: 
> Sent: Friday, April 20, 2007 9:23 AM
> Subject: Re: Scoping SDO metadata, was: How to access a composite's data 
> model scope in an application?
> 
> 
> > Fuhwei Lwo wrote:
> >> Hi Sebastien,
> >>
> >> Here is my understanding of requirements about getting rid of import.sdo 
> >> and switching to contribution -
> >>
> >> 1) A contribution will be created by contribution processor for each 
> >> application. - Contribution processor has been done for Jar and file 
> >> system.
> >>
> >>
> >
> > Yes
> >
> >> 2) The contribution processor will create a SDO scope (HelperContext 
> >> instance) to associate with the contribution. Currently calling 
> >> SDOUtil.createHelperContext() is enough.
> >>
> >
> > That's what I was poking at in my previous email. Creating our own 
> > context, different from the default SDO context, forces SCA to introduce a 
> > new API to get to that context, and forces all SDO users to use that new 
> > API. So I'm wondering if it wouldn't be better to play more nicely with 
> > SDO, and have the SCA runtime just populate the default SDO context in use 
> > in a particular application in the server environment.
> >
> 
> I have a slightly different view here. IMHO, SDO should provide the 
> scoping mechanism and the pluggability of scoping schemes. I assume the 
> HelperContext is provided by SDO for scoping metadata. What's missing from 
> SDO is the pluggability of the scoping schemes. Currently, the default 
> HelperContext is based on TCCL and it's not replaceable. I agree SDO cannot 
> define scoping schemes for all environments, so the pluggability is desirable.
> 
> >> 3) Tuscany SCA needs to provide a way for the application to get hold of 
> >> the HelperContext in association with the contribution in step 2 above. 
> >> Currently the application is forced to use the SDO API - 
> >> HelperProvider.getDefaultContext() which is using TCCL.
> >>
> >
> > I'm not getting this one :) Is it bad for an SDO user to be "forced to" 
> > use an SDO API to get an SDO context? It seems better to me than forcing 
> > an SDO user to use an SCA API, simply because his code may be used at some 
> > point in an SCA environment... and then his code wouldn't work in a JSP, a 
> > servlet, or any other non-SCA environment...
> >
> > If the fact that HelperProvider.getDefaultContext() is using the TCCL to 
> > find the correct SDO context is a problem, then we just need to fix that. 
> > We went through the same discussion with SCA CompositeContext about a year 
> > ago. Associating context with the TCCL is not always convenient in a 
> > server environment, and it may be better to associate context with the 
> > current Thread (using a threadlocal or an inheritable thread local, for 
> > example). This is what we did for SCA CompositeContext. Maybe SDO could 
> > provide a way to associate an SDO context with the current thread instead 
> > of, or in addition to, associating the SDO context with the TCCL?
> 
> I agree that we should try to use the SDO API to retrieve the current 
> context. But I think in the SCA application, the default context should be 
> associated with the Contribution. Then it would be a win-win situation if we 
> can do the following:
> 
> 1) SDO defines the pluggability to supply the default HelperContext.
> 2) SCA plugs its own scoping scheme to the SDO default HelperContext. The 
> HelperContext will be populated based on the Contribution.
> 3) Application code will use HelperProvider.getDefaultContext() to retrieve 
> the default HelperContext.
> 
> >
> > This would seem a good thing to have anyway since these contexts are not 
> > thread safe as far as I know :)
> >
> > Thoughts?
> >> I am not sure my understanding above is correct so please bear with me. 
> >> Based on my understanding above, currently there is no additional 
> >> requirement from SDO.
> >
> > I wouldn't reach that conclusion so fast :) I think that there is a 
> > requirement to provide a way to get to an SDO context independent of TCCL 
> > if people don't like that association with TCCL.
> >
> >> In the future, if we decide to support contribution import/export, that 
> >> may require SDO scoping hierarchy support. But I think we should start 
> >> using contribution and getting rid of import.sdo as the first step.
> >>
> >>
> >
> > Yes I'd like to get rid of import.sdo, as I indicated earlier in this 
> > discussion thread.
> >
> > I would like to support contribution import/export at some point. I'm not 
> > sure that we'll be able to use SDO scope hierarchy support as an SCA 
> > contribution import does not necessarily import the whole scope of another 
> > SCA contribution, but I guess we'll know more when we start to look at the 
> > details.
> 
> I'm thinking of the following approach to discover SDO metadata from an SCA 
> contribution.
> 
> When the Contribution is processed, the generated SDO factories (the class 
> name and the namespace) are recognized. Other models such as WSDL/XSD are 
> handled as well. We don't have to convert all of them into SDO models upfront 
> as the conversion can be performed on demand upon the query of a particular 
> namespace.
> 
> >
> >> What do you think?  Thanks for your reply.
> >>
> >> Fuhwei Lwo
> >>
> >> Jean-Sebastien Delfino  wrote: Fuhwei Lwo wrote:
> >>
> >>> Hi,
> >>>
> >>> In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl"> 
> >>> in the default.scdl file that would prompt the SCA container to register 
> >>> my data types using SDO databinding. The question I have is what API I 
> >>> should use in my service implementation code to obtain the registered 
> >>> data types. If I have two composites that are using two different data 
> >>> type definitions but with the same namespace URI, I definitely don't 
> >>> want to obtain the wrong data type definition. Thanks for your help.
> >>>
> >>> Below is the previous message from Raymond Feng about associating 
> >>> databinding type system context/scope with a composite. I think this is 
> >>> related to my question but from a Tuscany SCA development perspective.
> >>>
> >>> How to associate some context with a composite?
> >>> http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
> >>>
> >>
> >> Hi,
> >>
> >> The short (and not perfect) answer to your question is: with the current 
> >> code in trunk, use:
> >> commonj.sdo.impl.HelperProvider.getDefaultContext()
> >>
> >> But I thought about this a bit and your question triggered some comments, 
> >> and more questions :)
> >>
> >> Import.sdo extension:
> >> I think we should be able to remove that Tuscany extension to SCA 
> >> assembly XML, now that we have the SCA contribution service in place. We 
> >> know which WSDLs and XSDs are available in a given SCA contribution and, 
> >> with sca-contribution.xml import elements, we also know which XML 
> >> namespaces are imported from other SCA contributions or other locations 
> >> outside of an SCA domain. So we probably don't need another <import.sdo> 
> >> element duplicating part of this information in .composite files.
> >>
> >> Scope of XML metadata:
> >> My understanding of the SCA assembly spec is that the scope of XML 
> >> metadata is an SCA contribution (plus what it imports from outside) and 
> >> not an individual Composite.
> >>
> >> Scope of metadata contributed by Java classes:
> >> Our runtime currently supports SCA contributions packaged as JARs or file 
> >> system folders. With these packaging schemes an SCA contribution is 
> >> self-contained and cannot reference application classes in other SCA 
> >> contributions. At some point we'll probably want to support packaging of 
> >> SCA contributions as OSGI bundles and then leverage OSGI to allow an OSGI 
> >> bundle to see classes in another bundle, but we don't support that OSGI 
> >> packaging scheme yet. As a side comment I'd like to see if we could 
> >> reactivate some work on the OSGI extensions that we have under 
> >> java/sca/contrib/ and are not integrated in our build at the moment. So, 
> >> the scope of Java metadata is an SCA contribution as well, with no 
> >> external import mechanism.
> >>
> >> So the bottom line is:
> >> References to types in SCA artifacts are resolved at the SCA contribution 
> >> level. There is no relationship between an SCA composite and a metadata 
> >> scope.
> >>
> >> More comments, on databinding-specific handling of metadata:
> >> We need to support multiple databindings. Each databinding comes with its 
> >> own form of metadata and different APIs to get to that metadata and 
> >> define metadata scopes. I guess it's important for a databinding 
> >> technology to define a way to scope metadata if it wants to be 
> >> successfully used in a server environment, and isolate the metadata for 
> >> the different applications running on the server.
> >>
> >> In such an environment, our SCA runtime should play nicely with the other 
> >> pieces of runtime and application code (not necessarily running as SCA 
> >> components), and use the metadata scoping mechanism defined by each 
> >> databinding in such a way that non-SCA code and SCA component code 
> >> running together in the server environment are able to see the same 
> >> metadata for a given application.
> >>
> >> I'd like to start a discussion to cover this aspect for our various 
> >> databindings and make sure that the metadata story for each databinding 
> >> holds together.
> >>
> >> To help feed this discussion with concrete data, could the SDO folks jump 
> >> in here, and describe the various ways of maintaining SDO metadata scopes 
> >> in a server environment, running with multiple classloaders and threads?
> >>
> >> Thanks,
> >>
> >>
> >
> >
> > -- 
> > Jean-Sebastien
> >
> >
> > 
> 
> 
> 
> 





Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Frank Budinsky <fr...@ca.ibm.com>.
Hi Guys,

I finally managed to catch up on this. I think you're heading in the wrong 
direction by using HelperProvider.getDefaultContext() because:

1) it's a singleton, and we're trying to get away from singletons:

    static HelperContext defaultContext = new DefaultHelperContext();

The fact that the default context in Tuscany has some special TCCL-based 
management (under the covers - thanks to EMF) is a Tuscany-specific 
behavior - but the default context is itself a single object (singleton).

2) HelperProvider is an impl class (in package commonj.sdo.impl) ... it's 
bad to make that the application API.

The reason we added HelperContext to the spec was to get away from the 
singleton problems. The default context was needed as an interim step, 
until the spec comes up with other ways to access scoped HelperContexts. 
Many of the problems we had with EMF were exactly because we were trying 
to replace the global EMF INSTANCEs - e.g., EcoreFactory.eINSTANCE - but 
then we ran into problems because there were multiple apps trying to 
control the value. We don't want a repeat of that with the default 
context.

That said, I think that a better way to look at this is that we want a 
pluggable mechanism for managing HelperContexts in Tuscany. Since the 
ability to manage scope-specific contexts is a general issue, not just 
for SCA, it does seem to make sense to put this into Tuscany/SDO. Maybe we 
can feed back our (hopefully successful) result to the spec, so in the 
future there will be proper SDO APIs for this.

For now, however, we're talking about some new SDOUtil methods for this. 
For example, what if we provide a HelperContextManager, where you can 
register keyed HelperContexts. The key could be a ClassLoader, Thread, or 
anything else. Given such a manager, we could provide some SDOUtil methods 
to register and access the HelperContext. Maybe something like this:

HelperContext SDOUtil.getHelperContext(Object key);

and maybe even some convenience methods like this:

HelperContext SDOUtil.getHelperContext(Thread thread);
HelperContext SDOUtil.getLoaderHelperContext(ClassLoader classLoader);

These helper methods would be used by clients to access the HelperContext. 
The implementation would just use the HelperContextManager to get the 
keyed HelperContext. Maybe we could also provide ways to set up chained 
(according to parent CLs, for example) HelperContexts, etc.

Thoughts?

Frank

Fuhwei Lwo <fu...@bricemedia.com> wrote on 04/20/2007 01:36:37 PM:

> Raymond,
> 
> I agree with your suggestion below. In addition, I think SCA still 
> needs to provide an option (injection or API) for the applications 
> to explicitly retrieve the data model scope from the Contribution. 
> Other databinding technology APIs besides SDO may not have a default 
> context helper concept.
> 
> 1) SDO defines the pluggability to supply the default HelperContext.
> 2) SCA plugs its own scoping scheme to the SDO default HelperContext. The 
> HelperContext will be populated based on the Contribution.
> 3) Application code will use HelperProvider.getDefaultContext() to retrieve 
> the default HelperContext.
> 
> 
> Raymond Feng <en...@gmail.com> wrote: Hi,
> 
> Please see my comments inline.
> 
> Thanks,
> Raymond
> 
> ----- Original Message ----- 
> From: "Jean-Sebastien Delfino" 
> To: 
> Sent: Friday, April 20, 2007 9:23 AM
> Subject: Re: Scoping SDO metadata, was: How to access a composite's data 
> model scope in an application?
> 
> 
> > Fuhwei Lwo wrote:
> >> Hi Sebastien,
> >>
> >> Here is my understanding of requirements about getting rid of import.sdo 
> >> and switching to contribution -
> >>
> >> 1) A contribution will be created by contribution processor for each 
> >> application. - Contribution processor has been done for Jar and file 
> >> system.
> >>
> >>
> >
> > Yes
> >
> >> 2) The contribution processor will create a SDO scope (HelperContext 
> >> instance) to associate with the contribution. Currently calling 
> >> SDOUtil.createHelperContext() is enough.
> >>
> >
> > That's what I was poking at in my previous email. Creating our own 
> > context, different from the default SDO context, forces SCA to introduce a 
> > new API to get to that context, and forces all SDO users to use that new 
> > API. So I'm wondering if it wouldn't be better to play more nicely with 
> > SDO, and have the SCA runtime just populate the default SDO context in use 
> > in a particular application in the server environment.
> >
> 
> I have a slightly different view here. IMHO, SDO should provide the 
> scoping mechanism and the pluggability of scoping schemes. I assume the 
> HelperContext is provided by SDO for scoping metadata. What's missing from 
> SDO is the pluggability of the scoping schemes. Currently, the default 
> HelperContext is based on TCCL and it's not replaceable. I agree SDO cannot 
> define scoping schemes for all environments, so the pluggability is desirable.
> 
> >> 3) Tuscany SCA needs to provide a way for the application to get hold of 
> >> the HelperContext in association with the contribution in step 2 above. 
> >> Currently the application is forced to use the SDO API - 
> >> HelperProvider.getDefaultContext() which is using TCCL.
> >>
> >
> > I'm not getting this one :) Is it bad for an SDO user to be "forced to" 
> > use an SDO API to get an SDO context? It seems better to me than forcing 
> > an SDO user to use an SCA API, simply because his code may be used at some 
> > point in an SCA environment... and then his code wouldn't work in a JSP, a 
> > servlet, or any other non-SCA environment...
> >
> > If the fact that HelperProvider.getDefaultContext() is using the TCCL to 
> > find the correct SDO context is a problem, then we just need to fix that. 
> > We went through the same discussion with SCA CompositeContext about a year 
> > ago. Associating context with the TCCL is not always convenient in a 
> > server environment, and it may be better to associate context with the 
> > current Thread (using a threadlocal or an inheritable thread local, for 
> > example). This is what we did for SCA CompositeContext. Maybe SDO could 
> > provide a way to associate an SDO context with the current thread instead 
> > of, or in addition to, associating the SDO context with the TCCL?
> 
> I agree that we should try to use the SDO API to retrieve the current 
> context. But I think in the SCA application, the default context should be 
> associated with the Contribution. Then it would be a win-win situation if we 
> can do the following:
> 
> 1) SDO defines the pluggability to supply the default HelperContext.
> 2) SCA plugs its own scoping scheme to the SDO default HelperContext. The 
> HelperContext will be populated based on the Contribution.
> 3) Application code will use HelperProvider.getDefaultContext() to retrieve 
> the default HelperContext.
> 
> >
> > This would seem a good thing to have anyway since these contexts are not 
> > thread safe as far as I know :)
> >
> > Thoughts?
> >> I am not sure my understanding above is correct so please bear with me. 
> >> Based on my understanding above, currently there is no additional 
> >> requirement from SDO.
> >
> > I wouldn't reach that conclusion so fast :) I think that there is a 
> > requirement to provide a way to get to an SDO context independent of TCCL 
> > if people don't like that association with TCCL.
> >
> >> In the future, if we decide to support contribution import/export, that 
> >> may require SDO scoping hierarchy support. But I think we should start 
> >> using contribution and getting rid of import.sdo as the first step.
> >>
> >>
> >
> > Yes I'd like to get rid of import.sdo, as I indicated earlier in this 
> > discussion thread.
> >
> > I would like to support contribution import/export at some point. I'm not 
> > sure that we'll be able to use SDO scope hierarchy support as an SCA 
> > contribution import does not necessarily import the whole scope of another 
> > SCA contribution, but I guess we'll know more when we start to look at the 
> > details.
> 
> I'm thinking of the following approach to discover SDO metadata from an SCA 
> contribution.
> 
> When the Contribution is processed, the generated SDO factories (the class 
> name and the namespace) are recognized. Other models such as WSDL/XSD are 
> handled as well. We don't have to convert all of them into SDO models upfront 
> as the conversion can be performed on demand upon the query of a particular 
> namespace.
> 
> >
> >> What do you think?  Thanks for your reply.
> >>
> >> Fuhwei Lwo
> >>
> >> Jean-Sebastien Delfino  wrote: Fuhwei Lwo wrote:
> >>
> >>> Hi,
> >>>
> >>> In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl"> 
> >>> in the default.scdl file that would prompt the SCA container to register 
> >>> my data types using SDO databinding. The question I have is what API I 
> >>> should use in my service implementation code to obtain the registered 
> >>> data types. If I have two composites that are using two different data 
> >>> type definitions but with the same namespace URI, I definitely don't 
> >>> want to obtain the wrong data type definition. Thanks for your help.
> >>>
> >>> Below is the previous message from Raymond Feng about associating 
> >>> databinding type system context/scope with a composite. I think this is 
> >>> related to my question but from a Tuscany SCA development perspective.
> >>>
> >>> How to associate some context with a composite?
> >>> http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
> >>>
> >>
> >> Hi,
> >>
> >> The short (and not perfect) answer to your question is: with the current 
> >> code in trunk, use:
> >> commonj.sdo.impl.HelperProvider.getDefaultContext()
> >>
> >> But I thought about this a bit and your question triggered some comments, 
> >> and more questions :)
> >>
> >> Import.sdo extension:
> >> I think we should be able to remove that Tuscany extension to SCA 
> >> assembly XML, now that we have the SCA contribution service in place. We 
> >> know which WSDLs and XSDs are available in a given SCA contribution and, 
> >> with sca-contribution.xml import elements, we also know which XML 
> >> namespaces are imported from other SCA contributions or other locations 
> >> outside of an SCA domain. So we probably don't need another <import.sdo> 
> >> element duplicating part of this information in .composite files.
> >>
> >> Scope of XML metadata:
> >> My understanding of the SCA assembly spec is that the scope of XML 
> >> metadata is an SCA contribution (plus what it imports from outside) and 
> >> not an individual Composite.
> >>
> >> Scope of metadata contributed by Java classes:
> >> Our runtime currently supports SCA contributions packaged as JARs or file 
> >> system folders. With these packaging schemes an SCA contribution is 
> >> self-contained and cannot reference application classes in other SCA 
> >> contributions. At some point we'll probably want to support packaging of 
> >> SCA contributions as OSGI bundles and then leverage OSGI to allow an OSGI 
> >> bundle to see classes in another bundle, but we don't support that OSGI 
> >> packaging scheme yet. As a side comment I'd like to see if we could 
> >> reactivate some work on the OSGI extensions that we have under 
> >> java/sca/contrib/ and are not integrated in our build at the moment. So, 
> >> the scope of Java metadata is an SCA contribution as well, with no 
> >> external import mechanism.
> >>
> >> So the bottom line is:
> >> References to types in SCA artifacts are resolved at the SCA contribution 
> >> level. There is no relationship between an SCA composite and a metadata 
> >> scope.
> >>
> >> More comments, on databinding-specific handling of metadata:
> >> We need to support multiple databindings. Each databinding comes with its 
> >> own form of metadata and different APIs to get to that metadata and 
> >> define metadata scopes. I guess it's important for a databinding 
> >> technology to define a way to scope metadata if it wants to be 
> >> successfully used in a server environment, and isolate the metadata for 
> >> the different applications running on the server.
> >>
> >> In such an environment, our SCA runtime should play nicely with the other 
> >> pieces of runtime and application code (not necessarily running as SCA 
> >> components), and use the metadata scoping mechanism defined by each 
> >> databinding in such a way that non-SCA code and SCA component code 
> >> running together in the server environment are able to see the same 
> >> metadata for a given application.
> >>
> >> I'd like to start a discussion to cover this aspect for our various 
> >> databindings and make sure that the metadata story for each databinding 
> >> holds together.
> >>
> >> To help feed this discussion with concrete data, could the SDO folks jump 
> >> in here, and describe the various ways of maintaining SDO metadata scopes 
> >> in a server environment, running with multiple classloaders and threads?
> >>
> >> Thanks,
> >>
> >>
> >
> >
> > -- 
> > Jean-Sebastien
> >
> >
> > 
> 
> 
> 
> 




Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Fuhwei Lwo <fu...@bricemedia.com>.
Raymond,

I agree with your suggestion below. In addition, I think SCA still needs to provide an option (injection or API) for the applications to explicitly retrieve the data model scope from the Contribution. Other databinding technology APIs besides SDO may not have a default context helper concept.

1) SDO defines the pluggability to supply the default HelperContext.
2) SCA plugs its own scoping scheme to the SDO default HelperContext. The 
HelperContext will be populated based on the Contribution.
3) Application code will use HelperProvider.getDefaultContext() to retrieve 
the default HelperContext.
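The three steps above can be sketched with stand-in types as follows. HelperProvider.factory, ContextFactory, and the ThreadLocal standing in for a per-Contribution scope are all hypothetical names, not actual SDO or Tuscany APIs:

```java
public class ContributionScopingSketch {

    /** Stand-in for commonj.sdo.helper.HelperContext. */
    public static class HelperContext {}

    /** Step 1: SDO exposes a pluggable factory behind its default context. */
    public static class HelperProvider {
        public interface ContextFactory {
            HelperContext current();
        }

        // Out of the box: one global context, like today's singleton default.
        static ContextFactory factory = new ContextFactory() {
            private final HelperContext global = new HelperContext();
            public HelperContext current() { return global; }
        };

        public static HelperContext getDefaultContext() {
            return factory.current();
        }
    }

    /** Step 2: the SCA runtime plugs in a contribution-keyed scheme; a
     *  ThreadLocal stands in for "the contribution running on this thread". */
    static final ThreadLocal<HelperContext> contributionScope =
            new ThreadLocal<HelperContext>() {
                protected HelperContext initialValue() { return new HelperContext(); }
            };

    public static void main(String[] args) {
        HelperProvider.factory = new HelperProvider.ContextFactory() {
            public HelperContext current() { return contributionScope.get(); }
        };
        // Step 3: application code keeps calling the plain SDO entry point
        // and transparently gets the contribution-scoped context.
        System.out.println(HelperProvider.getDefaultContext() == contributionScope.get());
    }
}
```

The point of the sketch is that only step 2 touches SCA-specific code; application code in step 3 stays portable to non-SCA environments.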


Raymond Feng <en...@gmail.com> wrote: Hi,

Please see my comments inline.

Thanks,
Raymond

----- Original Message ----- 
From: "Jean-Sebastien Delfino" 
To: 
Sent: Friday, April 20, 2007 9:23 AM
Subject: Re: Scoping SDO metadata, was: How to access a composite's data 
model scope in an application?


> Fuhwei Lwo wrote:
>> Hi Sebastien,
>>
>> Here is my understanding of requirements about getting rid of import.sdo 
>> and switching to contribution -
>>
>> 1) A contribution will be created by contribution processor for each 
>> application. - Contribution processor has been done for Jar and file 
>> system.
>>
>>
>
> Yes
>
>> 2) The contribution processor will create a SDO scope (HelperContext 
>> instance) to associate with the contribution. Currently calling 
>> SDOUtil.createHelperContext() is enough.
>>
>
> That's what I was poking at in my previous email. Creating our own 
> context, different from the default SDO context forces SCA to introduce a 
> new API to get to that context, and forces all SDO users to use that new 
> API. So I'm wondering if it wouldn't be better to play more nicely with 
> SDO, and have the SCA runtime just populate the default SDO context in use 
> in a particular application in the server environment.
>

I have a slightly different view here. IMHO, SDO should provide the 
scoping mechanism and the pluggability of scoping schemes. I assume the 
HelperContext is provided by SDO for scoping metadata. What's missing from 
SDO is the pluggability of the scoping schemes. Currently, the default 
HelperContext is based on TCCL and it's not replaceable. I agree SDO cannot 
define scoping schemes for all environments, so the pluggability is desirable.
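For illustration, the two association schemes discussed in this thread - a TCCL-keyed default versus a thread-keyed one like SCA CompositeContext - can be contrasted in a small sketch (all names here are stand-ins, not the real implementation):

```java
import java.util.Map;
import java.util.WeakHashMap;

public class ContextAssociationSketch {

    /** Stand-in for commonj.sdo.helper.HelperContext. */
    public static class HelperContext {}

    // Scheme 1: TCCL-keyed, like the current Tuscany default. A weak map
    // lets contexts be collected along with their classloaders.
    static final Map<ClassLoader, HelperContext> byLoader =
            new WeakHashMap<ClassLoader, HelperContext>();

    static synchronized HelperContext tcclContext() {
        ClassLoader tccl = Thread.currentThread().getContextClassLoader();
        HelperContext ctx = byLoader.get(tccl);
        if (ctx == null) {
            ctx = new HelperContext();
            byLoader.put(tccl, ctx);
        }
        return ctx;
    }

    // Scheme 2: thread-keyed, inherited by child threads, as was done
    // for SCA CompositeContext.
    static final InheritableThreadLocal<HelperContext> byThread =
            new InheritableThreadLocal<HelperContext>() {
                protected HelperContext initialValue() { return new HelperContext(); }
            };

    public static void main(String[] args) throws Exception {
        final HelperContext parent = byThread.get();
        final boolean[] inherited = new boolean[1];
        Thread child = new Thread() {
            public void run() { inherited[0] = (byThread.get() == parent); }
        };
        child.start();
        child.join();
        // With an inheritable thread local, the child sees the parent's context.
        System.out.println(inherited[0]);
    }
}
```

Either scheme could sit behind a pluggable default-context lookup; the trade-off is which key (classloader or thread) best matches the server's unit of isolation.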

>> 3) Tuscany SCA needs to provide a way for the application to get hold of 
>> the HelperContext in association with the contribution in step 2 above. 
>> Currently the application is forced to use the SDO API - 
>> HelperProvider.getDefaultContext() which is using TCCL.
>>
>
> I'm not getting this one :) Is it bad for an SDO user to be "forced to" 
> use an SDO API to get an SDO context? It seems better to me than forcing 
> an SDO user to use an SCA API, simply because his code may be used at some 
> point in an SCA environment... and then his code wouldn't work in a JSP, a 
> servlet, or any other non-SCA environment...
>
> If the fact that HelperProvider.getDefaultContext() is using the TCCL to 
> find the correct SDO context is a problem, then we just need to fix that. 
> We went through the same discussion with SCA CompositeContext about a year 
> ago. Associating context with the TCCL is not always convenient in a 
> server environment, and it may be better to associate context with the 
> current Thread (using a threadlocal or an inheritable thread local for 
> example). This is what we did for SCA CompositeContext. Maybe SDO could 
> provide a way to associate an SDO context with the current thread instead 
> of, or in addition to, associating the SDO context with the TCCL?

I agree that we should try to use the SDO API to retrieve the current 
context. But I think that in an SCA application the default context should be 
associated with the Contribution. It would then be a win-win situation if we 
can do the following:

1) SDO defines the pluggability to supply the default HelperContext.
2) SCA plugs its own scoping scheme to the SDO default HelperContext. The 
HelperContext will be populated based on the Contribution.
3) Application code will use HelperProvider.getDefaultContext() to retrieve 
the default HelperContext.
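To make the payoff of steps 1-3 concrete: with one scope created per 
contribution (step 2), two applications can register different types under 
the same namespace URI without clashing, which is exactly the concern that 
started this thread. A self-contained sketch, using a stand-in class rather 
than the real commonj.sdo helpers:

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for a per-contribution HelperContext with its own type registry
// (the real HelperContext wraps TypeHelper, DataFactory, XSDHelper, etc.).
final class ContributionScope {
    // "namespaceURI#typeName" -> type definition registered in this scope only
    private final Map<String, String> types = new HashMap<>();

    void define(String uri, String name, String definition) {
        types.put(uri + "#" + name, definition);
    }

    String getType(String uri, String name) {
        return types.get(uri + "#" + name);
    }
}
```

Two contributions each get their own scope, so defining "Customer" under the 
same URI in both yields two independent types; whichever scope the default 
context resolves to determines which definition the application sees.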

>
> This would seem a good thing to have anyway since these contexts are not 
> thread safe as far as I know :)
>
> Thoughts?
>> I am not sure my understanding above is correct so please bear with me. 
>> Based on my understanding above, currently there is no additional 
>> requirement from SDO.
>
> I wouldn't reach that conclusion so fast :) I think that there is a 
> requirement to provide a way  to get to an SDO context independent of TCCL 
> if people don't like that association with TCCL.
>
>> In the future, if we decided to support contribution import/export that 
>> may require SDO scoping hierarchy support. But I think we should start 
>> using contribution and getting rid of import.sdo as the first step.
>>
>>
>
> Yes I'd like to get rid of import.sdo, as I indicated earlier in this 
> discussion thread.
>
> I would like to support contribution import/export at some point. I'm not 
> sure that we'll be able to use SDO scope hierarchy support as an SCA 
> contribution import does not necessarily import the whole scope of another 
> SCA contribution, but I guess we'll know more when we start to look at the 
> details.

I'm thinking of the following approach to discovering SDO metadata from an 
SCA contribution.

When the Contribution is processed, the generated SDO factories (the class 
name and the namespace) are recognized. Other models such as WSDL/XSD are 
handled as well. We don't have to convert all of them into the SDO model 
upfront, as the conversion can be performed on demand when a particular 
namespace is queried.
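A minimal sketch of that lazy discovery, with hypothetical names (the real 
contribution processor and WSDL/XSD-to-SDO conversion are of course more 
involved): at scan time only the namespace-to-artifact mapping is recorded; 
the conversion runs on the first query for a namespace and is then cached.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Lazy metadata registry for a contribution: recording which artifact
// (generated factory class, WSDL, XSD) defines each namespace is cheap;
// converting it into an SDO model is deferred to the first query.
final class MetadataRegistry {
    private final Map<String, String> artifactsByNs = new HashMap<>(); // ns -> artifact
    private final Map<String, Object> modelsByNs = new HashMap<>();    // ns -> converted model
    private final Function<String, Object> converter;                  // artifact -> model
    int conversions = 0; // exposed only so the sketch can show the laziness

    MetadataRegistry(Function<String, Object> converter) {
        this.converter = converter;
    }

    // Called while the contribution is processed.
    void record(String namespace, String artifact) {
        artifactsByNs.put(namespace, artifact);
    }

    // Called when somebody queries a namespace; converts on demand, once.
    Object resolve(String namespace) {
        return modelsByNs.computeIfAbsent(namespace, ns -> {
            conversions++;
            return converter.apply(artifactsByNs.get(ns));
        });
    }
}
```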

>
>> What do you think?  Thanks for your reply.
>>
>> Fuhwei Lwo
>>
>> Jean-Sebastien Delfino  wrote: Fuhwei Lwo wrote:
>>
>>> Hi,
>>>
>>> In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl"> 
>>> in the default.scdl file that would prompt the SCA container to register 
>>> my data types using SDO databinding. The question I have is what API I 
>>> should use in my service implementation code to obtain the registered 
>>> data types. If I have two composites that are using two different data 
>>> type definitions but with the same namespace URI, I definitely don't want 
>>> to obtain the wrong data type definition. Thanks for your help.
>>>
>>> Below is the previous message from Raymond Feng about associating 
>>> databinding type system context/scope with a composite. I think this is 
>>> related to my question but from Tuscany SCA development perspective.
>>>
>>> How to associate some context with a composite?
>>> http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
>>>
>>
>> Hi,
>>
>> The short (and not perfect) answer to your question is: with the current 
>> code in trunk, use 
>> commonj.sdo.impl.HelperProvider.getDefaultContext()
>>
>> But I thought about this a bit and your question triggered some comments, 
>> and more questions :)
>>
>> Import.sdo extension:
>> I think we should be able to remove that Tuscany extension to SCA 
>> assembly XML, now that we have the SCA contribution service in place. We 
>> know which WSDLs and XSDs are available in a given SCA contribution and, 
>> with sca-contribution.xml import elements, we also know which XML 
>> namespaces are imported from other SCA contributions or other locations 
>> outside of an SCA domain. So we probably don't need another <import.sdo> 
>> element duplicating part of this information in .composite files.
>>
>> Scope of XML metadata:
>> My understanding of the SCA assembly spec is that the scope of XML 
>> metadata is an SCA contribution (plus what it imports from outside) and 
>> not an individual Composite.
>>
>> Scope of metadata contributed by Java classes:
>> Our runtime currently supports SCA contributions packaged as JARs or file 
>> system folders. With these packaging schemes an SCA contribution is self 
>> contained and cannot reference application classes in other SCA 
>> contributions. At some point we'll probably want to support packaging of 
>> SCA contributions as OSGI bundles and then leverage OSGI to allow an OSGI 
>> bundle to see classes in another bundle, but we don't support that OSGI 
>> packaging scheme yet. As a side comment I'd like to see if we could 
>> reactivate some work on the OSGI extensions that we have under 
>> java/sca/contrib/ and are not integrated in our build at the moment. So, 
>> the scope of Java metadata is an SCA contribution as well, with no 
>> external import mechanism.
>>
>> So the bottom line is:
>> References to types in SCA artifacts are resolved at the SCA contribution 
>> level. There is no relationship between an SCA composite and a metadata 
>> scope.
>>
>> More comments, on databinding specific handling of metadata:
>> We need to support multiple databindings. Each databinding comes with its 
>> own form of metadata and different APIs to get to that metadata and 
>> define metadata scopes. I guess it's important for a databinding 
>> technology to define a way to scope metadata if it wants to be 
>> successfully used in a server environment, and isolate the metadata for 
>> the different applications running on the server.
>>
>> In such an environment, our SCA runtime should play nicely with the other 
>> pieces of runtime and application code (not necessarily running as SCA 
>> components), and use the metadata scoping mechanism defined by each 
>> databinding in such a way that non-SCA code and SCA component code 
>> running together in the server environment are able to see the same 
>> metadata for a given application.
>>
>> I'd like to start a discussion to cover this aspect for our various 
>> databindings and make sure that the metadata story for each databinding 
>> holds together.
>>
>> To help feed this discussion with concrete data, could the SDO folks jump 
>> in here, and describe the various ways of maintaining SDO metadata scopes 
>> in a server environment, running with multiple classloaders and threads?
>>
>> Thanks,
>>
>>
>
>
> -- 
> Jean-Sebastien
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: tuscany-dev-unsubscribe@ws.apache.org
> For additional commands, e-mail: tuscany-dev-help@ws.apache.org
> 





Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Raymond Feng <en...@gmail.com>.
Hi,

Please see my comments inline.

Thanks,
Raymond

----- Original Message ----- 
From: "Jean-Sebastien Delfino" <js...@apache.org>
To: <tu...@ws.apache.org>
Sent: Friday, April 20, 2007 9:23 AM
Subject: Re: Scoping SDO metadata, was: How to access a composite's data 
model scope in an application?


> Fuhwei Lwo wrote:
>> Hi Sebastien,
>>
>> Here is my understanding of requirements about getting rid of import.sdo 
>> and switching to contribution -
>>
>> 1) A contribution will be created by contribution processor for each 
>> application. - Contribution processor has been done for Jar and file 
>> system.
>>
>>
>
> Yes
>
>> 2) The contribution processor will create an SDO scope (HelperContext 
>> instance) to associate with the contribution. Currently, calling 
>> SDOUtil.createHelperContext() is enough.
>>
>
> That's what I was poking at in my previous email. Creating our own 
> context, different from the default SDO context, forces SCA to introduce a 
> new API to get to that context, and forces all SDO users to use that new 
> API. So I'm wondering if it wouldn't be better to play more nicely with 
> SDO, and have the SCA runtime just populate the default SDO context in use 
> in a particular application in the server environment.
>

I have a slightly different view here. IMHO, SDO should provide both the 
scoping mechanism and the pluggability of scoping schemes. I assume 
HelperContext is what SDO provides for scoping metadata; what's missing from 
SDO is the pluggability of the scoping schemes. Currently, the default 
HelperContext is based on the TCCL and is not replaceable. I agree SDO cannot 
define scoping schemes for every environment, so pluggability is desirable.

>> 3) Tuscany SCA needs to provide a way for the application to get hold of 
>> the HelperContext associated with the contribution in step 2 above. 
>> Currently the application is forced to use the SDO API - 
>> HelperProvider.getDefaultContext() - which uses the TCCL.
>>
>
> I'm not getting this one :) Is it bad for an SDO user to be "forced to" 
> use an SDO API to get an SDO context? It seems better to me than forcing 
> an SDO user to use an SCA API, simply because his code may be used at some 
> point in an SCA environment... and then his code wouldn't work in a JSP, a 
> servlet, or any other non-SCA environment...
>
> If the fact that HelperProvider.getDefaultContext() is using the TCCL to 
> find the correct SDO context is a problem, then we just need to fix that. 
> We went through the same discussion with SCA CompositeContext about a year 
> ago. Associating context with the TCCL is not always convenient in a 
> server environment, and it may be better to associate context with the 
> current Thread (using a threadlocal or an inheritable thread local for 
> example). This is what we did for SCA CompositeContext. Maybe SDO could 
> provide a way to associate an SDO context with the current thread instead 
> of, or in addition to, associating the SDO context with the TCCL?

I agree that we should try to use the SDO API to retrieve the current 
context. But I think that in an SCA application the default context should be 
associated with the Contribution. It would then be a win-win situation if we 
can do the following:

1) SDO defines the pluggability to supply the default HelperContext.
2) SCA plugs its own scoping scheme to the SDO default HelperContext. The 
HelperContext will be populated based on the Contribution.
3) Application code will use HelperProvider.getDefaultContext() to retrieve 
the default HelperContext.

>
> This would seem a good thing to have anyway since these contexts are not 
> thread safe as far as I know :)
>
> Thoughts?
>> I am not sure my understanding above is correct so please bear with me. 
>> Based on my understanding above, currently there is no additional 
>> requirement from SDO.
>
> I wouldn't reach that conclusion so fast :) I think that there is a 
> requirement to provide a way  to get to an SDO context independent of TCCL 
> if people don't like that association with TCCL.
>
>> In the future, if we decided to support contribution import/export that 
>> may require SDO scoping hierarchy support. But I think we should start 
>> using contribution and getting rid of import.sdo as the first step.
>>
>>
>
> Yes I'd like to get rid of import.sdo, as I indicated earlier in this 
> discussion thread.
>
> I would like to support contribution import/export at some point. I'm not 
> sure that we'll be able to use SDO scope hierarchy support as an SCA 
> contribution import does not necessarily import the whole scope of another 
> SCA contribution, but I guess we'll know more when we start to look at the 
> details.

I'm thinking of the following approach to discovering SDO metadata from an 
SCA contribution.

When the Contribution is processed, the generated SDO factories (the class 
name and the namespace) are recognized. Other models such as WSDL/XSD are 
handled as well. We don't have to convert all of them into the SDO model 
upfront, as the conversion can be performed on demand when a particular 
namespace is queried.

>
>> What do you think?  Thanks for your reply.
>>
>> Fuhwei Lwo
>>
>> Jean-Sebastien Delfino <js...@apache.org> wrote: Fuhwei Lwo wrote:
>>
>>> Hi,
>>>
>>> In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl"> 
>>> in the default.scdl file that would prompt the SCA container to register 
>>> my data types using SDO databinding. The question I have is what API I 
>>> should use in my service implementation code to obtain the registered 
>>> data types. If I have two composites that are using two different data 
>>> type definitions but with the same namespace URI, I definitely don't want 
>>> to obtain the wrong data type definition. Thanks for your help.
>>>
>>> Below is the previous message from Raymond Feng about associating 
>>> databinding type system context/scope with a composite. I think this is 
>>> related to my question but from Tuscany SCA development perspective.
>>>
>>> How to associate some context with a composite?
>>> http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
>>>
>>
>> Hi,
>>
>> The short (and not perfect) answer to your question is: with the current 
>> code in trunk, use 
>> commonj.sdo.impl.HelperProvider.getDefaultContext()
>>
>> But I thought about this a bit and your question triggered some comments, 
>> and more questions :)
>>
>> Import.sdo extension:
>> I think we should be able to remove that Tuscany extension to SCA 
>> assembly XML, now that we have the SCA contribution service in place. We 
>> know which WSDLs and XSDs are available in a given SCA contribution and, 
>> with sca-contribution.xml import elements, we also know which XML 
>> namespaces are imported from other SCA contributions or other locations 
>> outside of an SCA domain. So we probably don't need another <import.sdo> 
>> element duplicating part of this information in .composite files.
>>
>> Scope of XML metadata:
>> My understanding of the SCA assembly spec is that the scope of XML 
>> metadata is an SCA contribution (plus what it imports from outside) and 
>> not an individual Composite.
>>
>> Scope of metadata contributed by Java classes:
>> Our runtime currently supports SCA contributions packaged as JARs or file 
>> system folders. With these packaging schemes an SCA contribution is self 
>> contained and cannot reference application classes in other SCA 
>> contributions. At some point we'll probably want to support packaging of 
>> SCA contributions as OSGI bundles and then leverage OSGI to allow an OSGI 
>> bundle to see classes in another bundle, but we don't support that OSGI 
>> packaging scheme yet. As a side comment I'd like to see if we could 
>> reactivate some work on the OSGI extensions that we have under 
>> java/sca/contrib/ and are not integrated in our build at the moment. So, 
>> the scope of Java metadata is an SCA contribution as well, with no 
>> external import mechanism.
>>
>> So the bottom line is:
>> References to types in SCA artifacts are resolved at the SCA contribution 
>> level. There is no relationship between an SCA composite and a metadata 
>> scope.
>>
>> More comments, on databinding specific handling of metadata:
>> We need to support multiple databindings. Each databinding comes with its 
>> own form of metadata and different APIs to get to that metadata and 
>> define metadata scopes. I guess it's important for a databinding 
>> technology to define a way to scope metadata if it wants to be 
>> successfully used in a server environment, and isolate the metadata for 
>> the different applications running on the server.
>>
>> In such an environment, our SCA runtime should play nicely with the other 
>> pieces of runtime and application code (not necessarily running as SCA 
>> components), and use the metadata scoping mechanism defined by each 
>> databinding in such a way that non-SCA code and SCA component code 
>> running together in the server environment are able to see the same 
>> metadata for a given application.
>>
>> I'd like to start a discussion to cover this aspect for our various 
>> databindings and make sure that the metadata story for each databinding 
>> holds together.
>>
>> To help feed this discussion with concrete data, could the SDO folks jump 
>> in here, and describe the various ways of maintaining SDO metadata scopes 
>> in a server environment, running with multiple classloaders and threads?
>>
>> Thanks,
>>
>>
>
>
> -- 
> Jean-Sebastien
>
>
> 




Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Jean-Sebastien Delfino <js...@apache.org>.
Fuhwei Lwo wrote:
> Hi Sebastien,
>
> Here is my understanding of requirements about getting rid of import.sdo and switching to contribution -
>
> 1) A contribution will be created by contribution processor for each application. - Contribution processor has been done for Jar and file system.
>
>   

Yes

> 2) The contribution processor will create an SDO scope (HelperContext instance) to associate with the contribution. Currently, calling SDOUtil.createHelperContext() is enough.
>   

That's what I was poking at in my previous email. Creating our own 
context, different from the default SDO context, forces SCA to introduce 
a new API to get to that context, and forces all SDO users to use that 
new API. So I'm wondering if it wouldn't be better to play more nicely 
with SDO, and have the SCA runtime just populate the default SDO context 
in use in a particular application in the server environment.

> 3) Tuscany SCA needs to provide a way for the application to get hold of the HelperContext associated with the contribution in step 2 above. Currently the application is forced to use the SDO API - HelperProvider.getDefaultContext() - which uses the TCCL.
>   

I'm not getting this one :) Is it bad for an SDO user to be "forced to" 
use an SDO API to get an SDO context? It seems better to me than forcing 
an SDO user to use an SCA API, simply because his code may be used at 
some point in an SCA environment... and then his code wouldn't work in a 
JSP, a servlet, or any other non-SCA environment...

If the fact that HelperProvider.getDefaultContext() is using the TCCL to 
find the correct SDO context is a problem, then we just need to fix 
that. We went through the same discussion with SCA CompositeContext 
about a year ago. Associating context with the TCCL is not always 
convenient in a server environment, and it may be better to associate 
context with the current Thread (using a threadlocal or an inheritable 
thread local for example). This is what we did for SCA CompositeContext. 
Maybe SDO could provide a way to associate an SDO context with the 
current thread instead of, or in addition to, associating the SDO context 
with the TCCL?
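For illustration, that thread association could look like the sketch below, 
using an InheritableThreadLocal so that worker threads spawned by the 
application inherit the parent's context. The class and its methods are 
hypothetical, not part of commonj.sdo, and a String context id stands in for 
a HelperContext:

```java
// Hypothetical thread-based context association, modeled on what was done
// for SCA CompositeContext. A String context id stands in for a HelperContext.
final class ThreadScopedContext {
    // InheritableThreadLocal: threads created while a context is bound
    // see the parent thread's context by default.
    private static final InheritableThreadLocal<String> CURRENT =
            new InheritableThreadLocal<>();

    static void bind(String contextId) { CURRENT.set(contextId); }  // runtime, on dispatch
    static void unbind() { CURRENT.remove(); }                      // runtime, on completion
    static String current() { return CURRENT.get(); }               // application lookup
}
```

The runtime would bind the contribution's context around each request 
dispatch; per-thread association also helps with the thread-safety concern, 
since each thread resolves its own context.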

This would seem a good thing to have anyway since these contexts are not 
thread safe as far as I know :)

Thoughts?
> I am not sure my understanding above is correct so please bear with me. Based on my understanding above, currently there is no additional requirement from SDO.

I wouldn't reach that conclusion so fast :) I think that there is a 
requirement to provide a way  to get to an SDO context independent of 
TCCL if people don't like that association with TCCL.

> In the future, if we decided to support contribution import/export that may require SDO scoping hierarchy support. But I think we should start using contribution and getting rid of import.sdo as the first step.
>
>   

Yes I'd like to get rid of import.sdo, as I indicated earlier in this 
discussion thread.

I would like to support contribution import/export at some point. I'm 
not sure that we'll be able to use SDO scope hierarchy support as an SCA 
contribution import does not necessarily import the whole scope of 
another SCA contribution, but I guess we'll know more when we start to 
look at the details.

> What do you think?  Thanks for your reply.
>
> Fuhwei Lwo
>
> Jean-Sebastien Delfino <js...@apache.org> wrote: Fuhwei Lwo wrote:
>   
>> Hi,
>>
>> In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl"> in the default.scdl file that would prompt the SCA container to register my data types using SDO databinding. The question I have is what API I should use in my service implementation code to obtain the registered data types. If I have two composites that are using two different data type definitions but with the same namespace URI, I definitely don't want to obtain the wrong data type definition. Thanks for your help.
>>
>> Below is the previous message from Raymond Feng about associating databinding type system context/scope with a composite. I think this is related to my question but from Tuscany SCA development perspective.
>>
>> How to associate some context with a composite?
>> http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
>>   
>>     
>
> Hi,
>
> The short (and not perfect) answer to your question is: with the current 
> code in trunk, use 
> commonj.sdo.impl.HelperProvider.getDefaultContext()
>
> But I thought about this a bit and your question triggered some 
> comments, and more questions :)
>
> Import.sdo extension:
> I think we should be able to remove that Tuscany extension to SCA 
> assembly XML, now that we have the SCA contribution service in place. We 
> know which WSDLs and XSDs are available in a given SCA contribution and, 
> with sca-contribution.xml import elements, we also know which XML 
> namespaces are imported from other SCA contributions or other locations 
> outside of an SCA domain. So we probably don't need another <import.sdo> 
> element duplicating part of this information in .composite files.
>
> Scope of XML metadata:
> My understanding of the SCA assembly spec is that the scope of XML 
> metadata is an SCA contribution (plus what it imports from outside) and 
> not an individual Composite.
>
> Scope of metadata contributed by Java classes:
> Our runtime currently supports SCA contributions packaged as JARs or 
> file system folders. With these packaging schemes an SCA contribution is 
> self contained and cannot reference application classes in other SCA 
> contributions. At some point we'll probably want to support packaging of 
> SCA contributions as OSGI bundles and then leverage OSGI to allow an 
> OSGI bundle to see classes in another bundle, but we don't support that 
> OSGI packaging scheme yet. As a side comment I'd like to see if we could 
> reactivate some work on the OSGI extensions that we have under 
> java/sca/contrib/ and are not integrated in our build at the moment. So, 
> the scope of Java metadata is an SCA contribution as well, with no 
> external import mechanism.
>
> So the bottom line is:
> References to types in SCA artifacts are resolved at the SCA 
> contribution level. There is no relationship between an SCA composite 
> and a metadata scope.
>
> More comments, on databinding specific handling of metadata:
> We need to support multiple databindings. Each databinding comes with 
> its own form of metadata and different APIs to get to that metadata and 
> define metadata scopes. I guess it's important for a databinding 
> technology to define a way to scope metadata if it wants to be 
> successfully used in a server environment, and isolate the metadata for 
> the different applications running on the server.
>
> In such an environment, our SCA runtime should play nicely with the 
> other pieces of runtime and application code (not necessarily running as 
> SCA components), and use the metadata scoping mechanism defined by each 
> databinding in such a way that non-SCA code and SCA component code 
> running together in the server environment are able to see the same 
> metadata for a given application.
>
> I'd like to start a discussion to cover this aspect for our various 
> databindings and make sure that the metadata story for each databinding 
> holds together.
>
> To help feed this discussion with concrete data, could the SDO folks 
> jump in here, and describe the various ways of maintaining SDO metadata 
> scopes in a server environment, running with multiple classloaders and 
> threads?
>
> Thanks,
>
>   


-- 
Jean-Sebastien




Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Fuhwei Lwo <fu...@bricemedia.com>.
Another thought on this -

Today application developers call HelperProvider.getDefaultContext() to get hold of the default helper context, which depends on the TCCL to resolve the SDO scope. Could SDO expose a DefaultHelperContext interface that would allow SCA to provide its own default helper context implementation, referencing the contribution's scoped metadata?
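One shape such a hook might take, sketched with hypothetical names (a 
thread-bound contribution URI stands in for however SCA would actually track 
the current contribution; none of this is the real commonj.sdo API):

```java
import java.util.HashMap;
import java.util.Map;

// The interface SDO would expose so the default-context lookup is replaceable.
interface DefaultHelperContext {
    String getScopeId();
}

// SCA's implementation: one context per contribution, resolved from whichever
// contribution the current thread is executing on behalf of.
final class ContributionContexts {
    private static final ThreadLocal<String> CONTRIBUTION = new ThreadLocal<>();
    private static final Map<String, DefaultHelperContext> SCOPES = new HashMap<>();

    // The SCA runtime marks the active contribution before dispatching to it.
    static void enter(String contributionUri) { CONTRIBUTION.set(contributionUri); }

    // What SDO's HelperProvider.getDefaultContext() would delegate to.
    static synchronized DefaultHelperContext defaultContext() {
        String uri = CONTRIBUTION.get();
        return SCOPES.computeIfAbsent(uri, u -> () -> u);
    }
}
```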

Fuhwei Lwo <fu...@bricemedia.com> wrote: Hi Sebastien,

Here is my understanding of requirements about getting rid of import.sdo and switching to contribution -

1) A contribution will be created by contribution processor for each application. - Contribution processor has been done for Jar and file system.

2) The contribution processor will create an SDO scope (HelperContext instance) to associate with the contribution. Currently, calling SDOUtil.createHelperContext() is enough.

3) Tuscany SCA needs to provide a way for the application to get hold of the HelperContext associated with the contribution in step 2 above. Currently the application is forced to use the SDO API - HelperProvider.getDefaultContext() - which uses the TCCL.

I am not sure my understanding above is correct so please bear with me. Based on my understanding above, currently there is no additional requirement from SDO. In the future, if we decided to support contribution import/export that may require SDO scoping hierarchy support. But I think we should start using contribution and getting rid of import.sdo as the first step.

What do you think?  Thanks for your reply.

Fuhwei Lwo

Jean-Sebastien Delfino  wrote: Fuhwei Lwo wrote:
> Hi,
>
> In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl"> in the default.scdl file that would prompt the SCA container to register my data types using SDO databinding. The question I have is what API I should use in my service implementation code to obtain the registered data types. If I have two composites that are using two different data type definitions but with the same namespace URI, I definitely don't want to obtain the wrong data type definition. Thanks for your help.
>
> Below is the previous message from Raymond Feng about associating databinding type system context/scope with a composite. I think this is related to my question but from Tuscany SCA development perspective.
>
> How to associate some context with a composite?
> http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
>   

Hi,

The short (and not perfect) answer to your question is: with the current 
code in trunk, use 
commonj.sdo.impl.HelperProvider.getDefaultContext()

But I thought about this a bit and your question triggered some 
comments, and more questions :)

Import.sdo extension:
I think we should be able to remove that Tuscany extension to SCA 
assembly XML, now that we have the SCA contribution service in place. We 
know which WSDLs and XSDs are available in a given SCA contribution and, 
with sca-contribution.xml import elements, we also know which XML 
namespaces are imported from other SCA contributions or other locations 
outside of an SCA domain. So we probably don't need another <import.sdo> 
element duplicating part of this information in .composite files.

Scope of XML metadata:
My understanding of the SCA assembly spec is that the scope of XML 
metadata is an SCA contribution (plus what it imports from outside) and 
not an individual Composite.

Scope of metadata contributed by Java classes:
Our runtime currently supports SCA contributions packaged as JARs or 
file system folders. With these packaging schemes an SCA contribution is 
self contained and cannot reference application classes in other SCA 
contributions. At some point we'll probably want to support packaging of 
SCA contributions as OSGI bundles and then leverage OSGI to allow an 
OSGI bundle to see classes in another bundle, but we don't support that 
OSGI packaging scheme yet. As a side comment I'd like to see if we could 
reactivate some work on the OSGI extensions that we have under 
java/sca/contrib/ and are not integrated in our build at the moment. So, 
the scope of Java metadata is an SCA contribution as well, with no 
external import mechanism.

So the bottom line is:
References to types in SCA artifacts are resolved at the SCA 
contribution level. There is no relationship between an SCA composite 
and a metadata scope.

More comments, on databinding specific handling of metadata:
We need to support multiple databindings. Each databinding comes with 
its own form of metadata and different APIs to get to that metadata and 
define metadata scopes. I guess it's important for a databinding 
technology to define a way to scope metadata if it wants to be 
successfully used in a server environment, and isolate the metadata for 
the different applications running on the server.

In such an environment, our SCA runtime should play nicely with the 
other pieces of runtime and application code (not necessarily running as 
SCA components), and use the metadata scoping mechanism defined by each 
databinding in such a way that non-SCA code and SCA component code 
running together in the server environment are able to see the same 
metadata for a given application.

I'd like to start a discussion to cover this aspect for our various 
databindings and make sure that the metadata story for each databinding 
holds together.

To help feed this discussion with concrete data, could the SDO folks 
jump in here, and describe the various ways of maintaining SDO metadata 
scopes in a server environment, running with multiple classloaders and 
threads?

Thanks,

-- 
Jean-Sebastien






Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Fuhwei Lwo <fu...@bricemedia.com>.
Hi Sebastien,

Here is my understanding of requirements about getting rid of import.sdo and switching to contribution -

1) A contribution will be created by contribution processor for each application. - Contribution processor has been done for Jar and file system.

2) The contribution processor will create an SDO scope (HelperContext instance) to associate with the contribution. Currently, calling SDOUtil.createHelperContext() is enough.

3) Tuscany SCA needs to provide a way for the application to get hold of the HelperContext associated with the contribution in step 2 above. Currently the application is forced to use the SDO API - HelperProvider.getDefaultContext() - which uses the TCCL.

I am not sure my understanding above is correct so please bear with me. Based on my understanding, currently there is no additional requirement from SDO. In the future, if we decide to support contribution import/export, that may require SDO scoping hierarchy support. But I think we should start using contributions and getting rid of import.sdo as the first step.

What do you think?  Thanks for your reply.

Fuhwei Lwo

Jean-Sebastien Delfino <js...@apache.org> wrote: Fuhwei Lwo wrote:
> Hi,
>
> In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl"> in the default.scdl file that would prompt the SCA container to register my data types using SDO databinding. The question I have is what API I should use in my service implementation code to obtain the registered data types.  If I have two composites that are using two different data type definition but with the same namespace URI, I definitely don't want to obtain the wrong data type definition. Thanks for your help.
>
> Below is the previous message from Raymond Feng about associating databinding type system context/scope with a composite. I think this is related to my question but from Tuscany SCA development perspective.
>
> How to associate some context with a composite?
> http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
>   

Hi,

The short (and not perfect) answer to your question is: with the current 
code in trunk, use:
commonj.sdo.impl.HelperProvider.getDefaultContext()

But I thought about this a bit and your question triggered some 
comments, and more questions :)

Import.sdo extension:
I think we should be able to remove that Tuscany extension to SCA 
assembly XML, now that we have the SCA contribution service in place. We 
know which WSDLs and XSDs are available in a given SCA contribution and, 
with sca-contribution.xml import elements, we also know which XML 
namespaces are imported from other SCA contributions or other locations 
outside of an SCA domain. So we probably don't need another <import.sdo> 
element duplicating part of this information in .composite files.

Scope of XML metadata:
My understanding of the SCA assembly spec is that the scope of XML 
metadata is an SCA contribution (plus what it imports from outside) and 
not an individual Composite.

Scope of metadata contributed by Java classes:
Our runtime currently supports SCA contributions packaged as JARs or 
file system folders. With these packaging schemes an SCA contribution is 
self contained and cannot reference application classes in other SCA 
contributions. At some point we'll probably want to support packaging of 
SCA contributions as OSGI bundles and then leverage OSGI to allow an 
OSGI bundle to see classes in another bundle, but we don't support that 
OSGI packaging scheme yet. As a side comment I'd like to see if we could 
reactivate some work on the OSGI extensions that we have under 
java/sca/contrib/ and are not integrated in our build at the moment. So, 
the scope of Java metadata is an SCA contribution as well, with no 
external import mechanism.

So the bottom line is:
References to types in SCA artifacts are resolved at the SCA 
contribution level. There is no relationship between an SCA composite 
and a metadata scope.

More comments on databinding-specific handling of metadata:
We need to support multiple databindings. Each databinding comes with 
its own form of metadata and different APIs to get to that metadata and 
define metadata scopes. I guess it's important for a databinding 
technology to define a way to scope metadata if it wants to be 
successfully used in a server environment, and isolate the metadata for 
the different applications running on the server.

In such an environment, our SCA runtime should play nicely with the 
other pieces of runtime and application code (not necessarily running as 
SCA components), and use the metadata scoping mechanism defined by each 
databinding in such a way that non-SCA code and SCA component code 
running together in the server environment are able to see the same 
metadata for a given application.

I'd like to start a discussion to cover this aspect for our various 
databindings and make sure that the metadata story for each databinding 
holds together.

To help feed this discussion with concrete data, could the SDO folks 
jump in here, and describe the various ways of maintaining SDO metadata 
scopes in a server environment, running with multiple classloaders and 
threads?

Thanks,

-- 
Jean-Sebastien


---------------------------------------------------------------------
To unsubscribe, e-mail: tuscany-dev-unsubscribe@ws.apache.org
For additional commands, e-mail: tuscany-dev-help@ws.apache.org



Re: Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Frank Budinsky <fr...@ca.ibm.com>.
Jean-Sebastien Delfino <js...@apache.org> wrote on 04/17/2007 03:53:43 
PM:

[snip]

> To help feed this discussion with concrete data, could the SDO folks 
> jump in here, and describe the various ways of maintaining SDO metadata 
> scopes in a server environment, running with multiple classloaders and 
> threads?

Here's what we have today.

1) HelperProvider.getDefaultContext()

This returns a special HelperContext that manages multiple 
ClassLoader-specific scopes behind the scenes. All metadata get() calls 
will use the scope associated with the TCCL or a parent CL if not found in 
the child's scope. The put() calls will be associated with the TCCL scope.
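The CL-chained lookup described here can be sketched with plain Java and no SDO dependency. ClassLoaderScopedRegistry is a hypothetical name, and String keys and values stand in for real SDO type metadata; the point is only the visibility rule: get() searches the TCCL's scope and then walks up the parent classloaders, while put() always writes into the TCCL's own scope.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a default-context registry that keeps one metadata scope per
// ClassLoader.  Lookups start at the thread context classloader (TCCL) and
// fall back to parent classloaders; registrations go into the TCCL's scope.
class ClassLoaderScopedRegistry {
    private final Map<ClassLoader, Map<String, String>> scopes = new HashMap<>();

    synchronized void put(String uri, String type) {
        ClassLoader tccl = Thread.currentThread().getContextClassLoader();
        scopes.computeIfAbsent(tccl, cl -> new HashMap<>()).put(uri, type);
    }

    synchronized String get(String uri) {
        for (ClassLoader cl = Thread.currentThread().getContextClassLoader();
                cl != null; cl = cl.getParent()) {
            Map<String, String> scope = scopes.get(cl);
            if (scope != null && scope.containsKey(uri)) {
                return scope.get(uri);
            }
        }
        return null; // not visible from the TCCL or any of its parents
    }
}
```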

2) SDOUtil.createHelperContext()

This returns a HelperContext representing a single local scope. It is 
completely independent of CLs. Any chaining of scopes would need to be done 
somewhere else.

3) Hierarchical HelperContexts - TBD

I am expecting that we will provide ways to chain HelperContexts - e.g. 
HelperContext.addParent() - or maybe even some way to construct them based 
on a CL-hierarchy - e.g. createHelperContext(ClassLoader cl). These are 
the kinds of things I'd like to work out with you guys, implement them in 
Tuscany, and then take them back to the next version of the spec. Note that 
everything I've described (1, 2, and 3) is outside of the spec.
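A possible shape for the chaining in item 3, again as a plain-Java sketch with hypothetical names: ScopedContext stands in for HelperContext, define()/lookup() stand in for type registration and retrieval, and addParent() mirrors the suggested HelperContext.addParent(). Lookups that miss locally are delegated to parent scopes in order.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of a chainable metadata scope: a local definition shadows anything
// inherited, and misses fall through to parent scopes (first match wins).
class ScopedContext {
    private final Map<String, String> types = new HashMap<>();
    private final List<ScopedContext> parents = new ArrayList<>();

    void addParent(ScopedContext parent) {
        parents.add(parent);
    }

    void define(String uri, String type) {
        types.put(uri, type);
    }

    String lookup(String uri) {
        String local = types.get(uri);
        if (local != null) {
            return local;
        }
        for (ScopedContext parent : parents) {
            String inherited = parent.lookup(uri);
            if (inherited != null) {
                return inherited;
            }
        }
        return null; // unknown in this scope and all its parents
    }
}
```

Definitions made in a child scope stay invisible to the parent, which is the isolation property the server case needs.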

Frank.


---------------------------------------------------------------------
To unsubscribe, e-mail: tuscany-dev-unsubscribe@ws.apache.org
For additional commands, e-mail: tuscany-dev-help@ws.apache.org


Scoping SDO metadata, was: How to access a composite's data model scope in an application?

Posted by Jean-Sebastien Delfino <js...@apache.org>.
Fuhwei Lwo wrote:
> Hi,
>
> In my composite, I defined <import.sdo location="wsdl/mydatamodel.wsdl"> in the default.scdl file that would prompt the SCA container to register my data types using SDO databinding. The question I have is what API I should use in my service implementation code to obtain the registered data types.  If I have two composites that are using two different data type definition but with the same namespace URI, I definitely don't want to obtain the wrong data type definition. Thanks for your help.
>
> Below is the previous message from Raymond Feng about associating databinding type system context/scope with a composite. I think this is related to my question but from Tuscany SCA development perspective.
>
> How to associate some context with a composite?
> http://mail-archives.apache.org/mod_mbox/ws-tuscany-dev/200702.mbox/%3c004f01c75b81$3c395e90$0300a8c0@rfengt60p%3e
>   

Hi,

The short (and not perfect) answer to your question is: with the current 
code in trunk, use:
commonj.sdo.impl.HelperProvider.getDefaultContext()

But I thought about this a bit and your question triggered some 
comments, and more questions :)

Import.sdo extension:
I think we should be able to remove that Tuscany extension to SCA 
assembly XML, now that we have the SCA contribution service in place. We 
know which WSDLs and XSDs are available in a given SCA contribution and, 
with sca-contribution.xml import elements, we also know which XML 
namespaces are imported from other SCA contributions or other locations 
outside of an SCA domain. So we probably don't need another <import.sdo> 
element duplicating part of this information in .composite files.
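For illustration, the contribution-level import that would replace <import.sdo> might look like the fragment below. The namespace and composite values are made up, and the element names follow the SCA 1.0 contribution packaging format as I understand it, so treat this as a sketch rather than a verified sample:

```xml
<!-- META-INF/sca-contribution.xml (hypothetical example) -->
<contribution xmlns="http://www.osoa.org/xmlns/sca/1.0">
    <!-- composite offered for deployment by this contribution -->
    <deployable composite="sample:MyComposite"/>
    <!-- XML namespace resolved from another contribution in the domain -->
    <import namespace="http://sample.com/mydatamodel"/>
</contribution>
```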

Scope of XML metadata:
My understanding of the SCA assembly spec is that the scope of XML 
metadata is an SCA contribution (plus what it imports from outside) and 
not an individual Composite.

Scope of metadata contributed by Java classes:
Our runtime currently supports SCA contributions packaged as JARs or 
file system folders. With these packaging schemes an SCA contribution is 
self contained and cannot reference application classes in other SCA 
contributions. At some point we'll probably want to support packaging of 
SCA contributions as OSGI bundles and then leverage OSGI to allow an 
OSGI bundle to see classes in another bundle, but we don't support that 
OSGI packaging scheme yet. As a side comment I'd like to see if we could 
reactivate some work on the OSGI extensions that we have under 
java/sca/contrib/ and are not integrated in our build at the moment. So, 
the scope of Java metadata is an SCA contribution as well, with no 
external import mechanism.

So the bottom line is:
References to types in SCA artifacts are resolved at the SCA 
contribution level. There is no relationship between an SCA composite 
and a metadata scope.

More comments on databinding-specific handling of metadata:
We need to support multiple databindings. Each databinding comes with 
its own form of metadata and different APIs to get to that metadata and 
define metadata scopes. I guess it's important for a databinding 
technology to define a way to scope metadata if it wants to be 
successfully used in a server environment, and isolate the metadata for 
the different applications running on the server.

In such an environment, our SCA runtime should play nicely with the 
other pieces of runtime and application code (not necessarily running as 
SCA components), and use the metadata scoping mechanism defined by each 
databinding in such a way that non-SCA code and SCA component code 
running together in the server environment are able to see the same 
metadata for a given application.

I'd like to start a discussion to cover this aspect for our various 
databindings and make sure that the metadata story for each databinding 
holds together.

To help feed this discussion with concrete data, could the SDO folks 
jump in here, and describe the various ways of maintaining SDO metadata 
scopes in a server environment, running with multiple classloaders and 
threads?

Thanks,

-- 
Jean-Sebastien


---------------------------------------------------------------------
To unsubscribe, e-mail: tuscany-dev-unsubscribe@ws.apache.org
For additional commands, e-mail: tuscany-dev-help@ws.apache.org