Posted to dev@commons.apache.org by Gilles Sadowski <gi...@harfang.homelinux.org> on 2011/03/30 02:15:32 UTC

[Math] What's the problem with interfaces?

Hi.

> We have been talking about moving away from interfaces as the
> preferred way to support people plugging in alternative
> implementations because they have in several places gotten "behind"
> due to the fact that adding anything to them breaks compatibility.
> We should probably continue that discussion in a different thread.

[This is the different thread.]
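The compatibility trap mentioned in the quote can be sketched in Java. The names below are purely illustrative (not actual CM types), and recall that Java had no default interface methods at the time:

```java
// Illustrative sketch, not actual Commons Math API.
// v1 of an interface, already implemented in user code:
interface Estimator {
    double estimate(double[] data);
}

// If v2 added a method to Estimator, every existing implementation
// would stop compiling (and, if only the jar were swapped, would throw
// AbstractMethodError at run time):
//
//     interface Estimator {
//         double estimate(double[] data);
//         double estimateVariance(double[] data); // breaks all implementors
//     }

// An abstract class can grow compatibly, because a new method can ship
// with a default body that old subclasses inherit unchanged:
abstract class AbstractEstimator {
    public abstract double estimate(double[] data);

    // Added in a later release; existing subclasses are unaffected.
    public double estimateVariance(double[] data) {
        double m = estimate(data);
        double s = 0;
        for (double x : data) {
            s += (x - m) * (x - m);
        }
        return s / data.length;
    }
}

// A subclass written against the first release keeps working:
class MeanEstimator extends AbstractEstimator {
    public double estimate(double[] data) {
        double s = 0;
        for (double x : data) {
            s += x;
        }
        return s / data.length;
    }
}
```

This is why "abstract classes work better" keeps coming up below for types that are expected to evolve.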

From comments that were posted to the other thread, I gather that the main
trend is that, because some interfaces needed an upgrade, the "interface"
design tool is becoming "evil". Did I get this right?

I guess that you refer to "RandomData" and "RandomDataImpl". This is indeed
the typical example of abusing the "interface" tool. When only one
implementation is meaningful, an "interface" need not be defined.

The "interface" is not the way (preferred or not) to support alternative
implementations. As was already said, this is done with (abstract or not)
classes which an alternative implementation can inherit from.
Rather, the (Java) "interface" is supposed to represent the abstraction
(from the "real" world object) of everything that is needed to interact with
that object (i.e. its interface). It makes it possible to treat different
objects on an equal footing (from the caller's point-of-view).
But you all know that...
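To make the distinction concrete, here is a minimal sketch; the names are hypothetical, not actual CM classes:

```java
// Hypothetical names, not actual Commons Math API.

// An interface earns its keep when several implementations are meaningful
// and callers want to treat them on an equal footing:
interface UnivariateFunction {
    double value(double x);
}

class Square implements UnivariateFunction {
    public double value(double x) { return x * x; }
}

class Cube implements UnivariateFunction {
    public double value(double x) { return x * x * x; }
}

// A caller sees only the abstraction, never a concrete class:
class Evaluator {
    static double at(UnivariateFunction f, double x) {
        return f.value(x);
    }
}

// By contrast, a "RandomData" interface whose only implementation is
// "RandomDataImpl" adds nothing: a plain class would do.
```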

So what's the problem? Is it the binary compatibility, again? This is a
configuration management issue. When the compatibility is broken, you change
the major version number and/or the base package name. That was settled, or
not?
It would be a pity to sacrifice a tool aimed at improving the design because
of such considerations as keeping backward compatibility with versions that
nobody here is going to support.
If some user is happy with version "M.m", he can use it forever. If he wants
to use a newer version "N.n", he should not expect it to be compatible. It
does not have to be! Non-compatible modifications are not performed out of
some urge for change but stem from the desire to get a better product, bit
by bit.

Yes, it's not easy to get the interfaces right; so what? If you find that
you can improve the design, you do it and bump the major version number.
As someone pointed out, it's not as if we'll run out of numbers.

Part of the real problem (as shown by the amazing amount of done and undone
work for 2.2) is that you (collectively) want to do too many things at the
same time (lots of changes *and* few releases). To be clear, the problem is
not the "lots of changes" part (which you would like to "solve" by vetoing
future compatibility-breaking improvements). Note that the following is not
a criticism of CM (which has many features etc.) but some of the code that
I've seen (in the parts I've looked at more closely) does not make me
think that it is mature enough that one can say "That's it!" and stick with
the design forever.
Again, all this (e.g. removing duplicate code and refactoring the design)
can be considered unimportant and swept under the carpet but IMO *any*
cleanup is good as it will contribute to the overall robustness and
maintainability.
Half-baked design will continue to itch.


Best regards,
Gilles

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@commons.apache.org
For additional commands, e-mail: dev-help@commons.apache.org


Re: [Math] What's the problem with interfaces?

Posted by Gilles Sadowski <gi...@harfang.homelinux.org>.
On Wed, Mar 30, 2011 at 10:05:42AM -0700, Phil Steitz wrote:
> We are mixing two things in this thread - how much we care about
> backward compatibility and how and when to use interfaces.
> 
> I think we need to settle both topics.  I have stated my view, which
> is really just standard Commons policy, on the backward
> compatibility issue.  Based on many years experience using,
> developing and maintaining libraries here and elsewhere, I agree
> with others who have stated that constant incompatible change makes
> a library almost worthless for use across a wide range of
> applications.  Something just used inside one application or
> relatively small development group can be constantly refactored and
> productively used; but broadly reusable components need to have
> stable APIs.  Incompatible changes have to be well-documented and
> introduced in major releases that are relatively few and far
> between.  That is essentially how we have operated in Commons for
> 10+ years now and it is the reason that some suggest that when we do
> a major API refactoring of a component we change the package name
> and even the component name.
> 
> I don't buy the argument that we really *need* to keep making
> incompatible API changes in [math] because there are features or
> bugs that can't be added in compatible ways.

Who gave this argument?
I argue that changes are necessary to straighten the design.

>  I have seen only a
> tiny handful of these - just one in 2.2.  Luc mentions SVD as a
> frustrating problem for us.  The bugs have nothing to do with the
> API definition, unless I am missing something basic.

What I argue has nothing to do with fixing bugs, although bugs can be
discovered more easily when the design is consistent.

> We have
> numerical problems.  We need to solve them.  Gilles is right that we
> have limited resources.  I would much rather that these resources be
> focused on solving the numerical and algorithmic problems involved
> in delivering robust and stable mathematical software than endless
> arguments about how to refactor the API.  As a user, I would be much
> happier with a stable API with a few warts that provides
> well-documented, well-tested (because lots of users are using the
> *same* core impl) mathematics than a beautiful but unstable API with
> buggy implementation code.

When the API has become beautiful, it will also be more stable.
A consistent design is easier to document and test.

Users cannot require that developers spend a single minute more on
maintenance just to save their own time once in a while. [Unless they pay
them.]

Both parties (users and developers) should equally share the burden.
And, again, this is only temporary; and indeed, it would have been more
productive to do the changes and be done with it rather than argue
endlessly.

> Regarding interfaces, I think we are starting to focus on the right
> question.  I think we agree that use of interfaces just to separate
> interface from implementation in support of the strategy pattern is,
> let's just say "deprecated."  When we started [math] back in 2003,
> that was not considered bad design.  We went overboard with it,
> though, and ran into the problems we have been discussing on this
> thread around extensibility.  Most of the bad designs came from my
> contributions, so I have to apologize for putting us into this position.

You do not have to apologize. Trends change, and what were once seen as good
rules in textbooks (e.g. checked exceptions) are challenged by the
accumulated experience of programming projects.
If we want CM to be an example to be followed, its code cannot stay as
it was 8 years ago.

> To allow multiple implementations of a fully defined interface, we
> seem to have learned in Commons that abstract classes work better. 
> I am fine with that principle and will volunteer to start first
> suggesting and discussing changes individually and then making
> them.  I already started this with RandomData/RandomDataImpl.  We
> can continue discussion on that thread about whether we even need an
> abstract class in that case.
> 
> Other uses for interfaces are to
>  a) designate behaviors for implementation units that can be
> "plugged in" to algorithms (where the interface defines only some of
> the behaviors of the class - as in the multiple inheritance
> example).  An example of this is the RandomGenerator interface,
> which encapsulates the behavior of a low-level source of random data.
>  b) encapsulate abstract data types, e.g. Field.
> 
> I think we need to keep at least these interfaces, but we should
> think long and hard about exactly what they should contain.  Here
> again, I think we need to look at each example individually. 
> Examples like RealMatrix could be argued to be good "b)" examples or
> restrictive handcuffs that should be eliminated.  I think we need to
> be very careful with these decisions so that we can aim to really
> stabilize the API in 3.0.  I honestly do not think that is an
> unrealistic expectation.

I think that each specific issue should go to its own thread ;-)

Gilles



Re: [Math] What's the problem with interfaces?

Posted by Phil Steitz <ph...@gmail.com>.
We are mixing two things in this thread - how much we care about
backward compatibility and how and when to use interfaces.

I think we need to settle both topics.  I have stated my view, which
is really just standard Commons policy, on the backward
compatibility issue.  Based on many years experience using,
developing and maintaining libraries here and elsewhere, I agree
with others who have stated that constant incompatible change makes
a library almost worthless for use across a wide range of
applications.  Something just used inside one application or
relatively small development group can be constantly refactored and
productively used; but broadly reusable components need to have
stable APIs.  Incompatible changes have to be well-documented and
introduced in major releases that are relatively few and far
between.  That is essentially how we have operated in Commons for
10+ years now and it is the reason that some suggest that when we do
a major API refactoring of a component we change the package name
and even the component name.

I don't buy the argument that we really *need* to keep making
incompatible API changes in [math] because there are features or
bugs that can't be added in compatible ways.  I have seen only a
tiny handful of these - just one in 2.2.  Luc mentions SVD as a
frustrating problem for us.  The bugs have nothing to do with the
API definition, unless I am missing something basic.  We have
numerical problems.  We need to solve them.  Gilles is right that we
have limited resources.  I would much rather that these resources be
focused on solving the numerical and algorithmic problems involved
in delivering robust and stable mathematical software than endless
arguments about how to refactor the API.  As a user, I would be much
happier with a stable API with a few warts that provides
well-documented, well-tested (because lots of users are using the
*same* core impl) mathematics than a beautiful but unstable API with
buggy implementation code.

Regarding interfaces, I think we are starting to focus on the right
question.  I think we agree that use of interfaces just to separate
interface from implementation in support of the strategy pattern is,
let's just say "deprecated."  When we started [math] back in 2003,
that was not considered bad design.  We went overboard with it,
though, and ran into the problems we have been discussing on this
thread around extensibility.  Most of the bad designs came from my
contributions, so I have to apologize for putting us into this position.

To allow multiple implementations of a fully defined interface, we
seem to have learned in Commons that abstract classes work better. 
I am fine with that principle and will volunteer to start first
suggesting and discussing changes individually and then making
them.  I already started this with RandomData/RandomDataImpl.  We
can continue discussion on that thread about whether we even need an
abstract class in that case.

Other uses for interfaces are to
 a) designate behaviors for implementation units that can be
"plugged in" to algorithms (where the interface defines only some of
the behaviors of the class - as in the multiple inheritance
example).  An example of this is the RandomGenerator interface,
which encapsulates the behavior of a low-level source of random data.
 b) encapsulate abstract data types, e.g. Field.
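Use (a) can be sketched as follows; this is a cut-down, illustrative version of the RandomGenerator idea, not the real interface:

```java
// Cut-down, illustrative version of RandomGenerator,
// not the actual Commons Math interface.
interface RandomGenerator {
    double nextDouble();
}

// An algorithm that depends only on the behavior, not on any
// particular source of randomness:
class MonteCarloPi {
    private final RandomGenerator rng;

    MonteCarloPi(RandomGenerator rng) {
        this.rng = rng;
    }

    // Estimate pi by sampling points in the unit square and counting
    // those that fall inside the quarter circle.
    double estimate(int samples) {
        int inside = 0;
        for (int i = 0; i < samples; i++) {
            double x = rng.nextDouble();
            double y = rng.nextDouble();
            if (x * x + y * y <= 1.0) {
                inside++;
            }
        }
        return 4.0 * inside / samples;
    }
}

// Any generator can be plugged in, e.g. one backed by the JDK:
class JdkGenerator implements RandomGenerator {
    private final java.util.Random r = new java.util.Random(123);
    public double nextDouble() { return r.nextDouble(); }
}
```

The interface here defines only one behavior of its implementors, which is exactly the "plugged in" use described above.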

I think we need to keep at least these interfaces, but we should
think long and hard about exactly what they should contain.  Here
again, I think we need to look at each example individually. 
Examples like RealMatrix could be argued to be good "b)" examples or
restrictive handcuffs that should be eliminated.  I think we need to
be very careful with these decisions so that we can aim to really
stabilize the API in 3.0.  I honestly do not think that is an
unrealistic expectation.

Phil
 



Re: [Math] What's the problem with interfaces?

Posted by Luc Maisonobe <Lu...@free.fr>.
On 30/03/2011 13:33, Gilles Sadowski wrote:
> On Wed, Mar 30, 2011 at 02:21:04AM +0100, sebb wrote:
>> On 30 March 2011 01:15, Gilles Sadowski <gi...@harfang.homelinux.org> wrote:
>>> Hi.
>>>
>>>> We have been talking about moving away from interfaces as the
>>>> preferred way to support people plugging in alternative
>>>> implementations because they have in several places gotten "behind"
>>>> due to the fact that adding anything to them breaks compatibility.
>>>> We should probably continue that discussion in a different thread.
>>>
>>> [This is the different thread.]
>>>
>>> From comments that were posted to the other thread, I gather that the main
>>> trend is that, because some interfaces needed an upgrade, the "interface"
>>> design tool is becoming "evil". Did I get this right?
>>
>> It's not as clear-cut as that.
>> Interfaces have their place, but have drawbacks if the original
>> interface is later found wanting.
> 
> I have no problem with that; especially, I am against creating interfaces
> just to follow the "programming by interface" paradigm.
> This was done at a few places in CM, and I wholeheartedly welcome the
> simplification brought by removing all interfaces for which there is only
> one implementation.

I guess we all agree on this. There are many places where interfaces are
the right way to go, and there are some misuses too that we try to avoid.

> 
>>> I guess that you refer to "RandomData" and "RandomDataImpl". This is indeed
>>> the typical example of abusing the "interface" tool. When only one
>>> implementation is meaningful, an "interface" need not be defined.
>>>
>>> The "interface" is not the way (preferred or not) to support alternative
>>> implementations. As was already said, this is done with (abstract or not)
>>> classes which an alternative implementation can inherit from.
>>> Rather, the (Java) "interface" is supposed to represent the abstraction
>>> (from the "real" world object) of everything that is needed to interact with
>>> that object (i.e. its interface). It makes it possible to treat different
>>> objects on an equal footing (from the caller's point-of-view).
>>> But you all know that...
>>>
>>> So what's the problem? Is it the binary compatibility, again? This is a
>>> configuration management issue. When the compatibility is broken, you change
>>> the major version number and/or the base package name. That was settled, or
>>> not?
>>
>> That solves the problem, but at the cost of forcing all users to edit
>> and recompile, so should not be undertaken lightly.
> 
> I'm sorry: I still might not have gotten something quite fundamental, as I
> continue to not understand.
> Either the user wants to use new features and he *has* to recompile or he
> doesn't want to be bothered by incompatible changes and he keeps using the
> previous release.
> The other case is when a bug has been discovered, so that the user might
> rightly want to use a drop-in replacement with the bug fixed. Then it is a
> release and support policy issue. The right thing would be to provide a
> compatible release with all bugs removed. As I see it, the problem in CM is
> one of lacking resources. IMHO erecting the binary compatibility principle
> as the ideal goal is not a substitute for supporting old releases.

You have perfectly identified the problems. We do try to add features
and improve existing ones, and we also try to fix bugs. The recent
history is an example of trying to address both goals in two different
development branches: trunk for 3.0 new features on one side and a
branch for 2.X fixes.

Near the end, it was a nightmare to handle. One part of the team was
working only on the 3.0 features, another part was trying to only fix
bugs in 2.X and port the fixes in 3.0, and yet from time to time there
were attempts to retrofit interesting stuff between the branches. In
addition, we changed our mind before releasing and had to roll back.

Our main priority has been to avoid incompatible changes at some stages
(minor releases), and to allow them at other stages (major releases). I
completely agree with you that we must go forward and that sometimes
change is needed, we are precisely at one such point so we can do a lot
of things just now. I also agree it is difficult to get an interface
right the first time. So sometimes, an abstract class may be better if
an interface is not mandatory.

That does not mean interfaces are forbidden, of course. For some (in
fact many) cases, they are good. Just as Sebb said, it's not clear-cut,
there are fuzzy borders.

> As a mitigating measure, minor releases will be binary
> compatible. For the user, there remains the risk that a bug has been fixed
> just before a major release: if he wants to benefit from that, he'll
> have to edit and recompile. That's the balance between the user's slight
> discomfort, sometimes, and a project that will be stuck in dead ends.

Yes.

> 
>>> It would be a pity to sacrifice a tool aimed at improving the design because
>>> of such considerations as keeping backward compatibility with versions that
>>> nobody here is going to support.
>>> If some user is happy with version "M.m", he can use it forever. If he wants
>>> to use a newer version "N.n", he should not expect it to be compatible. It
>>> does not have to be! Non-compatible modifications are not performed out of
>>> some urge for change but stem from the desire to get a better product, bit
>>> by bit.
>>>
>>> Yes, it's not easy to get the interfaces right; so what? If you find that
>>> you can improve the design, you do it and bump the major version number.
>>> As someone pointed out, it's not as if we'll run out of numbers.
>>
>> But users could run out of patience if every release requires them to
>> edit and recompile.
> 
> I'm not advocating for making each new release incompatible with the
> previous one. It's releasing too rarely which leads to this situation!
> Rather I'm in favour (and I'm not the only one in the Commons community) of
> releasing often, because there will be a higher probability that a
> backward-compatible official release exists that contains the bug fixes
> which a user might want.

I also would prefer to release much more often than what we do now, and
I think most of us would be happy with that. Each time we talk about a
new release, everybody agrees it has been already too long since the
last one, and in fact it still takes ages to do and we are behind our
own schedule. We lack resources. This is slowly improving. Now I count
four regular committers (you are one of them), and at least three less
regular ones. This is already a sizeable team for a commons component
and I am happy we got there. More resources would be better and we could
release more often and make fewer design errors.

> 
>>> Part of the real problem (as shown by the amazing amount of done and undone
>>> work for 2.2) is that you (collectively) want to do too many things at the
>>> same time (lots of changes *and* few releases).
>>
>> I don't think that's fair.
>>
>> Making lots of releases all of which may be binary incompatible with
>> each other just compounds the problem.
>>
>> IMO it's important to minimise the amount of user disruption, so it
>> makes sense to bundle as many API-breaking changes into a single
>> release as possible.
> 
> I agree with that. I just don't think that anyone (and not even all of us
> together) can make sure to get it right in one fell swoop. We fix what we
> see, and make releases (preferably at more or less fixed dates). It is
> counter-productive to work endlessly for fear of having missed something
> (which we will anyways).

Yes, we fix what we see and what we release is never perfect. I don't
think we can go with fixed dates; that's too ambitious a goal for now.
Each time one of us has another priority task out of [math], it adds
delays to [math] because we still are a small team.

> [Luc did that for release 2.2, changing yet another little thing to be nice
> to users, and then everything was reverted because the ultimate goal could
> not be achieved that way.]

Yes. We changed our mind in between and discovered problems we did not
see before. I did perform some compatibility tests before the changes
and convinced everyone it was correct. Then Sebb discovered I failed by
running the 2.1 tests against 2.2 code and proved my compatible changes
were in fact incompatible ones. It's a good thing he saw it before the
release and we did not break existing user code (at least nobody has
complained yet).

> 
>>> To be clear, the problem is
>>> not the "lots of changes" part (which you would like to "solve" by vetoing
>>> future compatibility-breaking improvements). Note that the following is not
>>> a criticism of CM (which has many features etc.) but some of the code that
>>> I've seen (in the parts I've looked at more closely) does not make me
>>> think that it is mature enough that one can say "That's it!" and stick with
>>> the design forever.

We all know there are some bad things. SVD is the typical example, we
wrote it three or four times already and it still has numerical
stability problems! We tried our own implementation, we tried to port a
very complex reference implementation, we tried to use a naive
implementation and we need to find another one now.

>>
>>> Again, all this (e.g. removing duplicate code and refactoring the design)
>>> can be considered unimportant and swept under the carpet but IMO *any*
>>> cleanup is good as it will contribute to the overall robustness and
>>> maintainability.
>>
>>> Half-baked design will continue to itch.
>>
>> Which is why it's important to spend enough time on development, and
>> be prepared to restart if things turn out wrong.
> 
> That's *exactly* my point; but I still suspect that we don't mean the same
> thing. IMO, we cannot decide that things *will* go wrong just because we try
> to answer all "What if ...?" questions that come to mind. It is impossible
> to foresee all uses of CM; so CM's development cannot be based on the
> assumption that we can cover all users' requests. IMO, it must be based on
> internal consistency. If that is achieved, and _afterwards_ it is asserted
> that some use case cannot be performed, it will be time to restart. But the
> refactoring will be easier, thanks to the internal consistency.

I would choose an intermediate position. Users come first, but internal
consistency must be there too.

Luc

> 
> 
> Regards,
> Gilles
> 




Re: [Math] What's the problem with interfaces?

Posted by Gilles Sadowski <gi...@harfang.homelinux.org>.
On Wed, Mar 30, 2011 at 02:21:04AM +0100, sebb wrote:
> On 30 March 2011 01:15, Gilles Sadowski <gi...@harfang.homelinux.org> wrote:
> > Hi.
> >
> >> We have been talking about moving away from interfaces as the
> >> preferred way to support people plugging in alternative
> >> implementations because they have in several places gotten "behind"
> >> due to the fact that adding anything to them breaks compatibility.
> >> We should probably continue that discussion in a different thread.
> >
> > [This is the different thread.]
> >
> > From comments that were posted to the other thread, I gather that the main
> > trend is that, because some interfaces needed an upgrade, the "interface"
> > design tool is becoming "evil". Did I get this right?
> 
> It's not as clear-cut as that.
> Interfaces have their place, but have drawbacks if the original
> interface is later found wanting.

I have no problem with that; especially, I am against creating interfaces
just to follow the "programming by interface" paradigm.
This was done at a few places in CM, and I wholeheartedly welcome the
simplification brought by removing all interfaces for which there is only
one implementation.

> > I guess that you refer to "RandomData" and "RandomDataImpl". This is indeed
> > the typical example of abusing the "interface" tool. When only one
> > implementation is meaningful, an "interface" need not be defined.
> >
> > The "interface" is not the way (preferred or not) to support alternative
> > implementations. As was already said, this is done with (abstract or not)
> > classes which an alternative implementation can inherit from.
> > Rather, the (Java) "interface" is supposed to represent the abstraction
> > (from the "real" world object) of everything that is needed to interact with
> > that object (i.e. its interface). It makes it possible to treat different
> > objects on an equal footing (from the caller's point-of-view).
> > But you all know that...
> >
> > So what's the problem? Is it the binary compatibility, again? This is a
> > configuration management issue. When the compatibility is broken, you change
> > the major version number and/or the base package name. That was settled, or
> > not?
> 
> That solves the problem, but at the cost of forcing all users to edit
> and recompile, so should not be undertaken lightly.

I'm sorry: I still might not have gotten something quite fundamental, as I
continue to not understand.
Either the user wants to use new features and he *has* to recompile or he
doesn't want to be bothered by incompatible changes and he keeps using the
previous release.
The other case is when a bug has been discovered, so that the user might
rightly want to use a drop-in replacement with the bug fixed. Then it is a
release and support policy issue. The right thing would be to provide a
compatible release with all bugs removed. As I see it, the problem in CM is
one of lacking resources. IMHO erecting the binary compatibility principle
as the ideal goal is not a substitute for supporting old releases.
As a mitigating measure, minor releases will be binary
compatible. For the user, there remains the risk that a bug has been fixed
just before a major release: if he wants to benefit from that, he'll
have to edit and recompile. That's the balance between the user's slight
discomfort, sometimes, and a project that will be stuck in dead ends.

> > It would be a pity to sacrifice a tool aimed at improving the design because
> > of such considerations as keeping backward compatibility with versions that
> > nobody here is going to support.
> > If some user is happy with version "M.m", he can use it forever. If he wants
> > to use a newer version "N.n", he should not expect it to be compatible. It
> > does not have to be! Non-compatible modifications are not performed out of
> > some urge for change but stem from the desire to get a better product, bit
> > by bit.
> >
> > Yes, it's not easy to get the interfaces right; so what? If you find that
> > you can improve the design, you do it and bump the major version number.
> > As someone pointed out, it's not as if we'll run out of numbers.
> 
> But users could run out of patience if every release requires them to
> edit and recompile.

I'm not advocating for making each new release incompatible with the
previous one. It's releasing too rarely which leads to this situation!
Rather I'm in favour (and I'm not the only one in the Commons community) of
releasing often, because there will be a higher probability that a
backward-compatible official release exists that contains the bug fixes
which a user might want.

> > Part of the real problem (as shown by the amazing amount of done and undone
> > work for 2.2) is that you (collectively) want to do too many things at the
> > same time (lots of changes *and* few releases).
> 
> I don't think that's fair.
> 
> Making lots of releases all of which may be binary incompatible with
> each other just compounds the problem.
> 
> IMO it's important to minimise the amount of user disruption, so it
> makes sense to bundle as many API-breaking changes into a single
> release as possible.

I agree with that. I just don't think that anyone (and not even all of us
together) can make sure to get it right in one fell swoop. We fix what we
see, and make releases (preferably at more or less fixed dates). It is
counter-productive to work endlessly for fear of having missed something
(which we will anyways).
[Luc did that for release 2.2, changing yet another little thing to be nice
to users, and then everything was reverted because the ultimate goal could
not be achieved that way.]

> > To be clear, the problem is
> > not the "lots of changes" part (which you would like to "solve" by vetoing
> > future compatibility-breaking improvements). Note that the following is not
> > a criticism of CM (which has many features etc.) but some of the code that
> > I've seen (in the parts I've looked at more closely) does not make me
> > think that it is mature enough that one can say "That's it!" and stick with
> > the design forever.
> 
> > Again, all this (e.g. removing duplicate code and refactoring the design)
> > can be considered unimportant and swept under the carpet but IMO *any*
> > cleanup is good as it will contribute to the overall robustness and
> > maintainability.
> 
> > Half-baked design will continue to itch.
> 
> Which is why it's important to spend enough time on development, and
> be prepared to restart if things turn out wrong.

That's *exactly* my point; but I still suspect that we don't mean the same
thing. IMO, we cannot decide that things *will* go wrong just because we try
to answer all "What if ...?" questions that come to mind. It is impossible
to foresee all uses of CM; so CM's development cannot be based on the
assumption that we can cover all users' requests. IMO, it must be based on
internal consistency. If that is achieved, and _afterwards_ it is asserted
that some use case cannot be performed, it will be time to restart. But the
refactoring will be easier, thanks to the internal consistency.


Regards,
Gilles



Re: [Math] What's the problem with interfaces?

Posted by sebb <se...@gmail.com>.
On 30 March 2011 01:15, Gilles Sadowski <gi...@harfang.homelinux.org> wrote:
> Hi.
>
>> We have been talking about moving away from interfaces as the
>> preferred way to support people plugging in alternative
>> implementations because they have in several places gotten "behind"
>> due to the fact that adding anything to them breaks compatibility.
>> We should probably continue that discussion in a different thread.
>
> [This is the different thread.]
>
> From comments that were posted to the other thread, I gather that the main
> trend is that, because some interfaces needed an upgrade, the "interface"
> design tool is becoming "evil". Did I get this right?

It's not as clear-cut as that.
Interfaces have their place, but have drawbacks if the original
interface is later found wanting.

> I guess that you refer to "RandomData" and "RandomDataImpl". This is indeed
> the typical example of abusing the "interface" tool. When only one
> implementation is meaningful, an "interface" need not be defined.
>
> The "interface" is not the way (preferred or not) to support alternative
> implementations. As was already said, this is done with (abstract or not)
> classes which an alternative implementation can inherit from.
> Rather, the (Java) "interface" is supposed to represent the abstraction
> (from the "real" world object) of everything that is needed to interact with
> that object (i.e. its interface). It makes it possible to treat different
> objects on an equal footing (from the caller's point-of-view).
> But you all know that...
>
> So what's the problem? Is it the binary compatibility, again? This is a
> configuration management issue. When the compatibility is broken, you change
> the major version number and/or the base package name. That was settled, or
> not?

That solves the problem, but at the cost of forcing all users to edit
and recompile, so should not be undertaken lightly.

> It would be a pity to sacrifice a tool aimed at improving the design because
> of such considerations as keeping backward compatibility with versions that
> nobody here is going to support.
> If some user is happy with version "M.m", he can use it forever. If he wants
> to use a newer version "N.n", he should not expect it to be compatible. It
> does not have to be! Non-compatible modifications are not performed out of
> some urge for change but stem from the desire to get a better product, bit
> by bit.
>
> Yes, it's not easy to get the interfaces right; so what? If you find that
> you can improve the design, you do it and bump the major version number.
> As someone pointed out, it's not as if we'll run out of numbers.

But users could run out of patience if every release requires them to
edit and recompile.

> Part of the real problem (as shown by the amazing amount of done and undone
> work for 2.2) is that you (collectively) want to do too many things at the
> same time (lots of changes *and* few releases).

I don't think that's fair.

Making lots of releases all of which may be binary incompatible with
each other just compounds the problem.

IMO it's important to minimise the amount of user disruption, so it
makes sense to bundle as many API-breaking changes into a single
release as possible.

> To be clear, the problem is
> not the "lots of changes" part (which you would like to "solve" by vetoing
> future compatibility-breaking improvements). Note that the following is not
> a criticism of CM (which has many features etc.) but some of the code that
> I've seen (in the parts which I've more closely looked at) do not make me
> think that it is mature enough that one can say "That's it!" and stick with
> the design forever.

> Again, all this (e.g. removing duplicate code and refactoring the design)
> can be considered unimportant and swept under the carpet but IMO *any*
> cleanup is good as it will contribute to the overall robustness and
> maintainability.

> Half-baked design will continue to itch.

Which is why it's important to spend enough time on development, and
be prepared to restart if things turn out wrong.



Re: [Math] What's the problem with interfaces?

Posted by Ole Ersoy <ol...@gmail.com>.
> I think that you reason on the basic assumption that CM is close to
> stability. Many problems (some bugs but also design consistency) have shown
> that it is not. So, my opinion is that users will prefer a product that
> continues to improve rather than something that is backward-compatible. I'm
> one of those. If not, I'd be content with what is there and would not feel
> the need to contribute.

I'm a user as well and agree with this.  I'd gladly deal with upgrading my code and configuration if it means that CM has gotten more elegant from an architectural point of view.

Cheers,
- Ole



Re: FastMathTestPerformance results misleading

Posted by sebb <se...@gmail.com>.
On 1 April 2011 01:23, Bill Rossi <bi...@rossi.com> wrote:
>
> This test shows that Math.sin() is faster than FastMath.sin(), but this is
> misleading because the test operates over the domain 0 < x < 1, whereas legal
> arguments to sin() are all finite numbers. In particular, it's when |x| is
> large that performance of the Math trig functions really drops off.

Good catch, for example:

Name     StrictMath      FastMath          Math
sin      128    1.0    238 1.8512     85 0.6620            <= current
sin     1331    1.0    601 0.4518   1365 1.0260          <= 0 -> 10000000

There are probably quite a few other tests that exhibit the same problem.




FastMathTestPerformance results misleading

Posted by Bill Rossi <bi...@rossi.com>.
This test shows that Math.sin() is faster than FastMath.sin(), but this is 
misleading because the test operates over the domain 0 < x < 1, whereas 
legal arguments to sin() are all finite numbers. In particular, it's when 
|x| is large that performance of the Math trig functions really drops off.
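The effect is easy to reproduce with a crude timing loop. The sketch below is deliberately simplified (no JIT warm-up, a single run, and a made-up class name), uses only java.lang.Math to stay self-contained, and is meant only to show why the argument domain has to be part of the benchmark design:

```java
// Crude sketch: time Math.sin over a small-argument domain and a
// large-argument domain. Absolute numbers are machine-dependent; the
// point is only that the two domains must be measured separately,
// because argument reduction dominates the cost for large |x|.
public class SinDomainBench {
    static double timeDomain(String label, double scale) {
        double sum = 0.0;  // accumulate so the JIT cannot discard the loop
        long t0 = System.nanoTime();
        for (int i = 0; i < 1000000; i++) {
            sum += Math.sin(i * scale);  // scale selects the argument domain
        }
        long t1 = System.nanoTime();
        System.out.println(label + ": " + (t1 - t0) / 1e6
                + " ms (sum=" + sum + ")");
        return sum;
    }

    public static void main(String[] args) {
        timeDomain("0 <= x < 1  ", 1e-6);  // the domain the original test used
        timeDomain("0 <= x < 1e7", 10.0);  // where trig performance drops off
    }
}
```

The two timings typically differ substantially, which is exactly why a single benchmark over 0 < x < 1 is not representative.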




Re: [Math] What's the problem with interfaces?

Posted by Jörg Schaible <jo...@scalaris.com>.
Hi Gilles,

Gilles Sadowski wrote:

[snip]

> If I'm not mistaken, this is the answer to the "business requirement".
> And it does not prevent us from evolving the code as often as necessary
> ("math3", "math4", ...).

I think this is the real point where we disagree. My interpretation of "as 
often as necessary" does not match my expectation of "a major version should 
be stable at least for 12 to 24 months". This is what I consider a "stable 
API".

Looking at the age of CM 2.0, it may very well be time to move on with 
3.x. But if you do, you should work very hard before releasing a final 3.x, 
because the API should not be changed (i.e. should not break binary 
compatibility) for this period of time. If you're unsure how the new API 
works out, you can do a small set of alphas and betas beforehand - we did this 
for lang 3.x also. However, business will always wait for the final release 
and it expects the API to "be stable" then. And this is not even a technical 
problem, but a political one.

[snip]

- Jörg

BTW: Gosh, we're so glad that we could finally drop in our company JDK 1.4 
support everywhere with end of last year. That's the business reality with 
global players in sensitive environments.





Re: [Math] What's the problem with interfaces?

Posted by Gilles Sadowski <gi...@harfang.homelinux.org>.
Hi.

> No. There are always times when too much stuff has piled up that requires an 
> API change. But then you may have more radical changes and combine it with 
> e.g. new (incompatible) JDK features. New digester is a very good example of 
> it. But you have to make very wise decisions about the new API to keep it 
> stable again for a maximum time.

My intention is not to blame anyone for past decisions; they were certainly
convinced they made wise decisions.
However, I have looked at some of the code, and it was obvious that some
of the design could be improved. Again, I'm not criticizing the math
features but e.g. code duplication, overlapping functionality, similar
functionalities implemented in dissimilar ways; all little things that add
up so as to make the code more difficult to understand and maintain.
I stress again that CM is probably not like other Commons projects in that
its code base is much larger, which makes it all the more important to have
it consistent throughout.

I insist that once the API is consistent, it will naturally (and not by
veto) not have to change much in the future. [And if it needs be, the work
will be easier because all the changes will be similar.]

> > So, my opinion is that users will prefer a product
> > that continues to improve rather than something that is
> > backward-compatible. I'm one of those. If not, I'd be content with what is
> > there and would not feel the need to contribute.
> 
> Problem is that it does not match the business requirement.
> 
> See, when I am cleaning the apache stuff out of my local repo and start a 
> complete build of our products and components, I end up with just about every 
> available version of commons-lang, commons-io, commons-collections, etc. in 
> my repo again. However, *we* only use one of those versions, i.e. all the 
> other versions are pulled by 3rd party stuff. Therefore I have to praise my 
> fellow committers here at Apache, who made it possible to upgrade all those 
> artifacts and all the stuff will simply run regardless of the original 
> version they were dependent on.
> 
> Now, commons math might not have had that wide-spread usage up to now, but the 
> new version has been proposed and recognized, and as soon as companies start 
> to build products on top of it, your argumentation simply no longer applies. 
> Not all of those products will upgrade at the same time, but a user might 
> have to use two such products (depending on incompatible math releases).
> 
> Therefore we have the policy to rename package and artifactId for major 
> versions, at the price of annoying upgrades for our users. But they will at 
> least not face the situation above. However, such annoying upgrades should 
> not happen too frequently, because it is effort.

I thought that this was settled with these rules:
 1. minor releases are backward-compatible,
 2. new major releases can live with older releases (thanks to a different
    base package name).

Hence
 1. users who don't touch their own code can use the last release compatible
    with it;
 2. users who want to use new features will, by definition, have to touch
    their code and thus will "import" what they want, either from the old
    base name packages, or from the new ones.
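A sketch of what rule 2 means in practice, assuming CM adopts the same renaming convention as commons-lang 3.x (the "math3" coordinates below are hypothetical at this point): both major versions can then sit on one classpath without conflicts.

```xml
<!-- Hypothetical Maven coordinates, assuming CM renames both the base
     package and the artifactId for a new major version (as lang 3.x did).
     The two jars can coexist because no class names collide. -->
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-math</artifactId>
  <version>2.2</version>  <!-- classes under org.apache.commons.math -->
</dependency>
<dependency>
  <groupId>org.apache.commons</groupId>
  <artifactId>commons-math3</artifactId>
  <version>3.0</version>  <!-- classes under org.apache.commons.math3 -->
</dependency>
```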

If I'm not mistaken, this is the answer to the "business requirement".
And it does not prevent us from evolving the code as often as necessary
("math3", "math4", ...).

> >> > Part of the real problem (as shown by the amazing amount of done and
> >> > undone work for 2.2) is that you (collectively) want to do too many
> >> > things at the same time (lots of changes *and* few releases). To be
> >> > clear, the problem is not the "lots of changes" part (which you would
> >> > like to "solve" by vetoing future compatibility-breaking improvements).
> >> > Note that the following is not a criticism of CM (which has many
> >> > features etc.) but some of the code that I've seen (in the parts which
> >> > I've more closely looked at) do not make me think that it is mature
> >> > enough that one can say "That's it!" and stick with the design forever.
> >> > Again, all this (e.g. removing duplicate code and refactoring the
> >> > design) can be considered unimportant and swept under the carpet but
> >> > IMO *any* cleanup is good as it will contribute to the overall
> >> > robustness and maintainability.
> >> > Half-baked design will continue to itch.
> >> 
> >> Half-baked upgrade strategies also.
> > 
> > Unfortunately CM currently does not have the resources to match its ambitions
> > concerning the upgrade strategies! That implies that some expectations
> > must be dropped: IMHO, code improvement is, at this time, more important
> > than upgrade easiness.
> 
> We disagree here obviously, but see also Phil's answer on this.

The more we defer the consistency goal, the more code will need changes, the
fewer people will want to do it, and the overall quality will probably
suffer. Then, there will come a day when some potential contributors will
turn away rather than figure out the mess.


Gilles



Re: [Math] What's the problem with interfaces?

Posted by Jörg Schaible <jo...@gmx.de>.
Hi Gilles,

Gilles Sadowski wrote:

> Hi Jorg.
> 
>> > 
>> > From comments that were posted to the other thread, I gather that the main
>> > trend is that, because some interfaces needed an upgrade, the "interface"
>> > design tool is becoming "evil". Did I get this right?
>> > 
>> > I guess that you refer to "RandomData" and "RandomDataImpl". This is
>> > indeed the typical example of abusing the "interface" tool. When only
>> > one implementation is meaningful, an "interface" need not be defined.
>> 
>> No. It is about adding something to the interface, just like Phil said.
>> If you add a method to an abstract class, the derived ones are still
>> binary compatible; the same does not apply to interfaces.
>> 
>> We had discussions in *length* about this and I am not interested in
>> starting over again.
> 
> Well, I was not part of those discussions, and CM is _still_ full of Java
> "interface"s; so I assume that it's acceptable that I give my
> point-of-view now.
> 
> As I said, I'm not in favour of creating an interface for the sake of having
> everything separated in "Foo" and "FooImpl".
> I understand that having an abstract class has some definite practical
> advantages. And for the things I had in mind and which I'm most concerned
> about (coherent design), it might well be a non-issue: "interface" could
> be changed to "abstract class".
> 
> The problems might come from the one thing which you can do with interface
> but cannot with classes, namely, multiple inheritance.

I never said that using abstract classes over interfaces is a mantra, but for 
a common library you should always prefer abstract classes if in doubt. More 
on this later.

> Did someone already analyze the situation in CM?
> The issue might well be resolved similarly to the debate about checked
> exceptions, i.e. within CM, "interface" may also prove to be an
> unnecessary feature of the Java language.

It is a case-by-case decision, but an interface can be a heavy burden.

[snip]

>> > So what's the problem? Is it the binary compatibility, again? This is a
>> > configuration management issue. When the compatibility is broken, you
>> > change the major version number and/or the base package name. That was
>> > settled, or not?
>> 
>> The point is that an interface should not be changed for a long time.
>> Otherwise every new release is a major one.
> 
> It is not a question of time but of major (incompatible) and minor
> (compatible) releases. As I noted in the reply to Sebb, the real problem
> is not supporting older releases. That's a policy issue.

For common libraries this is a very bad policy. Again more on this later.

>> > It would be a pity to sacrifice a tool aimed at improving the design
>> > because of such considerations as keeping backward compatibility with
>> > versions that nobody here is going to support.
>> 
>> It's a pity for the user, because he will *never* be able to use the next
>> version as a drop-in.
> 
> I also answered that in the other post.
> 
>> > If some user is happy with version "M.m", he can use it forever. If he
>> > wants to use a newer version "N.n", he should not expect it to be
>> > compatible. It does not have to be! Non-compatible modifications are
>> > not performed out of some urge for change but stem from the desire to
>> > get a better product, bit by bit.
>> 
>> If that were true, you would currently have 10 different commons-lang, 10
>> different commons-collections and so on in your classpath.
> 
> No, it's the user's choice to select which version he wants.

This is the whole point, because it is probably not. Again more on it later.
 
> You cannot have multiple versions of Linux running at the same time, but
> nevertheless the kernel developers make releases at a much higher pace
> than Commons...

Nice example. I remember that Linux has explicit stable kernel lines that 
are used in business environments, and patches from newer kernel versions are 
backported. The stable API is also a business requirement for Linux. And 
this has nothing to do which kernel version *I* currently run on my box and 
how often I upgrade.

>> > Yes, it's not easy to get the interfaces right; so what? If you find
>> > that you can improve the design, you do it and bump the major version
>> > number. As someone pointed out, it's not as if we'll run out of
>> > numbers.
>> 
>> And people will stop using it, because they are annoyed when they are
>> depending in the end on n different versions of math, simply because of
>> transitive dependencies.
> 
> I think that you reason on the basic assumption that CM is close to
> stability. Many problems (some bugs but also design consistency) have
> shown that it is not.

No. There are always times when too much stuff has piled up that requires an 
API change. But then you may have more radical changes and combine it with 
e.g. new (incompatible) JDK features. New digester is a very good example of 
it. But you have to make very wise decisions about the new API to keep it 
stable again for a maximum time.

> So, my opinion is that users will prefer a product
> that continues to improve rather than something that is
> backward-compatible. I'm one of those. If not, I'd be content with what is
> there and would not feel the need to contribute.

Problem is that it does not match the business requirement.

See, when I am cleaning the apache stuff out of my local repo and start a 
complete build of our products and components, I end up with just about every 
available version of commons-lang, commons-io, commons-collections, etc. in 
my repo again. However, *we* only use one of those versions, i.e. all the 
other versions are pulled by 3rd party stuff. Therefore I have to praise my 
fellow committers here at Apache, who made it possible to upgrade all those 
artifacts and all the stuff will simply run regardless of the original 
version they were dependent on.

Now, commons math might not have had that wide-spread usage up to now, but the 
new version has been proposed and recognized, and as soon as companies start 
to build products on top of it, your argumentation simply no longer applies. 
Not all of those products will upgrade at the same time, but a user might 
have to use two such products (depending on incompatible math releases).

Therefore we have the policy to rename package and artifactId for major 
versions, at the price of annoying upgrades for our users. But they will at 
least not face the situation above. However, such annoying upgrades should 
not happen too frequently, because it is effort.

>> > Part of the real problem (as shown by the amazing amount of done and
>> > undone work for 2.2) is that you (collectively) want to do too many
>> > things at the same time (lots of changes *and* few releases). To be
>> > clear, the problem is not the "lots of changes" part (which you would
>> > like to "solve" by vetoing future compatibility-breaking improvements).
>> > Note that the following is not a criticism of CM (which has many
>> > features etc.) but some of the code that I've seen (in the parts which
>> > I've more closely looked at) do not make me think that it is mature
>> > enough that one can say "That's it!" and stick with the design forever.
>> > Again, all this (e.g. removing duplicate code and refactoring the
>> > design) can be considered unimportant and swept under the carpet but
>> > IMO *any* cleanup is good as it will contribute to the overall
>> > robustness and maintainability.
>> > Half-baked design will continue to itch.
>> 
>> Half-baked upgrade strategies also.
> 
> Unfortunately CM currently does not have the resources to match its ambitions
> concerning the upgrade strategies! That implies that some expectations
> must be dropped: IMHO, code improvement is, at this time, more important
> than upgrade easiness.

We disagree here obviously, but see also Phil's answer on this.

- Jörg




Re: [Math] What's the problem with interfaces?

Posted by Gilles Sadowski <gi...@harfang.homelinux.org>.
Hi Jorg.

> > 
> > From comments that were posted to the other thread, I gather that the main
> > trend is that, because some interfaces needed an upgrade, the "interface"
> > design tool is becoming "evil". Did I get this right?
> > 
> > I guess that you refer to "RandomData" and "RandomDataImpl". This is
> > indeed the typical example of abusing the "interface" tool. When only one
> > implementation is meaningful, an "interface" need not be defined.
> 
> No. It is about adding something to the interface, just like Phil said. If 
> you add a method to an abstract class, the derived ones are still binary 
> compatible; the same does not apply to interfaces.
> 
> We had discussions in *length* about this and I am not interested in 
> starting over again.

Well, I was not part of those discussions, and CM is _still_ full of Java
"interface"s; so I assume that it's acceptable that I give my point-of-view
now.

As I said, I'm not in favour of creating an interface for the sake of having
everything separated in "Foo" and "FooImpl".
I understand that having an abstract class has some definite practical
advantages. And for the things I had in mind and which I'm most concerned
about (coherent design), it might well be a non-issue: "interface" could be
changed to "abstract class".

The problems might come from the one thing which you can do with interface
but cannot with classes, namely, multiple inheritance.
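A minimal illustration of that capability (hypothetical types, not CM code): one class can honour several interface contracts at once, whereas it could extend only a single abstract base class.

```java
// Two independent contracts...
interface Differentiable {
    double derivative(double x);
}

interface Integrable {
    double integrate(double a, double b);
}

// ...fulfilled by one class. Had both contracts been abstract classes,
// Parabola could have extended only one of them; the other role would
// have to be expressed via delegation or adapter objects.
class Parabola implements Differentiable, Integrable {
    public double derivative(double x) {
        return 2 * x;                        // d/dx of x^2
    }
    public double integrate(double a, double b) {
        return (b * b * b - a * a * a) / 3;  // definite integral of x^2
    }
}

public class MultipleContractsDemo {
    public static void main(String[] args) {
        Parabola p = new Parabola();
        System.out.println(p.derivative(3.0));     // 6.0
        System.out.println(p.integrate(0.0, 3.0)); // 9.0
    }
}
```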

Did someone already analyze the situation in CM?
The issue might well be resolved similarly to the debate about checked
exceptions, i.e. within CM, "interface" may also prove to be an unnecessary
feature of the Java language.

> > The "interface" is not the way (preferred or not) to support alternative
> > implementations. As was already said, this is done with (abstract or not)
> > classes which an alternative implementation can inherit from.
> > Rather, the (Java) "interface" is supposed to represent the abstraction
> > (from the "real" world object) of everything that is needed to interact
> > with that object (i.e. its interface). It makes it possible to treat
> > different objects on an equal footing (from the caller's point-of-view).
> > But you all know that...
> > 
> > So what's the problem? Is it the binary compatibility, again? This is a
> > configuration management issue. When the compatibility is broken, you
> > change the major version number and/or the base package name. That was
> > settled, or not?
> 
> The point is that an interface should not be changed for a long time. 
> Otherwise every new release is a major one.

It is not a question of time but of major (incompatible) and minor
(compatible) releases. As I noted in the reply to Sebb, the real problem
is not supporting older releases. That's a policy issue.

> > It would be a pity to sacrifice a tool aimed at improving the design
> > because of such considerations as keeping backward compatibility with
> > versions that nobody here is going to support.
> 
> It's a pity for the user, because he will *never* be able to use the next 
> version as a drop-in.

I also answered that in the other post.

> > If some user is happy with version "M.m", he can use it forever. If he
> > wants to use a newer version "N.n", he should not expect it to be
> > compatible. It does not have to be! Non-compatible modifications are not
> > performed out of some urge for change but stem from the desire to get a
> > better product, bit by bit.
> 
> If that were true, you would currently have 10 different commons-lang, 10 
> different commons-collections and so on in your classpath.

No, it's the user's choice to select which version he wants.

You cannot have multiple versions of Linux running at the same time, but
nevertheless the kernel developers make releases at a much higher pace than
Commons...

> > Yes, it's not easy to get the interfaces right; so what? If you find that
> > you can improve the design, you do it and bump the major version number.
> > As someone pointed out, it's not as if we'll run out of numbers.
> 
> And people will stop using it, because they are annoyed when they are 
> depending in the end on n different versions of math, simply because of 
> transitive dependencies.

I think that you reason on the basic assumption that CM is close to
stability. Many problems (some bugs but also design consistency) have shown
that it is not. So, my opinion is that users will prefer a product that
continues to improve rather than something that is backward-compatible. I'm
one of those. If not, I'd be content with what is there and would not feel
the need to contribute.

> > Part of the real problem (as shown by the amazing amount of done and
> > undone work for 2.2) is that you (collectively) want to do too many things
> > at the same time (lots of changes *and* few releases). To be clear, the
> > problem is not the "lots of changes" part (which you would like to "solve"
> > by vetoing future compatibility-breaking improvements). Note that the
> > following is not a criticism of CM (which has many features etc.) but some
> > of the code that I've seen (in the parts which I've more closely looked
> > at) do not make me think that it is mature enough that one can say "That's
> > it!" and stick with the design forever.
> > Again, all this (e.g. removing duplicate code and refactoring the design)
> > can be considered unimportant and swept under the carpet but IMO *any*
> > cleanup is good as it will contribute to the overall robustness and
> > maintainability.
> > Half-baked design will continue to itch.
> 
> Half-baked upgrade strategies also.

Unfortunately CM currently does not have the resources to match its ambitions
concerning the upgrade strategies! That implies that some expectations
must be dropped: IMHO, code improvement is, at this time, more important
than upgrade easiness.


Regards,
Gilles



Re: [Math] What's the problem with interfaces?

Posted by Jörg Schaible <jo...@gmx.de>.
Hi Gilles,

Gilles Sadowski wrote:

> Hi.
> 
>> We have been talking about moving away from interfaces as the
>> preferred way to support people plugging in alternative
>> implementations because they have in several places gotten "behind"
>> due to the fact that adding anything to them breaks compatibility.
>> We should probably continue that discussion in a different thread.
> 
> [This is the different thread.]
> 
> From comments that were posted to the other thread, I gather that the main
> trend is that, because some interfaces needed an upgrade, the "interface"
> design tool is becoming "evil". Did I get this right?
> 
> I guess that you refer to "RandomData" and "RandomDataImpl". This is
> indeed the typical example of abusing the "interface" tool. When only one
> implementation is meaningful, an "interface" need not be defined.

No. It is about adding something to the interface, just like Phil said. If 
you add a method to an abstract class, the derived ones are still binary 
compatible; the same does not apply to interfaces.
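To make the point concrete, a minimal sketch (the Solver types below are hypothetical, not actual CM classes; recall that in the Java of this era an interface method cannot carry a body):

```java
// Hypothetical types illustrating the compatibility difference.

interface Solver {
    double solve(double x);
    // Adding another abstract method here would break every existing
    // implementor: classes compiled against the old interface fail at
    // runtime (AbstractMethodError) or no longer compile.
}

abstract class AbstractSolver implements Solver {
    // Adding a *concrete* method here later is source- and binary-compatible:
    // existing subclasses simply inherit it, without recompiling.
    public double solveOrDefault(double x, double fallback) {
        double y = solve(x);
        return Double.isNaN(y) ? fallback : y;
    }
}

// A subclass written (and possibly compiled) before solveOrDefault existed.
class HalvingSolver extends AbstractSolver {
    @Override
    public double solve(double x) {
        return x / 2;  // trivial stand-in for a real algorithm
    }
}

public class CompatDemo {
    public static void main(String[] args) {
        AbstractSolver s = new HalvingSolver();
        System.out.println(s.solve(4.0));               // 2.0
        System.out.println(s.solveOrDefault(4.0, 0.0)); // 2.0, inherited for free
    }
}
```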

We had discussions in *length* about this and I am not interested in 
starting over again.

> The "interface" is not the way (preferred or not) to support alternative
> implementations. As was already said, this is done with (abstract or not)
> classes which an alternative implementation can inherit from.
> Rather, the (Java) "interface" is supposed to represent the abstraction
> (from the "real" world object) of everything that is needed to interact
> with that object (i.e. its interface). It makes it possible to treat
> different objects on an equal footing (from the caller's point-of-view).
> But you all know that...
> 
> So what's the problem? Is it the binary compatibility, again? This is a
> configuration management issue. When the compatibility is broken, you
> change the major version number and/or the base package name. That was
> settled, or not?

The point is that an interface should not be changed for a long time. 
Otherwise every new release is a major one.

> It would be a pity to sacrifice a tool aimed at improving the design
> because of such considerations as keeping backward compatibility with
> versions that nobody here is going to support.

It's a pity for the user, because he will *never* be able to use the next 
version as a drop-in.

> If some user is happy with version "M.m", he can use it forever. If he
> wants to use a newer version "N.n", he should not expect it to be
> compatible. It does not have to be! Non-compatible modifications are not
> performed out of some urge for change but stem from the desire to get a
> better product, bit by bit.

If that were true, you would currently have 10 different commons-lang, 10 
different commons-collections and so on in your classpath.

> Yes, it's not easy to get the interfaces right; so what? If you find that
> you can improve the design, you do it and bump the major version number.
> As someone pointed out, it's not as if we'll run out of numbers.

And people will stop using it, because they are annoyed when they are 
depending in the end on n different versions of math, simply because of 
transitive dependencies.

> Part of the real problem (as shown by the amazing amount of done and
> undone work for 2.2) is that you (collectively) want to do too many things
> at the same time (lots of changes *and* few releases). To be clear, the
> problem is not the "lots of changes" part (which you would like to "solve"
> by vetoing future compatibility-breaking improvements). Note that the
> following is not a criticism of CM (which has many features etc.) but some
> of the code that I've seen (in the parts which I've more closely looked
> at) do not make me think that it is mature enough that one can say "That's
> it!" and stick with the design forever.
> Again, all this (e.g. removing duplicate code and refactoring the design)
> can be considered unimportant and swept under the carpet but IMO *any*
> cleanup is good as it will contribute to the overall robustness and
> maintainability.
> Half-baked design will continue to itch.

Half-baked upgrade strategies also.

- Jörg


---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@commons.apache.org
For additional commands, e-mail: dev-help@commons.apache.org