Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2008/10/02 21:21:29 UTC

[Hadoop Wiki] Update of "Release1.0Requirements" by DougCutting

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The following page has been changed by DougCutting:
http://wiki.apache.org/hadoop/Release1%2e0Requirements

------------------------------------------------------------------------------
     * Old 1.x clients can connect to new 1.y servers, where x <= y
     * Only bug fixes in 1.x.y releases and new features in 1.x.0 releases.
     * New !FileSystem clients must be able to call old methods when talking to old servers. This will generally be done by having old methods continue to use old RPC methods. However, it is legal to have new implementations of old methods call new RPC methods, as long as the library transparently handles the fallback case for old servers (see the sketch below).
+ 
+ ''Owen, you seem to be extending the ["Roadmap"]'s compatibility requirements to RPC protocols; is that right? I.e., clients must be back-compatible with older servers, and servers must be back-compatible with older clients.  If so, perhaps we should vote on this new policy and update ["Roadmap"] accordingly.  We could even start enforcing it before 1.0, so that, e.g., 0.20's protocols would need to be back-compatible with 0.19's but 0.21's would not.  --DougCutting''
  
  Question: Does the release number really matter?  Should we just keep adding features, improving back-compatibility, etc.?  Our ["Roadmap"] currently defines what a major release means.  Does this need updating?  -- !DougCutting
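
  To make the fallback case in the !FileSystem bullet above concrete, here is a minimal sketch of one way a client library could handle it: a new implementation of an old public method first tries the new RPC and transparently falls back to the old RPC when the server does not recognize it. All class and method names below are hypothetical and invented for illustration; they are not actual Hadoop APIs.

  {{{
import java.io.IOException;

// Hypothetical illustration only -- none of these class or method names are
// real Hadoop APIs.  An old public method (getStatus) keeps working against
// both old and new servers: it first tries the new RPC and, if the server
// reports an unknown method, transparently falls back to the old RPC.
public class CompatFileSystemClient {

  /** Hypothetical RPC-layer exception thrown when a server lacks a method. */
  static class UnknownRpcMethodException extends IOException {
    UnknownRpcMethodException(String msg) { super(msg); }
  }

  /** Hypothetical low-level RPC stubs; real code would use generated proxies. */
  interface RpcProxy {
    String newGetStatusRpc(String path) throws IOException;
    String oldGetStatusRpc(String path) throws IOException;
  }

  private final RpcProxy proxy;

  public CompatFileSystemClient(RpcProxy proxy) {
    this.proxy = proxy;
  }

  /** Old public method: callers see the same result whichever server they hit. */
  public String getStatus(String path) throws IOException {
    try {
      // Prefer the new RPC when the server supports it.
      return proxy.newGetStatusRpc(path);
    } catch (UnknownRpcMethodException e) {
      // Old server: fall back to the old RPC so existing behavior is preserved.
      return proxy.oldGetStatusRpc(path);
    }
  }
}
  }}}

  A real client library would probably also remember which RPCs a given server supports, so that calls against an old server do not pay for a failed probe every time; that detail is omitted here to keep the sketch short.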