Posted to users@subversion.apache.org by Gabriel Cooper <ga...@mediapulse.com> on 2006/07/26 19:54:49 UTC
Storing externals for future generations?
Hi there.

(First I'd like to say I'm rather new to externals, so I could be asking
about a feature that exists already...)

I'm currently in the process of making a duplicate of a web server for
an important client. Thing is, the sites on the box haven't been touched
in any significant way for over four years. As a result, /many/ of the
tools used to create the web server are versioned far ahead of what is
compatible with the sites today, and some very important pieces were
deprecated some time ago and eventually removed from the internet
entirely!

While looking at a new project that uses SVN externals it occurred to me
to wonder whether externals would solve this problem. If my cursory
digging and understanding is correct, the way externals work is that
instead of needing a local copy of a library and needing to manually
keep it up to date, externals will automatically do this for you by
looking at the originating source you tell it to and pulling from there
instead of the local repository.

So, thinking long term: what would happen if the scenario above
pertained not to installed programs, but to software libraries obtained
from external sources? If your external repository at
http://svn.somewhere.com/ImportantLibrary disappeared, would you have
access to your library anymore?

If I understand correctly, you would not. Subversion would attempt to
look for the latest version of your external repository and fail
altogether. And so presumably you'd be left with a non-functioning
application.

So why not keep a local copy of the repository and have externals
function such that it merely verifies that your repository has the most
up-to-date files before it commits them? That way even if your foreign
repositories disappear your files won't, and your application will
continue to run smoothly.

Gabriel.
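For anyone just meeting externals, the mechanism being described boils
down to a directory property. A quick sketch, reusing the hypothetical
URL from the message above:

```shell
# Map the subdirectory vendor/ImportantLibrary of this working copy
# onto a foreign repository URL. Every checkout/update re-fetches it
# from that URL -- which is exactly the fragility being described.
svn propset svn:externals \
    "vendor/ImportantLibrary http://svn.somewhere.com/ImportantLibrary" .
svn commit -m "Reference ImportantLibrary as an external"
svn update   # pulls vendor/ImportantLibrary from the foreign server
```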
---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Re: Storing externals for future generations?
Posted by John Waycott <ja...@cox.net>.
Gabriel Cooper wrote:
> John Waycott wrote:
>
>> As a CM manager, I have to ensure that developers always build with
>> the correct software and tools. I'm uncomfortable with the idea of
>> using an out-of-date (the local copy) externalLib if the current copy
>> is unavailable.
>
> Well... At that point you're either not using any library (*ka-blam!*
> you can't build) or you're using the last-known-good. Which resource
> you use is up to your particular project. As said in the Vendor Branch
> example page at one point they wanted to keep up to the bleeding edge
> of an Apache library, and eventually they cut back to releases only.
>
I need to clarify my statement: I'm uncomfortable using an out-of-date
version when I'm supposed to use the current version. This is a case
where I know I have to get the latest version to continue my work, but
it is not available. However, I usually want to be able to control when
I get it; I don't want to pick up the latest version of a library if I'm
not ready to use it. If it breaks my build unexpectedly, I have to go
back and figure out what happened.
-- John
---------------------------------------------------------------------
Re: Storing externals for future generations?
Posted by Gabriel Cooper <ga...@mediapulse.com>.
John Waycott wrote:
>> Perhaps there would be another way entirely to handle keeping the
>> "local external" up to date... perhaps it acts /exactly/ like a
>> normal external /unless/ an error occurs ("svn:propfind request
>> failed on .."). In all other cases it secretly checks the repository
>> versions of "externalLib" and automatically updates the "local
>> external" repository behind the scenes while providing you with your
>> up to date copy, and in the event of error it notifies you of the
>> local copy and uses that.
>>
> As a CM manager, I have to ensure that developers always build with the
> correct software and tools. I'm uncomfortable with the idea of using
> an out-of-date (the local copy) externalLib if the current copy is
> unavailable.
Well... At that point you're either not using any library (*ka-blam!*
you can't build) or you're using the last-known-good. Which resource you
use is up to your particular project. As said in the Vendor Branch
example page at one point they wanted to keep up to the bleeding edge of
an Apache library, and eventually they cut back to releases only.
The point of the proposal is to not be left dependent on an external
that no longer exists.
Presumably if an external isn't accessible for one reason or another,
you'd rather have a working project you can continue using than one that
doesn't build at all. If it were built into subversion itself, then I'd
say the best solution was the one I put at the end of the email (and
quoted above): the "local" external is completely transparent and acts
100% like a normal external does with the only exception being that
should your external be unavailable it asks you whether you want to use
the last used extraction or perhaps gives you options of tags or something.
> A property that specifies a library is from an external source could
> be used to control changes to the local copy so that developers could
> not commit to it directly.
Presumably it would still act like externals do now, wherein (I believe)
you have to commit the external itself directly back to the external
repository. Then on next checkout / update the system would pull your
changes down from the external, save them to the local external, etc.
etc. just like any other change to the library.
If you start wanting to modify the external, simply remove its "local
external" status in the local external's repository then implement the
"Vendor Branch" functionality. Certainly you're not going to be able to
automate keeping a modified branch up to date.
> Once they want to tag a release, all library references must point to
> tags as well to ensure build repeatability.
Hm... I don't know how hard it would be to refer to both a local
external and remote external by way of versions, tags, etc., but it
seems doable, and certainly would be a necessity for releases.
Gabriel.
---------------------------------------------------------------------
Re: Storing externals for future generations?
Posted by John Waycott <ja...@cox.net>.
Gabriel Cooper wrote:
>
> I'll give it a go. My svn-fu is low and it's Go-Time(tm) on Friday so
> you'll have to put up with abstracted descriptions.
>
[snip]
> Various scenarios play out here:
> ------- scenario one, external is unavailable --------
> Checking Externals...
> Found Local External 'externalLib'. Checking...
> http://svn.somewhere.com/externalLib is unreachable (500 Server Error),
> Use local copy of 'externalLib' or try again? [(L)ocal/(T)ry again] L
> externalLib will use local repository.
> ------- scenario two, external is up to date --------
> Checking Externals...
> Found Local External 'externalLib'. Checking...
> 'externalLib' is up to date; externalLib will use local repository.
> ------- scenario three, local copy is old --------
> Checking Externals...
> Found Local External 'externalLib'. Checking...
> http://svn.somewhere.com/externalLib has a newer version
> Use local copy of 'externalLib' or update? [(L)ocal/(U)pdate] U
>
> Fetching external item 'externalLib' for local repository
> A [... new versions of files are pulled down... ]
> 'externalLib' is up to date
> ----------------------------------------------------------
> (and then we continue...)
> ----------------------------------------------------------
>
> A superstuff/
> A superstuff/[... Super Stuff's various files are added to your local
> working copy ...]
>
> Fetching local external item 'externalLib' from local repository
> A superstuff/vendor/libs/externalLib/
> A superstuff/vendor/libs/externalLib/[... externalLib's various files
> are added to your local working copy ...]
>
> ----------------------------------------------------------
>
> Hope this on-the-fly example serves to enlighten on the suggestion...
> certainly flags could be presented to avoid the nuisance of user
> input, or perhaps a helpful default might prove the best choice.
> Perhaps there would be another way entirely to handle keeping the
> "local external" up to date... perhaps it acts /exactly/ like a normal
> external /unless/ an error occurs ("svn:propfind request failed on
> .."). In all other cases it secretly checks the repository versions of
> "externalLib" and automatically updates the "local external"
> repository behind the scenes while providing you with your up to date
> copy, and in the event of error it notifies you of the local copy and
> uses that.
>
As a CM manager, I have to ensure that developers always build with the
correct software and tools. I'm uncomfortable with the idea of using an
out-of-date (the local copy) externalLib if the current copy is
unavailable. At that point the developer is building the software
incorrectly which can cause problems. In the case where a project
developer and library developer need to collaborate (which is really
where your idea comes to play) you could keep the local copy
synchronized using a separate script.
A property that specifies a library is from an external source could be
used to control changes to the local copy so that developers could not
commit to it directly. Maybe a pre-commit hook could detect the property
and only allow changes to the vendor portion of the library tree from
the synchronizing script somehow? I hope that makes sense; I would not
want developers modifying the local copy of a library except in a part
of the tree where it explicitly is for that purpose. The last thing you
want is to have two externalLib versions 1.1 that are different.
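A hook along those lines might be sketched like so. Everything here is
an assumption, not existing Subversion behavior: the layout (mirrored
libraries under vendor/<name>/, with vendor/<name>/local/ reserved for
site changes) and the policy are made up for illustration.

```shell
#!/bin/sh
# Sketch of a pre-commit hook that rejects direct commits to mirrored
# vendor libraries, except inside their designated local/ subtree.
# Hypothetical layout: vendor/<name>/ is the mirror, vendor/<name>/local/
# is the only part developers may touch.

# Print every changed path that touches a mirrored library outside
# of its local/ area.
forbidden_paths() {
    grep '^vendor/' | grep -v '^vendor/[^/]*/local/'
}

if [ $# -eq 2 ]; then        # invoked by Subversion as: pre-commit REPOS TXN
    REPOS=$1 TXN=$2
    # grep exits non-zero when nothing matches, hence the || :
    BAD=$(svnlook changed -t "$TXN" "$REPOS" | awk '{ print $2 }' \
          | forbidden_paths || :)
    if [ -n "$BAD" ]; then
        echo "Direct commits to mirrored vendor paths are rejected:" >&2
        echo "$BAD" >&2
        echo "Update them through the synchronizing script instead." >&2
        exit 1
    fi
    exit 0
fi
```

The synchronizing script itself would need to be exempted somehow, e.g.
by a second check in the hook for a dedicated user.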
From a CM perspective, it is important to lay down rules for this sort
of use. We allow developers to externally link to a library's trunk only
during collaborative development. Once they want to tag a release, all
library references must point to tags as well to ensure build
repeatability. You should always know exactly which versions of the
libraries were built for any specific tagged software. My experience is
that failing to do that can really cause problems down the road.
---------------------------------------------------------------------
Re: Storing externals for future generations?
Posted by Gabriel Cooper <ga...@mediapulse.com>.
Ryan Schmidt wrote:
> On Jul 27, 2006, at 22:37, Gabriel Cooper wrote:
>
>> It seems the idea of a "local external" would be beneficial for those
>> that want the benefits of svn externals but want to be kept
>> up-to-date so they're not crippled should that external disappear
>> unexpectedly.
>>
>> One possible implementation would be a hybrid with the Vendor Branch
>> idea: Create a vendor repository then run a "local external" command
>> which imports the off-site directory into the specified local
>> repository /and/ marks the directory as an external. All local
>> projects then reference the local vendor repository as an external
>> and any time one of those local projects is checked out it cycles
>> through the local externals, updates them to the latest code, then
>> exports their data to the working copy. With a simple flag the
>> functionality could override whether it looks for new versions of the
>> external or uses the local repository's code.
>
> I don't understand your proposal. Can you show a set of commands that
> gives a demonstration?
I'll give it a go. My svn-fu is low and it's Go-Time(tm) on Friday so
you'll have to put up with abstracted descriptions.
External library:
http://svn.somewhere.com/externalLib
My application (we'll call it "super stuff"):
http://svn.local.com/superstuff
Super Stuff needs externalLib to run. externalLib works out of the box,
I don't need to edit it, but I do /need it/. Just like you said
initially, if their site is down and I need Super Stuff /now/ I'm in a
bind. So. I need a local copy of externalLib. But I want to stay up to
date with it too, just like Externals let me.
I set up a local repository for externalLib so I'll always have it in an
emergency.
http://svn.local.com/vendor/externalLib
I check out externalLib as a "local external" and dump it into my vendor
directory (for those just tuning in, this command doesn't exist!):
--> svn localcheckout http://svn.somewhere.com/externalLib
http://svn.local.com/vendor/externalLib
At this point if I were to go into the svn server and look at
[svnroot]/vendor/externalLib/db/revs/[revision#] it'd show in the diff
both the externals reference... like so...
PLAIN
K 13
svn:externals
V 53
engines http://svn.somewhere.org/externalLib
... and it would contain the added files being put into the repository
itself (which currently doesn't happen, no local copy of an external is
created)
Then I go into Super Stuff and make it an external:
svn propset svn:externals [...]
Then when I check out a new copy of Super Stuff, something along these
lines would happen:
svn co http://svn.local.com/superstuff
Various scenarios play out here:
------- scenario one, external is unavailable --------
Checking Externals...
Found Local External 'externalLib'. Checking...
http://svn.somewhere.com/externalLib is unreachable (500 Server Error),
Use local copy of 'externalLib' or try again? [(L)ocal/(T)ry again] L
externalLib will use local repository.
------- scenario two, external is up to date --------
Checking Externals...
Found Local External 'externalLib'. Checking...
'externalLib' is up to date; externalLib will use local repository.
------- scenario three, local copy is old --------
Checking Externals...
Found Local External 'externalLib'. Checking...
http://svn.somewhere.com/externalLib has a newer version
Use local copy of 'externalLib' or update? [(L)ocal/(U)pdate] U
Fetching external item 'externalLib' for local repository
A [... new versions of files are pulled down... ]
'externalLib' is up to date
----------------------------------------------------------
(and then we continue...)
----------------------------------------------------------
A superstuff/
A superstuff/[... Super Stuff's various files are added to your local
working copy ...]
Fetching local external item 'externalLib' from local repository
A superstuff/vendor/libs/externalLib/
A superstuff/vendor/libs/externalLib/[... externalLib's various files
are added to your local working copy ...]
----------------------------------------------------------
Hope this on-the-fly example serves to enlighten on the suggestion...
certainly flags could be presented to avoid the nuisance of user input,
or perhaps a helpful default might prove the best choice. Perhaps there
would be another way entirely to handle keeping the "local external" up
to date... perhaps it acts /exactly/ like a normal external /unless/ an
error occurs ("svn:propfind request failed on .."). In all other cases
it secretly checks the repository versions of "externalLib" and
automatically updates the "local external" repository behind the scenes
while providing you with your up to date copy, and in the event of error
it notifies you of the local copy and uses that.
At any rate, enjoy your weekend, I'm off!
Gabriel.
---------------------------------------------------------------------
Re: Storing externals for future generations?
Posted by Ryan Schmidt <su...@ryandesign.com>.
On Jul 27, 2006, at 22:37, Gabriel Cooper wrote:
> Ryan Schmidt wrote:
>
>> Your assessment sounds just about correct. The 3rd-party library
>> doesn't even have to disappear entirely for it to cause problems
>> for you -- they could decide to switch from Subversion to a
>> different version control system, or their server could merely be
>> inaccessible for a few hours at just the wrong time.
>>
>> For my projects, I've been therefore following the vendor branch
>> strategy, where I import 3rd-party software into my own
>> repository. This has the added advantage that I can make local
>> modifications to the 3rd-party software if I want to, and track
>> those changes, and still be able to upgrade smoothly to newer
>> releases.
>>
>> http://svnbook.red-bean.com/en/1.2/svn.advanced.vendorbr.html
>
> It seems the idea of a "local external" would be beneficial for
> those that want the benefits of svn externals but want to be kept
> up-to-date so they're not crippled should that external disappear
> unexpectedly.
>
> One possible implementation would be a hybrid with the Vendor
> Branch idea: Create a vendor repository then run a "local external"
> command which imports the off-site directory into the specified
> local repository /and/ marks the directory as an external. All
> local projects then reference the local vendor repository as an
> external and any time one of those local projects is checked out it
> cycles through the local externals, updates them to the latest
> code, then exports their data to the working copy. With a simple
> flag the functionality could override whether it looks for new
> versions of the external or uses the local repository's code.
I don't understand your proposal. Can you show a set of commands that
gives a demonstration?
---------------------------------------------------------------------
Re: Storing externals for future generations?
Posted by John Waycott <ja...@cox.net>.
Gabriel Cooper wrote:
> Ryan Schmidt wrote:
>
>> Your assessment sounds just about correct. The 3rd-party library
>> doesn't even have to disappear entirely for it to cause problems for
>> you -- they could decide to switch from Subversion to a different
>> version control system, or their server could merely be inaccessible
>> for a few hours at just the wrong time.
>>
>> For my projects, I've been therefore following the vendor branch
>> strategy, where I import 3rd-party software into my own repository.
>> This has the added advantage that I can make local modifications to
>> the 3rd-party software if I want to, and track those changes, and
>> still be able to upgrade smoothly to newer releases.
>>
>> http://svnbook.red-bean.com/en/1.2/svn.advanced.vendorbr.html
>
> It seems the idea of a "local external" would be beneficial for those
> that want the benefits of svn externals but want to be kept up-to-date
> so they're not crippled should that external disappear unexpectedly.
>
> One possible implementation would be a hybrid with the Vendor Branch
> idea: Create a vendor repository then run a "local external" command
> which imports the off-site directory into the specified local
> repository /and/ marks the directory as an external. All local
> projects then reference the local vendor repository as an external and
> any time one of those local projects is checked out it cycles through
> the local externals, updates them to the latest code, then exports
> their data to the working copy. With a simple flag the functionality
> could override whether it looks for new versions of the external or
> uses the local repository's code.
>
> Gabriel.
>
We are struggling with this problem too. We have hundreds of projects in
dozens of repositories, some sharing libraries and tools.
I've been kicking around the idea of an indirect repository link for
externals. The idea is that instead of svn:externals pointing to the
actual repository, it would just have a unique name for the external
project plus the relative path within that project. Another database
would map the unique name to the repository location. This is similar to
using relative paths with an extra level of indirection. The svn client
would have to know where the externals mapping database is, but it would
allow you to shuffle projects around as needed without affecting the
externals links.
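A minimal sketch of that idea, assuming a flat "name URL" mapping file
(the file format and the helper name are invented here, not anything
svn supports):

```shell
# Resolve a symbolic external name to its current repository URL via a
# one-line-per-entry mapping database, so svn:externals definitions
# never hard-code server locations.
#
# Hypothetical externals.map:
#   externalLib  http://svn.somewhere.com/externalLib
#   superLib     http://svn.local.com/vendor/superLib

resolve_external() {
    # $1 = mapping file, $2 = symbolic external name; prints the URL
    awk -v name="$2" '$1 == name { print $2; exit }' "$1"
}
```

An svn:externals definition could then carry only the symbolic name,
and a wrapper script would substitute the mapped URL before handing the
definition to svn. Moving a repository means editing one line in one
file.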
-- John
---------------------------------------------------------------------
Re: Storing externals for future generations?
Posted by Gabriel Cooper <ga...@mediapulse.com>.
Ryan Schmidt wrote:
> Your assessment sounds just about correct. The 3rd-party library
> doesn't even have to disappear entirely for it to cause problems for
> you -- they could decide to switch from Subversion to a different
> version control system, or their server could merely be inaccessible
> for a few hours at just the wrong time.
>
> For my projects, I've been therefore following the vendor branch
> strategy, where I import 3rd-party software into my own repository.
> This has the added advantage that I can make local modifications to
> the 3rd-party software if I want to, and track those changes, and
> still be able to upgrade smoothly to newer releases.
>
> http://svnbook.red-bean.com/en/1.2/svn.advanced.vendorbr.html
It seems the idea of a "local external" would be beneficial for those
that want the benefits of svn externals but want to be kept up-to-date
so they're not crippled should that external disappear unexpectedly.
One possible implementation would be a hybrid with the Vendor Branch
idea: Create a vendor repository then run a "local external" command
which imports the off-site directory into the specified local repository
/and/ marks the directory as an external. All local projects then
reference the local vendor repository as an external and any time one of
those local projects is checked out it cycles through the local
externals, updates them to the latest code, then exports their data to
the working copy. With a simple flag the functionality could override
whether it looks for new versions of the external or uses the local
repository's code.
Gabriel.
---------------------------------------------------------------------
Re: Storing externals for future generations?
Posted by Ryan Schmidt <su...@ryandesign.com>.
On Jul 26, 2006, at 21:54, Gabriel Cooper wrote:
> I'm currently in the process of making a duplicate of a web server
> for an important client. Thing is, the sites on the box haven't
> been touched in any significant way for over four years. That being
> said, many of the tools used to create the web server are versioned
> far ahead of what is compatible with the sites today and some very
> important pieces were some time ago deprecated and eventually
> removed from the internet entirely!
>
> While looking at a new project that uses SVN externals it occurred
> to me to wonder whether externals would solve this problem. If my
> cursory digging and understanding is correct, the way externals
> work is that instead of needing a local copy of a library and
> needing to manually keep it up to date, externals will
> automatically do this for you by looking at the originating source
> you tell it to and pull from there instead of the local repository.
>
> So. Thinking long term, what would happen if the scenario above
> pertained not to installed programs, but to software libraries
> obtained from external sources? If your external repositories from
> http://svn.somewhere.com/ImportantLibrary disappeared, would you
> have access to your library anymore?
>
> If I understand correctly, you would not. It would attempt to look
> for the latest version of your external repository and fail
> altogether. And so presumably you'd be left with a non-functioning
> application.
>
> So why not then keep a local copy of the repository and have
> externals function such that it merely verifies that your
> repository has the most up to date files before it commits them?
> That way even if your foreign repositories disappear your files
> won't and your application will continue to run smoothly.
Your assessment sounds just about correct. The 3rd-party library
doesn't even have to disappear entirely for it to cause problems for
you -- they could decide to switch from Subversion to a different
version control system, or their server could merely be inaccessible
for a few hours at just the wrong time.
For my projects, I've been therefore following the vendor branch
strategy, where I import 3rd-party software into my own repository.
This has the added advantage that I can make local modifications to
the 3rd-party software if I want to, and track those changes, and
still be able to upgrade smoothly to newer releases.
http://svnbook.red-bean.com/en/1.2/svn.advanced.vendorbr.html
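For reference, the procedure in that chapter boils down to three
commands (the URLs here are hypothetical, borrowed from the examples
elsewhere in this thread):

```shell
# 1. Import the pristine 3rd-party release into a vendor area.
svn import externalLib-1.0/ \
    http://svn.local.com/vendor/externalLib/current \
    -m "Import externalLib 1.0"

# 2. Tag the pristine release so future upgrades can be diffed
#    against it.
svn copy http://svn.local.com/vendor/externalLib/current \
         http://svn.local.com/vendor/externalLib/1.0 \
         -m "Tag pristine externalLib 1.0"

# 3. Copy the vendor code into the project; local patches now live,
#    and are versioned, in your own repository.
svn copy http://svn.local.com/vendor/externalLib/1.0 \
         http://svn.local.com/superstuff/trunk/vendor/externalLib \
         -m "Bring externalLib 1.0 into Super Stuff"
```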
---------------------------------------------------------------------