Posted to proton@qpid.apache.org by Keith W <ke...@gmail.com> on 2013/01/18 17:17:13 UTC

Changing the Proton build system to accommodate jni bindings

We are currently implementing the proton-jni binding for the proton-c
library. This binding implements the Java Proton-API, allowing Java
users to choose the C-based proton stack if they wish. The work is
being performed on the jni branch under PROTON-192 (for the JNI
work) and PROTON-194 (for the build system changes).

Currently, Proton has two independent build systems: one for proton-c
and its ruby/perl/python/php bindings (based on CMake/Make), and a
second, separate build system for proton-j (based on Maven).  As
proton-jni cuts across both technology areas, non-trivial changes
are required to both build systems.

The nub of the problem is the sharing of the Java Proton-API between
the proton-c and proton-j trees. Solutions based on svn:externals and
a simple tree copy have been considered and discussed at length on
conference calls.  We have identified drawbacks in both solutions.

This email proposes another solution. The hope is that this proposal
can be developed on list into a solution that is acceptable to all.

Proposal:

Move the Java Proton-API to the top level so that it can be shared
simply and conveniently by both proton-j and proton-c.

* Maven builds the proton-api JAR to a well-known location
* CMake/make builds proton-c and all bindings, including java. As
building the java binding requires the Java Proton-API, it is
optional and only takes place if proton-api has previously been
created by Maven (or found by other means).
* Maven builds proton-j
* Maven runs the system tests against either proton-c or proton-j. The
system tests are currently written in Python but are being augmented
with new ones written in Java.

Proposed Directory Structure:

proton
|-- release.sh/bat                  # Builds, tests and packages
|                                   # proton-c and proton-j
|-- pom.xml
|
|-- proton-api                      # Java Proton-API
|   |-- pom.xml                     # Creates proton-api.jar at a
|   |                               # well-known location in the tree
|   `-- src
|       `-- main
|
|-- proton-c                        # Proton-C and Proton-C bindings
|   |-- CMakeLists.txt
|   `-- bindings
|       |-- CMakeLists.txt
|       `-- java
|           |-- CMakeLists.txt
|           `-- jni
|               `-- CMakeLists.txt  # Creates proton-jni.jar using
|                                   # proton-api.jar from a well-known
|                                   # location in the tree, or skips
|                                   # if the jar cannot be found
|
|-- proton-j                        # Proton-J
|   |-- pom.xml                     # Creates proton-j.jar using
|   |                               # proton-api.jar (found via Maven)
|   `-- src
|       `-- main
|
`-- tests                           # Python and Java based system
    |                               # tests that test Proton-C and
    |                               # Proton-J equally
    |-- pom.xml
    `-- src
        `-- test

Use cases:

usecase #1 - Proton-C Developer exclusively focused on Proton-C

This developer may choose to check out the proton-c subtree.  The
build tool set remains unchanged from today, i.e. cmake and make.  By
default, all bindings will be built except for the java binding (as
cmake would fail to find proton-api.jar).  For flexibility, we would
include an option to have cmake search another directory, allowing
proton-api.jar to be found in non-standard locations.
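For illustration, the optional java-binding logic could look roughly
like this in CMake (a sketch only; the PROTON_API_DIR variable and the
paths searched are hypothetical, not settled build-file contents):

```cmake
# Hypothetical sketch of proton-c/bindings/java/CMakeLists.txt logic.
# PROTON_API_DIR lets a developer point cmake at a non-standard
# location for a prebuilt proton-api.jar.
set(PROTON_API_DIR "" CACHE PATH
    "Extra directory to search for proton-api.jar")

find_file(PROTON_API_JAR proton-api.jar
          PATHS ${CMAKE_SOURCE_DIR}/proton-api/target ${PROTON_API_DIR})

if (PROTON_API_JAR)
  message(STATUS "Found ${PROTON_API_JAR}; building the java binding")
  add_subdirectory(jni)
else ()
  # Skipping, not failing, keeps usecase #1 working with no Java tools.
  message(STATUS "proton-api.jar not found; skipping the java binding. "
                 "Build it first with 'mvn -f proton-api/pom.xml install' "
                 "or set -DPROTON_API_DIR=<dir>.")
endif ()
```

Because find_file caches its result, a developer who later builds
proton-api would re-run cmake to pick the jar up.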

usecase #2 - Proton-C Developer who wishes to run all system tests

This developer must check out the complete proton tree.  The build
tool set now includes maven in order to build the proton-api and run
the complete system test suite.

Typical commands used by this developer would be:

mvn -f proton-api/pom.xml install  # build and install proton-api to
                                   # the well-known location
<build proton-c>
mvn test                           # run all the system tests against
                                   # proton-c

usecase #3 - Proton-J Developer

This developer must check out the complete proton tree.   The build
tool set is maven.

mvn test -Pproton-j                # build all and run the system
                                   # tests against proton-j (the -P
                                   # switch specifies the Maven
                                   # profile to use)
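As an illustration, the proton-j profile might be defined along these
lines in the top-level pom.xml (the profile id matches the -P switch
above; the artifactId and property names here are hypothetical):

```xml
<!-- Hypothetical sketch: a Maven profile selecting the proton-j
     implementation for the system tests. -->
<profiles>
  <profile>
    <id>proton-j</id>
    <properties>
      <!-- The test harness could read this property to decide which
           implementation the system tests exercise. -->
      <proton.impl>proton-j</proton.impl>
    </properties>
    <dependencies>
      <dependency>
        <groupId>org.apache.qpid</groupId>
        <artifactId>proton-j-impl</artifactId>
        <version>${project.version}</version>
        <scope>test</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```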

usecase #4 - Release manager

The release manager must check out the complete proton tree.  The tool
set will be maven, cmake and make, in order to build and test all
artefacts. A release script could be added to make this a single step.

release.sh/bat
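As a sketch of what that single-step script might do, assuming the
directory layout above (the exact targets, profiles and packaging
flags are placeholders, not decided behaviour):

```shell
#!/bin/sh
# Hypothetical sketch of release.sh: sequence the Maven and cmake
# builds, run the system tests against both stacks, then package.
set -e

# 1. Build the shared Java API into its well-known location.
mvn -f proton-api/pom.xml install

# 2. Build proton-c and its bindings; the java binding is picked up
#    automatically now that proton-api.jar exists.
mkdir -p proton-c/build
(cd proton-c/build && cmake .. && make)

# 3. Build proton-j.
mvn -f proton-j/pom.xml install

# 4. Run the system tests against each implementation.
mvn test                 # against proton-c (the default)
mvn test -Pproton-j      # against proton-j

# 5. Package the source tarball from the top level, excluding VCS
#    metadata (GNU tar's --exclude-vcs).
tar --exclude-vcs -czf proton.tar.gz .
```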

Re: Changing the Proton build system to accommodate jni bindings

Posted by "Darryl L. Pierce" <dp...@redhat.com>.
On Fri, Jan 25, 2013 at 12:07:46PM -0500, Ken Giusti wrote:
> I'd like to know if the intent is to keep both the Java and Python test suites synchronized.  That is, when I write a new python test, am I expected to provide a similar test in Java?
> 
> If we hold off that decision for later the suites will diverge and getting them re-synced will be painful.

Are the tests using any sort of framework? I've mentioned in the past
(and have it on the back burner of my todos) looking into using
Cucumber for tests. It lets you define functional tests in a
language-agnostic way as a series of steps; you then write
language-specific implementations of those steps.

When a language is missing a step definition you get a test failure that
outputs the template for the step so you can cut-and-paste it into the
language's definitions and fill it in.

Perhaps it's time to bring that out for us to talk about?

-- 
Darryl L. Pierce, Sr. Software Engineer @ Red Hat, Inc.
Delivering value year after year.
Red Hat ranks #1 in value among software vendors.
http://www.redhat.com/promo/vendor/


Re: Changing the Proton build system to accommodate jni bindings

Posted by Rob Godfrey <ro...@gmail.com>.
On 25 January 2013 18:07, Ken Giusti <kg...@redhat.com> wrote:
> Thank you Phil, for providing that summary.  And, although I don't want to toss around a little gasoline (petrol), I have to ask:
>
>
>> === System Tests ===
>>
>> Returning to another discussion point, note that the proton/tests
>> folder
>> will contain both Python and Java test suites, each of which can be
>> run
>> against either proton-c or proton-j. Their division of responsibility
>> is
>> something that will emerge over time, and does not need to be fully
>> resolved right now.
>>
>
>
> I'd like to know if the intent is to keep both the Java and Python test suites synchronized.  That is, when I write a new python test, am I expected to provide a similar test in Java?
>
> If we hold off that decision for later the suites will diverge and getting them re-synced will be painful.

Since all the Python tests will continue to be run against the Java
implementation, there will be no need to implement a test in Java if
you write it in Python.

We believe the Python tests will be runnable across more platforms
than the Java ones (since JVM support is not available on some
hardware / OS combinations); as such, no Python test should be
"migrated" to Java, since this would reduce coverage.

For reasons of developer efficiency / tool integration, Java tests
have advantages.  Rather than write unit tests which *only* run
against the Java code, we will be writing system tests that run
against the pure Java and the JNI bindings.  This will give us extra
test coverage of the C implementation (and the JNI binding code) in
addition to testing the Java implementation.

-- Rob

>
> -K
>
> ----- Original Message -----
>> As promised, here is a proper write-up of how we're planning to
>> modify the
>> Proton build system.
>>
>>
>> === Requirements ===
>> I've updated the "Proton build system requirements" wiki page:
>> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
>>
>>
>> === Proposed solution ===
>>
>> The proposed solution changes Proton in two important ways.
>>
>> 1. proton will be built and distributed as a single project.
>> This means that the released source tarball will be created from the
>> top
>> level, not from the proton-c sub-folder.
>>
>> To support this, a CMakelists.txt file will be created at the top
>> level.
>> Therefore, an existing end user wishing to build proton-c will follow
>> the
>> usual steps when they receive the tar (e.g. tar xf proton.tar ; mkdir
>> proton/build ; cd proton/build ; cmake .. ; make install).
>>
>>
>> 2. Both proton-c and proton-j will be buildable and testable using
>> cmake,
>> although proton-j will retain Maven support.
>>
>> Expanding our cmake build to include proton-j solves two problems,
>> namely:
>> (i) Satisfying the JNI binding's dependency on proton-api.jar and
>> libqpid-proton.so.
>> (ii) Allowing RHEL and Fedora users to build proton-j despite the
>> lack of a
>> recent Maven version on their platforms.
>>
>> The cmake Java build will assume that the developer has already
>> downloaded
>> dependencies (e.g. bouncycastle.jar for SSL support), and will not
>> emulate
>> Maven's ability to fetch them from remote repositories. This support
>> could
>> be added in the future if it is deemed necessary.
>>
>
>>
>> Phil
>>
>>
>> On 24 January 2013 21:05, Phil Harvey <ph...@philharveyonline.com>
>> wrote:
>>
>> > In case anyone thinks World War 3 is about to break out, an
>> > approach
>> > emerged during the Proton task force meeting today that is probably
>> > acceptable to all the main protagonists.
>> >
>> > A brief summary follows. I've tried to avoid too many value
>> > judgements in
>> > the summary, to avoid fanning the embers.
>> >
>> > - We'll add a cmake file at the top level of the proton project,
>> > which
>> > will be able to build all of the Java and C code in one pass. This
>> > will be
>> > useful both for building the JNI bindings without undue
>> > contortions, and
>> > for building the entire project in Maven-less environments. The
>> > Maven pom
>> > files will remain, for use in more mainstream Java deployments.
>> >
>> > - No duplication of the proton-api code.
>> >
>> > - The source tarball will in the future be generated from the top
>> > level,
>> > rather than from the proton-c folder. This avoids the issues
>> > previously
>> > discussed whereby JNI bindings can't be built because proton-api is
>> > "missing" from the tarball. The new top level cmake file will mean
>> > that the
>> > commands required to build proton-c will be unchanged.
>> >
>> > I'll write up some proper notes this evening if I get a chance, and
>> > will
>> > update the wiki.
>> >
>> > Going back to Rajith's point earlier in this discussion, it is of
>> > course
>> > important that decisions like this are debated and agreed in
>> > public, so
>> > rest assured this will all be written up and Jira'd properly so
>> > that
>> > everyone has the opportunity to comment.
>> >
>> > Phil
>> > On Jan 24, 2013 3:01 PM, "Rob Godfrey" <ro...@gmail.com>
>> > wrote:
>> >
>> >> On 24 January 2013 15:49, Rafael Schloming <rh...@alum.mit.edu>
>> >> wrote:
>> >> > On Wed, Jan 23, 2013 at 6:44 PM, Rob Godfrey
>> >> > <rob.j.godfrey@gmail.com
>> >> >wrote:
>> >> >
>> >> >> Firstly I think it would be helpful if you made clear the
>> >> >> requirements
>> >> you
>> >> >> consider to be essential, nice to have,  unimportant and/or
>> >> detrimental.
>> >> >>
>> >> >> On 23 January 2013 20:17, Rafael Schloming <rh...@alum.mit.edu>
>> >> >> wrote:
>> >> >>
>> >> >> > On Wed, Jan 23, 2013 at 8:01 AM, Keith W
>> >> >> > <ke...@gmail.com>
>> >> wrote:
>> >> >> >
>> >>
>> >> [snip]
>> >>
>> >> >> > Given the above workflow, it seems like even with a
>> >> >> > relatively small
>> >> >> change
>> >> >> > like adding a getter, the scripted portion of the syncing
>> >> >> > effort is
>> >> going
>> >> >> > to be vanishingly small compared to the manual process of
>> >> >> > syncing the
>> >> >> > implementations. Perhaps I'm just envisioning a different
>> >> >> > workflow
>> >> than
>> >> >> > you, or maybe I'm missing some important scenarios. Could you
>> >> describe
>> >> >> what
>> >> >> > workflow(s) you envision and how the sync process would
>> >> >> > impacting
>> >> your
>> >> >> > productivity?
>> >> >> >
>> >> >> >
>> >> >> I differ strongly in my opinion here. Every time I need to drop
>> >> >> out of
>> >> my
>> >> >> development environment to run some ad-hoc script then there is
>> >> overhead...
>> >> >>
>> >> >
>> >> > Don't you end up dropping out of your development environment
>> >> > anyways in
>> >> > order to build the C code?
>> >> >
>> >>
>> >> No - presuming I have built the C at some point in the past and
>> >> there
>> >> is no change to the SWIG part, then I don't need to drop out of my
>> >> environment at all to do a build / test.
>> >>
>> >> >
>> >> >> Moreover if we are using svn to do this I presume we would be
>> >> >> having to
>> >> >> check in any change before the sync could be made. This means
>> >> >> that
>> >> every
>> >> >> edit to a file now has to be followed by commit and sync (which
>> >> >> would
>> >> >> obviously be an insane process).  Those of us behind corporate
>> >> firewalls
>> >> >> and proxies experience very degraded response times when
>> >> >> updating from
>> >> the
>> >> >> Apache repository.
>> >> >>
>> >> >
>> >> > I wasn't suggesting syncing via the repository, I was suggesting
>> >> > syncing
>> >> > the directories in the local checkout, i.e. effectively have a
>> >> > script
>> >> that
>> >> > just does 'cp -r ../proton-j/<blah...>/src
>> >> > bindings/<blah...>/src', but
>> >> > obviously with some logic in there to exclude .svn and the like
>> >> > and to
>> >> work
>> >> > regardless of what directory you're in. You could even as you
>> >> > suggest
>> >> below
>> >> > make the sync happen automatically from the build.
>> >> >
>> >>
>> >> And which runs on Windows too? :-)
>> >>
>> >> Really this is a non-starter for me.  I haven't seen anything to
>> >> change my mind that actively having the source in two places in
>> >> the
>> >> tree is insane.  It is far more sensible IMHO to simply bow to the
>> >> simple fact that both the C and the Java builds do depend on a
>> >> shared
>> >> definition of the API.
>> >>
>> >> The same issue actually exists right now in that both depend on
>> >> the
>> >> Python API ... just that that is not actually formally defined
>> >> anywhere.
>> >>
>> >> >
>> >> >>
>> >> >> Frankly I have doubts that any such sync script could be
>> >> >> produced that
>> >> >> would work across all the different environments that
>> >> >> developers may
>> >> work
>> >> >> in.  Mostly though I remain totally unconvinced that there is a
>> >> compelling
>> >> >> reason to do this. Rather than having a "sync" script to keep
>> >> >> two
>> >> copies in
>> >> >> svn... why not just test if in a svn directory and ../proton-j
>> >> >> is
>> >> present,
>> >> >> then copy ../proton-j/proton-api.  Then in the release process
>> >> >> just
>> >> copy
>> >> >> the proton-j/proton-api source into the relevant place in the
>> >> >> tarball?
>> >> >>
>> >> >
>> >> > That's really not that different from what I'm proposing, modulo
>> >> > whether
>> >> > you do the copy automatically or not, the only real difference
>> >> > is
>> >> whether
>> >> > you check in the copy or whether it just sits in your checkout
>> >> > during
>> >> > development and the release script does the copying. My issue
>> >> > with the
>> >> > release script doing the copying is that it makes the mapping
>> >> > from
>> >> release
>> >> > tarball back to svn significantly less transparent. I know as a
>> >> > user of
>> >> > other open source projects I sometimes find myself wanting to
>> >> > find the
>> >> svn
>> >> > source for a file included in a given release, I usually do this
>> >> > via
>> >> google
>> >> > and look at the thing via viewvc. It would be quite confusing
>> >> > to be
>> >> > browsing the release branch and have it be that different from
>> >> > what is
>> >> in
>> >> > the source tarball. I'd also worry about someone accidentally
>> >> > checking
>> >> in a
>> >> > copy, something that happens fairly frequently with other files
>> >> > that are
>> >> > perpetually locally modified. As you can imagine that would make
>> >> > kind
>> >> of a
>> >> > mess.
>> >>
>> >> So instead of doing a copy lets just be have the source code in
>> >> the
>> >> one place in the repo, and have the source tarball include that
>> >> directory (both the C and the Java tarballs) the same as they have
>> >> to
>> >> do with the tests.  Moreover let's not fake a new directory
>> >> structure,
>> >> let's just tar up from a higher level but exclude the bits that we
>> >> don't want for the C / java tar respectively.
>> >>
>> >> >
>> >> >
>> >> >
>> >> >>
>> >> >>
>> >> >>
>> >> >> >
>> >> >> > > 4. To switch to a particular SVN revision, simple SVN
>> >> >> > > commands are
>> >> run
>> >> >> > > (e.g. svn switch or svn update)
>> >> >> > > - Developer productivity
>> >> >> > >
>> >> >> > > 5. proton-c can be built, excluding its JNI binding,
>> >> >> > > without
>> >> requiring
>> >> >> > > non-standard tools*
>> >> >> > > 6. proton-c can be built, excluding its JNI binding, from a
>> >> standalone
>> >> >> > > checkout of the proton-c directory
>> >> >> > > - Developer productivity / tool familiarity
>> >> >> > >
>> >> >> > > Neutral
>> >> >> > >
>> >> >> > > 1. A "tarball" source release of proton-c can be built by a
>> >> >> > > user
>> >> >> > > without an external dependency on any other part of proton,
>> >> >> > > e.g.
>> >> >> > > proton-api.
>> >> >> > > 2. The aforementioned proton-c tarball release can be
>> >> >> > > produced by
>> >> >> > > performing a simple "svn export" of proton-c.
>> >> >> > > - If I were building proton-c for my platform for tarball,
>> >> >> > > I would
>> >> >> > > also want to run the tests to be sure proton-c functions
>> >> >> > > correctly.
>> >> >> > > For this reason I question the usefulness of a proton-c
>> >> >> > > tarball.  I
>> >> >> > > would want a tarball that included the whole tree including
>> >> >> > > the
>> >> tests.
>> >> >> > >
>> >> >> >
>> >> >> > The proton-c tarball does include the tests directory. The
>> >> >> > tests
>> >> >> directory
>> >> >> > is just pure python code, so once you've installed proton-c
>> >> >> > onto your
>> >> >> > system, you can run any of the proton tests just like you
>> >> >> > would run
>> >> any
>> >> >> > normal python script. As I mentioned in another post, the
>> >> >> > inclusion
>> >> of
>> >> >> > tests under both proton-c and proton-j is the one deviation
>> >> >> > in
>> >> directory
>> >> >> > structure from a pure svn export, and even this much is
>> >> >> > kindof a
>> >> pain as
>> >> >> > there is no way for the README to actually describe things
>> >> >> > properly
>> >> >> without
>> >> >> > being broken in either the svn tree or in the release
>> >> >> > artifact.
>> >> >> >
>> >> >> >
>> >> >> So, if we want to keep the source and the svn the same,
>> >> >> wouldn't it
>> >> make
>> >> >> more sense for the release tarballs to actually just be strict
>> >> >> subsets
>> >> of
>> >> >> the proton tree?  that is the proton-j tarball would be
>> >> >>
>> >> >> proton
>> >> >>   |
>> >> >>   +-- proton-j
>> >> >>   |
>> >> >>   +-- tests
>> >> >>
>> >> >> and the proton-c tarball would be
>> >> >>
>> >> >> proton
>> >> >>   |
>> >> >>   +-- proton-c
>> >> >>   |
>> >> >>   +-- tests
>> >> >>
>> >> >>
>> >> >> If we wanted to avoid a copy of the java API in the build then
>> >> >> we could
>> >> >> then actually just reference ../proton-j and include that
>> >> >> subset of
>> >> >> proton-j in the release source tarball?
>> >> >>
>> >> >
>> >> >  I'm not sure that helps with the README situation. You now
>> >> >  effectively
>> >> > have 3 views of the same tree, and it would be difficult to
>> >> > imagine
>> >> writing
>> >> > a single README that would make sense in every case. You'd
>> >> > pretty much
>> >> have
>> >> > to structure each sub directory as a standalone tree and each
>> >> > would need
>> >> > their own READMEs.
>> >>
>> >> Why - you build from the top level.  The only thing that is
>> >> different
>> >> is that you'd probably have a README_JAVA and a README_C and you'd
>> >> exclude the one you didn't want
>> >>
>> >> -- Rob
>> >>
>> >> >As is, the C tarball basically works on the assumption
>> >> > that the tests directory is a standalone tree which it simply
>> >> > includes
>> >> to
>> >> > eliminate the extra step of downloading it, so it's similar in
>> >> > that
>> >> respect
>> >> > but avoids the extra depth.
>> >> >
>> >> > --Rafael
>> >>
>> >
>>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Ken Giusti <kg...@redhat.com>.
Thank you Phil, for providing that summary.  And, although I don't want to toss around a little gasoline (petrol), I have to ask:


> === System Tests ===
> 
> Returning to another discussion point, note that the proton/tests
> folder
> will contain both Python and Java test suites, each of which can be
> run
> against either proton-c or proton-j. Their division of responsibility
> is
> something that will emerge over time, and does not need to be fully
> resolved right now.
> 


I'd like to know if the intent is to keep both the Java and Python test suites synchronized.  That is, when I write a new python test, am I expected to provide a similar test in Java?

If we hold off that decision for later the suites will diverge and getting them re-synced will be painful.

-K

----- Original Message -----
> As promised, here is a proper write-up of how we're planning to
> modify the
> Proton build system.
> 
> 
> === Requirements ===
> I've updated the "Proton build system requirements" wiki page:
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
> 
> 
> === Proposed solution ===
> 
> The proposed solution changes Proton in two important ways.
> 
> 1. proton will be built and distributed as a single project.
> This means that the released source tarball will be created from the
> top
> level, not from the proton-c sub-folder.
> 
> To support this, a CMakelists.txt file will be created at the top
> level.
> Therefore, an existing end user wishing to build proton-c will follow
> the
> usual steps when they receive the tar (e.g. tar xf proton.tar ; mkdir
> proton/build ; cd proton/build ; cmake .. ; make install).
> 
> 
> 2. Both proton-c and proton-j will be buildable and testable using
> cmake,
> although proton-j will retain Maven support.
> 
> Expanding our cmake build to include proton-j solves two problems,
> namely:
> (i) Satisfying the JNI binding's dependency on proton-api.jar and
> libqpid-proton.so.
> (ii) Allowing RHEL and Fedora users to build proton-j despite the
> lack of a
> recent Maven version on their platforms.
> 
> The cmake Java build will assume that the developer has already
> downloaded
> dependencies (e.g. bouncycastle.jar for SSL support), and will not
> emulate
> Maven's ability to fetch them from remote repositories. This support
> could
> be added in the future if it is deemed necessary.
> 

> 
> Phil
> 
> 
> On 24 January 2013 21:05, Phil Harvey <ph...@philharveyonline.com>
> wrote:
> 
> > In case anyone thinks World War 3 is about to break out, an
> > approach
> > emerged during the Proton task force meeting today that is probably
> > acceptable to all the main protagonists.
> >
> > A brief summary follows. I've tried to avoid too many value
> > judgements in
> > the summary, to avoid fanning the embers.
> >
> > - We'll add a cmake file at the top level of the proton project,
> > which
> > will be able to build all of the Java and C code in one pass. This
> > will be
> > useful both for building the JNI bindings without undue
> > contortions, and
> > for building the entire project in Maven-less environments. The
> > Maven pom
> > files will remain, for use in more mainstream Java deployments.
> >
> > - No duplication of the proton-api code.
> >
> > - The source tarball will in the future be generated from the top
> > level,
> > rather than from the proton-c folder. This avoids the issues
> > previously
> > discussed whereby JNI bindings can't be built because proton-api is
> > "missing" from the tarball. The new top level cmake file will mean
> > that the
> > commands required to build proton-c will be unchanged.
> >
> > I'll write up some proper notes this evening if I get a chance, and
> > will
> > update the wiki.
> >
> > Going back to Rajith's point earlier in this discussion, it is of
> > course
> > important that decisions like this are debated and agreed in
> > public, so
> > rest assured this will all be written up and Jira'd properly so
> > that
> > everyone has the opportunity to comment.
> >
> > Phil
> > On Jan 24, 2013 3:01 PM, "Rob Godfrey" <ro...@gmail.com>
> > wrote:
> >
> >> On 24 January 2013 15:49, Rafael Schloming <rh...@alum.mit.edu>
> >> wrote:
> >> > On Wed, Jan 23, 2013 at 6:44 PM, Rob Godfrey
> >> > <rob.j.godfrey@gmail.com
> >> >wrote:
> >> >
> >> >> Firstly I think it would be helpful if you made clear the
> >> >> requirements
> >> you
> >> >> consider to be essential, nice to have,  unimportant and/or
> >> detrimental.
> >> >>
> >> >> On 23 January 2013 20:17, Rafael Schloming <rh...@alum.mit.edu>
> >> >> wrote:
> >> >>
> >> >> > On Wed, Jan 23, 2013 at 8:01 AM, Keith W
> >> >> > <ke...@gmail.com>
> >> wrote:
> >> >> >
> >>
> >> [snip]
> >>
> >> >> > Given the above workflow, it seems like even with a
> >> >> > relatively small
> >> >> change
> >> >> > like adding a getter, the scripted portion of the syncing
> >> >> > effort is
> >> going
> >> >> > to be vanishingly small compared to the manual process of
> >> >> > syncing the
> >> >> > implementations. Perhaps I'm just envisioning a different
> >> >> > workflow
> >> than
> >> >> > you, or maybe I'm missing some important scenarios. Could you
> >> describe
> >> >> what
> >> >> > workflow(s) you envision and how the sync process would
> >> >> > impacting
> >> your
> >> >> > productivity?
> >> >> >
> >> >> >
> >> >> I differ strongly in my opinion here. Every time I need to drop
> >> >> out of
> >> my
> >> >> development environment to run some ad-hoc script then there is
> >> overhead...
> >> >>
> >> >
> >> > Don't you end up dropping out of your development environment
> >> > anyways in
> >> > order to build the C code?
> >> >
> >>
> >> No - presuming I have built the C at some point in the past and
> >> there
> >> is no change to the SWIG part, then I don't need to drop out of my
> >> environment at all to do a build / test.
> >>
> >> >
> >> >> Moreover if we are using svn to do this I presume we would be
> >> >> having to
> >> >> check in any change before the sync could be made. This means
> >> >> that
> >> every
> >> >> edit to a file now has to be followed by commit and sync (which
> >> >> would
> >> >> obviously be an insane process).  Those of us behind corporate
> >> firewalls
> >> >> and proxies experience very degraded response times when
> >> >> updating from
> >> the
> >> >> Apache repository.
> >> >>
> >> >
> >> > I wasn't suggesting syncing via the repository, I was suggesting
> >> > syncing
> >> > the directories in the local checkout, i.e. effectively have a
> >> > script
> >> that
> >> > just does 'cp -r ../proton-j/<blah...>/src
> >> > bindings/<blah...>/src', but
> >> > obviously with some logic in there to exclude .svn and the like
> >> > and to
> >> work
> >> > regardless of what directory you're in. You could even as you
> >> > suggest
> >> below
> >> > make the sync happen automatically from the build.
> >> >
> >>
> >> And which runs on Windows too? :-)
> >>
> >> Really this is a non-starter for me.  I haven't seen anything to
> >> change my mind that actively having the source in two places in
> >> the
> >> tree is insane.  It is far more sensible IMHO to simply bow to the
> >> simple fact that both the C and the Java builds do depend on a
> >> shared
> >> definition of the API.
> >>
> >> The same issue actually exists right now in that both depend on
> >> the
> >> Python API ... just that that is not actually formally defined
> >> anywhere.
> >>
> >> >
> >> >>
> >> >> Frankly I have doubts that any such sync script could be
> >> >> produced that
> >> >> would work across all the different environments that
> >> >> developers may
> >> work
> >> >> in.  Mostly though I remain totally unconvinced that there is a
> >> compelling
> >> >> reason to do this. Rather than having a "sync" script to keep
> >> >> two
> >> copies in
> >> >> svn... why not just test if in a svn directory and ../proton-j
> >> >> is
> >> present,
> >> >> then copy ../proton-j/proton-api.  Then in the release process
> >> >> just
> >> copy
> >> >> the proton-j/proton-api source into the relevant place in the
> >> >> tarball?
> >> >>
> >> >
> >> > That's really not that different from what I'm proposing, modulo
> >> > whether
> >> > you do the copy automatically or not, the only real difference
> >> > is
> >> whether
> >> > you check in the copy or whether it just sits in your checkout
> >> > during
> >> > development and the release script does the copying. My issue
> >> > with the
> >> > release script doing the copying is that it makes the mapping
> >> > from
> >> release
> >> > tarball back to svn significantly less transparent. I know as a
> >> > user of
> >> > other open source projects I sometimes find myself wanting to
> >> > find the
> >> svn
> >> > source for a file included in a given release, I usually do this
> >> > via
> >> google
> >> > and look at the thing via viewvc. It would be quite confusing
> >> > to be
> >> > browsing the release branch and have it be that different from
> >> > what is
> >> in
> >> > the source tarball. I'd also worry about someone accidentally
> >> > checking
> >> in a
> >> > copy, something that happens fairly frequently with other files
> >> > that are
> >> > perpetually locally modified. As you can imagine that would make
> >> > kind
> >> of a
> >> > mess.
> >>
> >> So instead of doing a copy lets just be have the source code in
> >> the
> >> one place in the repo, and have the source tarball include that
> >> directory (both the C and the Java tarballs) the same as they have
> >> to
> >> do with the tests.  Moreover let's not fake a new directory
> >> structure,
> >> let's just tar up from a higher level but exclude the bits that we
> >> don't want for the C / java tar respectively.
> >>
> >> >
> >> >
> >> >
> >> >>
> >> >>
> >> >>
> >> >> >
> >> >> > > 4. To switch to a particular SVN revision, simple SVN
> >> >> > > commands are
> >> run
> >> >> > > (e.g. svn switch or svn update)
> >> >> > > - Developer productivity
> >> >> > >
> >> >> > > 5. proton-c can be built, excluding its JNI binding,
> >> >> > > without
> >> requiring
> >> >> > > non-standard tools*
> >> >> > > 6. proton-c can be built, excluding its JNI binding, from a
> >> standalone
> >> >> > > checkout of the proton-c directory
> >> >> > > - Developer productivity / tool familiarity
> >> >> > >
> >> >> > > Neutral
> >> >> > >
> >> >> > > 1. A "tarball" source release of proton-c can be built by a
> >> >> > > user
> >> >> > > without an external dependency on any other part of proton,
> >> >> > > e.g.
> >> >> > > proton-api.
> >> >> > > 2. The aforementioned proton-c tarball release can be
> >> >> > > produced by
> >> >> > > performing a simple "svn export" of proton-c.
> >> >> > > - If I were building proton-c for my platform for tarball,
> >> >> > > I would
> >> >> > > also want to run the tests to be sure proton-c functions
> >> >> > > correctly.
> >> >> > > For this reason I question the usefulness of a proton-c
> >> >> > > tarball.  I
> >> >> > > would want a tarball that included the whole tree including
> >> >> > > the
> >> tests.
> >> >> > >
> >> >> >
> >> >> > The proton-c tarball does include the tests directory. The
> >> >> > tests
> >> >> directory
> >> >> > is just pure python code, so once you've installed proton-c
> >> >> > onto your
> >> >> > system, you can run any of the proton tests just like you
> >> >> > would run
> >> any
> >> >> > normal python script. As I mentioned in another post, the
> >> >> > inclusion
> >> of
> >> >> > tests under both proton-c and proton-j is the one deviation
> >> >> > in
> >> directory
> >> >> > structure from a pure svn export, and even this much is
> >> >> > kind of a
> >> pain as
> >> >> > there is no way for the README to actually describe things
> >> >> > properly
> >> >> without
> >> >> > being broken in either the svn tree or in the release
> >> >> > artifact.
> >> >> >
> >> >> >
> >> >> So, if we want to keep the source and the svn the same,
> >> >> wouldn't it
> >> make
> >> >> more sense for the release tarballs to actually just be strict
> >> >> subsets
> >> of
> >> >> the proton tree?  that is the proton-j tarball would be
> >> >>
> >> >> proton
> >> >>   |
> >> >>   +-- proton-j
> >> >>   |
> >> >>   +-- tests
> >> >>
> >> >> and the proton-c tarball would be
> >> >>
> >> >> proton
> >> >>   |
> >> >>   +-- proton-c
> >> >>   |
> >> >>   +-- tests
> >> >>
> >> >>
> >> >> If we wanted to avoid a copy of the java API in the build then
> >> >> we could
> >> >> then actually just reference ../proton-j and include that
> >> >> subset of
> >> >> proton-j in the release source tarball?
> >> >>
> >> >
> >> >  I'm not sure that helps with the README situation. You now
> >> >  effectively
> >> > have 3 views of the same tree, and it would be difficult to
> >> > imagine
> >> writing
> >> > a single README that would make sense in every case. You'd
> >> > pretty much
> >> have
> >> > to structure each sub directory as a standalone tree and each
> >> > would need
> >> > their own READMEs.
> >>
> >> Why - you build from the top level.  The only thing that is
> >> different
> >> is that you'd probably have a README_JAVA and a README_C and you'd
> >> exclude the one you didn't want
> >>
> >> -- Rob
> >>
> >> > As is, the C tarball basically works on the assumption
> >> > that the tests directory is a standalone tree which it simply
> >> > includes
> >> to
> >> > eliminate the extra step of downloading it, so it's similar in
> >> > that
> >> respect
> >> > but avoids the extra depth.
> >> >
> >> > --Rafael
> >>
> >
> 

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Fri, Jan 25, 2013 at 9:56 AM, Phil Harvey <ph...@philharveyonline.com>wrote:

> As promised, here is a proper write-up of how we're planning to modify the
> Proton build system.
>
>
> === Requirements ===
> I've updated the "Proton build system requirements" wiki page:
>
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
>
>
> === Proposed solution ===
>
> The proposed solution changes Proton in two important ways.
>
> 1. proton will be built and distributed as a single project.
> This means that the released source tarball will be created from the top
> level, not from the proton-c sub-folder.
>
> To support this, a CMakeLists.txt file will be created at the top level.
> Therefore, an existing end user wishing to build proton-c will follow the
> usual steps when they receive the tar (e.g. tar xf proton.tar ; mkdir
> proton/build ; cd proton/build ; cmake .. ; make install).
>
>
> 2. Both proton-c and proton-j will be buildable and testable using cmake,
> although proton-j will retain Maven support.
>
> Expanding our cmake build to include proton-j solves two problems, namely:
> (i) Satisfying the JNI binding's dependency on proton-api.jar and
> libqpid-proton.so.
> (ii) Allowing RHEL and Fedora users to build proton-j despite the lack of a
> recent Maven version on their platforms.
>
> The cmake Java build will assume that the developer has already downloaded
> dependencies (e.g. bouncycastle.jar for SSL support), and will not emulate
> Maven's ability to fetch them from remote repositories. This support could
> be added in the future if it is deemed necessary.
>

Two comments here. First, I think it's important to be clear that as part
of this proposal the root of the tree will emphasize the cmake build.  As
discussed on the call, the README will headline with the cmake build
instructions with the maven build as more of a secondary footnote. The
reasoning here being that the C code depends significantly more on manual
install testing than the Java, and the source code is really the primary
distribution mechanism for C, whereas most people will probably never
bother with the source when consuming the Java code.

Secondly, I suspect it's probably technically feasible (and possibly quite
natural) to structure the build such that an export of the pure Java
portion of the tree would constitute a completely functioning maven build.
This would leave us with the option to produce two source tarballs if we
wished where the Java source tarball was simply a nested subset of the full
source tarball. I don't know that we necessarily need this, but it might be
good to keep the option open if in the future we find the cmake oriented
source tarball is an obstacle for Java users.


>
> === System Tests ===
>
> Returning to another discussion point, note that the proton/tests folder
> will contain both Python and Java test suites, each of which can be run
> against either proton-c or proton-j. Their division of responsibility is
> something that will emerge over time, and does not need to be fully
> resolved right now.
>
>
As a final note on this point, I don't think this needs to be resolved to
proceed with the build; however, I do think it is an important discussion to
have sooner rather than later if a significant investment in new tests is
planned.

--Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Phil Harvey <ph...@philharveyonline.com>.
As promised, here is a proper write-up of how we're planning to modify the
Proton build system.


=== Requirements ===
I've updated the "Proton build system requirements" wiki page:
https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements


=== Proposed solution ===

The proposed solution changes Proton in two important ways.

1. proton will be built and distributed as a single project.
This means that the released source tarball will be created from the top
level, not from the proton-c sub-folder.

To support this, a CMakeLists.txt file will be created at the top level.
Therefore, an existing end user wishing to build proton-c will follow the
usual steps when they receive the tar (e.g. tar xf proton.tar ; mkdir
proton/build ; cd proton/build ; cmake .. ; make install).


2. Both proton-c and proton-j will be buildable and testable using cmake,
although proton-j will retain Maven support.

Expanding our cmake build to include proton-j solves two problems, namely:
(i) Satisfying the JNI binding's dependency on proton-api.jar and
libqpid-proton.so.
(ii) Allowing RHEL and Fedora users to build proton-j despite the lack of a
recent Maven version on their platforms.

The cmake Java build will assume that the developer has already downloaded
dependencies (e.g. bouncycastle.jar for SSL support), and will not emulate
Maven's ability to fetch them from remote repositories. This support could
be added in the future if it is deemed necessary.
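
As a purely illustrative sketch (the jar location, directory names, and
variable name below are assumptions, not the actual tree layout), the
top-level CMake file could satisfy point (i) by looking for a previously
Maven-built proton-api.jar and enabling the JNI binding only when it is
found:

```cmake
# Hypothetical sketch only: enable the JNI binding when Maven has
# already produced proton-api.jar; otherwise skip it quietly.
find_file(PROTON_API_JAR proton-api.jar
          PATHS ${CMAKE_SOURCE_DIR}/proton-j/proton-api/target)
if(PROTON_API_JAR)
  add_subdirectory(proton-c/bindings/java)
else()
  message(STATUS "proton-api.jar not found - skipping the JNI binding")
endif()
```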

=== System Tests ===

Returning to another discussion point, note that the proton/tests folder
will contain both Python and Java test suites, each of which can be run
against either proton-c or proton-j. Their division of responsibility is
something that will emerge over time, and does not need to be fully
resolved right now.


Phil


On 24 January 2013 21:05, Phil Harvey <ph...@philharveyonline.com> wrote:

> In case anyone thinks World War 3 is about to break out, an approach
> emerged during the Proton task force meeting today that is probably
> acceptable to all the main protagonists.
>
> A brief summary follows. I've tried to avoid too many value judgements in
> the summary, to avoid fanning the embers.
>
> - We'll add a cmake file at the top level of the proton project, which
> will be able to build all of the Java and C code in one pass. This will be
> useful both for building the JNI bindings without undue contortions, and
> for building the entire project in Maven-less environments. The Maven pom
> files will remain, for use in more mainstream Java deployments.
>
> - No duplication of the proton-api code.
>
> - The source tarball will in the future be generated from the top level,
> rather than from the proton-c folder. This avoids the issues previously
> discussed whereby JNI bindings can't be built because proton-api is
> "missing" from the tarball. The new top level cmake file will mean that the
> commands required to build proton-c will be unchanged.
>
> I'll write up some proper notes this evening if I get a chance, and will
> update the wiki.
>
> Going back to Rajith's point earlier in this discussion, it is of course
> important that decisions like this are debated and agreed in public, so
> rest assured this will all be written up and Jira'd properly so that
> everyone has the opportunity to comment.
>
> Phil
> On Jan 24, 2013 3:01 PM, "Rob Godfrey" <ro...@gmail.com> wrote:
>
>> On 24 January 2013 15:49, Rafael Schloming <rh...@alum.mit.edu> wrote:
>> > On Wed, Jan 23, 2013 at 6:44 PM, Rob Godfrey <rob.j.godfrey@gmail.com
>> >wrote:
>> >
>> >> Firstly I think it would be helpful if you made clear the requirements
>> you
>> >> consider to be essential, nice to have,  unimportant and/or
>> detrimental.
>> >>
>> >> On 23 January 2013 20:17, Rafael Schloming <rh...@alum.mit.edu> wrote:
>> >>
>> >> > On Wed, Jan 23, 2013 at 8:01 AM, Keith W <ke...@gmail.com>
>> wrote:
>> >> >
>>
>> [snip]
>>
>> >> > Given the above workflow, it seems like even with a relatively small
>> >> change
>> >> > like adding a getter, the scripted portion of the syncing effort is
>> going
>> >> > to be vanishingly small compared to the manual process of syncing the
>> >> > implementations. Perhaps I'm just envisioning a different workflow
>> than
>> >> > you, or maybe I'm missing some important scenarios. Could you
>> describe
>> >> what
> >> > workflow(s) you envision and how the sync process would impact
>> your
>> >> > productivity?
>> >> >
>> >> >
>> >> I differ strongly in my opinion here. Every time I need to drop out of
>> my
>> >> development environment to run some ad-hoc script then there is
>> overhead...
>> >>
>> >
>> > Don't you end up dropping out of your development environment anyways in
>> > order to build the C code?
>> >
>>
>> No - presuming I have built the C at some point in the past and there
> >> is no change to the SWIG part, then I don't need to drop out of my
>> environment at all to do a build / test.
>>
>> >
>> >> Moreover if we are using svn to do this I presume we would be having to
>> >> check in any change before the sync could be made. This means that
>> every
>> >> edit to a file now has to be followed by commit and sync (which would
>> >> obviously be an insane process).  Those of us behind corporate
>> firewalls
>> >> and proxies experience very degraded response times when updating from
>> the
>> >> Apache repository.
>> >>
>> >
>> > I wasn't suggesting syncing via the repository, I was suggesting syncing
>> > the directories in the local checkout, i.e. effectively have a script
>> that
>> > just does 'cp -r ../proton-j/<blah...>/src bindings/<blah...>/src', but
>> > obviously with some logic in there to exclude .svn and the like and to
>> work
>> > regardless of what directory you're in. You could even as you suggest
>> below
>> > make the sync happen automatically from the build.
>> >
>>
>> And which runs on Windows too? :-)
>>
>> Really this is a non-starter for me.  I haven't seen anything to
>> change my mind that actively having the source in two places in the
>> tree is insane.  It is far more sensible IMHO to simply bow to the
>> simple fact that both the C and the Java builds do depend on a shared
>> definition of the API.
>>
>> The same issue actually exists right now in that both depend on the
>> Python API ... just that that is not actually formally defined
>> anywhere.
>>
>> >
>> >>
>> >> Frankly I have doubts that any such sync script could be produced that
>> >> would work across all the different environments that developers may
>> work
>> >> in.  Mostly though I remain totally unconvinced that there is a
>> compelling
>> >> reason to do this. Rather than having a "sync" script to keep two
>> copies in
>> >> svn... why not just test if in a svn directory and ../proton-j is
>> present,
>> >> then copy ../proton-j/proton-api.  Then in the release process just
>> copy
>> >> the proton-j/proton-api source into the relevant place in the tarball?
>> >>
>> >
>> > That's really not that different from what I'm proposing, modulo whether
>> > you do the copy automatically or not, the only real difference is
>> whether
>> > you check in the copy or whether it just sits in your checkout during
>> > development and the release script does the copying. My issue with the
>> > release script doing the copying is that it makes the mapping from
>> release
>> > tarball back to svn significantly less transparent. I know as a user of
>> > other open source projects I sometimes find myself wanting to find the
>> svn
>> > source for a file included in a given release, I usually do this via
>> google
> >> > and look at the thing via viewvc. It would be quite confusing to be
>> > browsing the release branch and have it be that different from what is
>> in
>> > the source tarball. I'd also worry about someone accidentally checking
>> in a
>> > copy, something that happens fairly frequently with other files that are
>> > perpetually locally modified. As you can imagine that would make kind
>> of a
>> > mess.
>>
>> So instead of doing a copy let's just have the source code in
>> one place in the repo, and have the source tarball include that
>> directory (both the C and the Java tarballs) the same as they have to
>> do with the tests.  Moreover let's not fake a new directory structure,
>> let's just tar up from a higher level but exclude the bits that we
>> don't want for the C / Java tar respectively.
>>
>> >
>> >
>> >
>> >>
>> >>
>> >>
>> >> >
>> >> > > 4. To switch to a particular SVN revision, simple SVN commands are
>> run
>> >> > > (e.g. svn switch or svn update)
>> >> > > - Developer productivity
>> >> > >
>> >> > > 5. proton-c can be built, excluding its JNI binding, without
>> requiring
>> >> > > non-standard tools*
>> >> > > 6. proton-c can be built, excluding its JNI binding, from a
>> standalone
>> >> > > checkout of the proton-c directory
>> >> > > - Developer productivity / tool familiarity
>> >> > >
>> >> > > Neutral
>> >> > >
>> >> > > 1. A "tarball" source release of proton-c can be built by a user
>> >> > > without an external dependency on any other part of proton, e.g.
>> >> > > proton-api.
>> >> > > 2. The aforementioned proton-c tarball release can be produced by
>> >> > > performing a simple "svn export" of proton-c.
>> >> > > - If I were building proton-c for my platform for tarball, I would
>> >> > > also want to run the tests to be sure proton-c functions correctly.
>> >> > > For this reason I question the usefulness of a proton-c tarball.  I
>> >> > > would want a tarball that included the whole tree including the
>> tests.
>> >> > >
>> >> >
>> >> > The proton-c tarball does include the tests directory. The tests
>> >> directory
>> >> > is just pure python code, so once you've installed proton-c onto your
>> >> > system, you can run any of the proton tests just like you would run
>> any
>> >> > normal python script. As I mentioned in another post, the inclusion
>> of
>> >> > tests under both proton-c and proton-j is the one deviation in
>> directory
>> >> > structure from a pure svn export, and even this much is kind of a
>> pain as
>> >> > there is no way for the README to actually describe things properly
>> >> without
>> >> > being broken in either the svn tree or in the release artifact.
>> >> >
>> >> >
>> >> So, if we want to keep the source and the svn the same, wouldn't it
>> make
>> >> more sense for the release tarballs to actually just be strict subsets
>> of
>> >> the proton tree?  that is the proton-j tarball would be
>> >>
>> >> proton
>> >>   |
>> >>   +-- proton-j
>> >>   |
>> >>   +-- tests
>> >>
>> >> and the proton-c tarball would be
>> >>
>> >> proton
>> >>   |
>> >>   +-- proton-c
>> >>   |
>> >>   +-- tests
>> >>
>> >>
>> >> If we wanted to avoid a copy of the java API in the build then we could
>> >> then actually just reference ../proton-j and include that subset of
>> >> proton-j in the release source tarball?
>> >>
>> >
>> >  I'm not sure that helps with the README situation. You now effectively
>> > have 3 views of the same tree, and it would be difficult to imagine
>> writing
>> > a single README that would make sense in every case. You'd pretty much
>> have
>> > to structure each sub directory as a standalone tree and each would need
>> > their own READMEs.
>>
>> Why - you build from the top level.  The only thing that is different
>> is that you'd probably have a README_JAVA and a README_C and you'd
>> exclude the one you didn't want
>>
>> -- Rob
>>
>> > As is, the C tarball basically works on the assumption
>> > that the tests directory is a standalone tree which it simply includes
>> to
>> > eliminate the extra step of downloading it, so it's similar in that
>> respect
>> > but avoids the extra depth.
>> >
>> > --Rafael
>>
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Phil Harvey <ph...@philharveyonline.com>.
In case anyone thinks World War 3 is about to break out, an approach
emerged during the Proton task force meeting today that is probably
acceptable to all the main protagonists.

A brief summary follows. I've tried to avoid too many value judgements in
the summary, to avoid fanning the embers.

- We'll add a cmake file at the top level of the proton project, which will
be able to build all of the Java and C code in one pass. This will be
useful both for building the JNI bindings without undue contortions, and
for building the entire project in Maven-less environments. The Maven pom
files will remain, for use in more mainstream Java deployments.

- No duplication of the proton-api code.

- The source tarball will in the future be generated from the top level,
rather than from the proton-c folder. This avoids the issues previously
discussed whereby JNI bindings can't be built because proton-api is
"missing" from the tarball. The new top level cmake file will mean that the
commands required to build proton-c will be unchanged.

I'll write up some proper notes this evening if I get a chance, and will
update the wiki.

Going back to Rajith's point earlier in this discussion, it is of course
important that decisions like this are debated and agreed in public, so
rest assured this will all be written up and Jira'd properly so that
everyone has the opportunity to comment.

Phil
On Jan 24, 2013 3:01 PM, "Rob Godfrey" <ro...@gmail.com> wrote:

> On 24 January 2013 15:49, Rafael Schloming <rh...@alum.mit.edu> wrote:
> > On Wed, Jan 23, 2013 at 6:44 PM, Rob Godfrey <rob.j.godfrey@gmail.com
> >wrote:
> >
> >> Firstly I think it would be helpful if you made clear the requirements
> you
> >> consider to be essential, nice to have,  unimportant and/or detrimental.
> >>
> >> On 23 January 2013 20:17, Rafael Schloming <rh...@alum.mit.edu> wrote:
> >>
> >> > On Wed, Jan 23, 2013 at 8:01 AM, Keith W <ke...@gmail.com>
> wrote:
> >> >
>
> [snip]
>
> >> > Given the above workflow, it seems like even with a relatively small
> >> change
> >> > like adding a getter, the scripted portion of the syncing effort is
> going
> >> > to be vanishingly small compared to the manual process of syncing the
> >> > implementations. Perhaps I'm just envisioning a different workflow
> than
> >> > you, or maybe I'm missing some important scenarios. Could you describe
> >> what
> >> > workflow(s) you envision and how the sync process would impact your
> >> > productivity?
> >> >
> >> >
> >> I differ strongly in my opinion here. Every time I need to drop out of
> my
> >> development environment to run some ad-hoc script then there is
> overhead...
> >>
> >
> > Don't you end up dropping out of your development environment anyways in
> > order to build the C code?
> >
>
> No - presuming I have built the C at some point in the past and there
> is no change to the SWIG part, then I don't need to drop out of my
> environment at all to do a build / test.
>
> >
> >> Moreover if we are using svn to do this I presume we would be having to
> >> check in any change before the sync could be made. This means that every
> >> edit to a file now has to be followed by commit and sync (which would
> >> obviously be an insane process).  Those of us behind corporate firewalls
> >> and proxies experience very degraded response times when updating from
> the
> >> Apache repository.
> >>
> >
> > I wasn't suggesting syncing via the repository, I was suggesting syncing
> > the directories in the local checkout, i.e. effectively have a script
> that
> > just does 'cp -r ../proton-j/<blah...>/src bindings/<blah...>/src', but
> > obviously with some logic in there to exclude .svn and the like and to
> work
> > regardless of what directory you're in. You could even as you suggest
> below
> > make the sync happen automatically from the build.
> >
>
> And which runs on Windows too? :-)
>
> Really this is a non-starter for me.  I haven't seen anything to
> change my mind that actively having the source in two places in the
> tree is insane.  It is far more sensible IMHO to simply bow to the
> simple fact that both the C and the Java builds do depend on a shared
> definition of the API.
>
> The same issue actually exists right now in that both depend on the
> Python API ... just that that is not actually formally defined
> anywhere.
>
> >
> >>
> >> Frankly I have doubts that any such sync script could be produced that
> >> would work across all the different environments that developers may
> work
> >> in.  Mostly though I remain totally unconvinced that there is a
> compelling
> >> reason to do this. Rather than having a "sync" script to keep two
> copies in
> >> svn... why not just test if in a svn directory and ../proton-j is
> present,
> >> then copy ../proton-j/proton-api.  Then in the release process just copy
> >> the proton-j/proton-api source into the relevant place in the tarball?
> >>
> >
> > That's really not that different from what I'm proposing, modulo whether
> > you do the copy automatically or not, the only real difference is whether
> > you check in the copy or whether it just sits in your checkout during
> > development and the release script does the copying. My issue with the
> > release script doing the copying is that it makes the mapping from
> release
> > tarball back to svn significantly less transparent. I know as a user of
> > other open source projects I sometimes find myself wanting to find the
> svn
> > source for a file included in a given release, I usually do this via
> google
> > and look at the thing via viewvc. It would be quite confusing to be
> > browsing the release branch and have it be that different from what is in
> > the source tarball. I'd also worry about someone accidentally checking
> in a
> > copy, something that happens fairly frequently with other files that are
> > perpetually locally modified. As you can imagine that would make kind of
> a
> > mess.
>
> So instead of doing a copy let's just have the source code in
> one place in the repo, and have the source tarball include that
> directory (both the C and the Java tarballs) the same as they have to
> do with the tests.  Moreover let's not fake a new directory structure,
> let's just tar up from a higher level but exclude the bits that we
> don't want for the C / Java tar respectively.
>
> >
> >
> >
> >>
> >>
> >>
> >> >
> >> > > 4. To switch to a particular SVN revision, simple SVN commands are
> run
> >> > > (e.g. svn switch or svn update)
> >> > > - Developer productivity
> >> > >
> >> > > 5. proton-c can be built, excluding its JNI binding, without
> requiring
> >> > > non-standard tools*
> >> > > 6. proton-c can be built, excluding its JNI binding, from a
> standalone
> >> > > checkout of the proton-c directory
> >> > > - Developer productivity / tool familiarity
> >> > >
> >> > > Neutral
> >> > >
> >> > > 1. A "tarball" source release of proton-c can be built by a user
> >> > > without an external dependency on any other part of proton, e.g.
> >> > > proton-api.
> >> > > 2. The aforementioned proton-c tarball release can be produced by
> >> > > performing a simple "svn export" of proton-c.
> >> > > - If I were building proton-c for my platform for tarball, I would
> >> > > also want to run the tests to be sure proton-c functions correctly.
> >> > > For this reason I question the usefulness of a proton-c tarball.  I
> >> > > would want a tarball that included the whole tree including the
> tests.
> >> > >
> >> >
> >> > The proton-c tarball does include the tests directory. The tests
> >> directory
> >> > is just pure python code, so once you've installed proton-c onto your
> >> > system, you can run any of the proton tests just like you would run
> any
> >> > normal python script. As I mentioned in another post, the inclusion of
> >> > tests under both proton-c and proton-j is the one deviation in
> directory
> >> > structure from a pure svn export, and even this much is kind of a pain
> as
> >> > there is no way for the README to actually describe things properly
> >> without
> >> > being broken in either the svn tree or in the release artifact.
> >> >
> >> >
> >> So, if we want to keep the source and the svn the same, wouldn't it make
> >> more sense for the release tarballs to actually just be strict subsets
> of
> >> the proton tree?  that is the proton-j tarball would be
> >>
> >> proton
> >>   |
> >>   +-- proton-j
> >>   |
> >>   +-- tests
> >>
> >> and the proton-c tarball would be
> >>
> >> proton
> >>   |
> >>   +-- proton-c
> >>   |
> >>   +-- tests
> >>
> >>
> >> If we wanted to avoid a copy of the java API in the build then we could
> >> then actually just reference ../proton-j and include that subset of
> >> proton-j in the release source tarball?
> >>
> >
> >  I'm not sure that helps with the README situation. You now effectively
> > have 3 views of the same tree, and it would be difficult to imagine
> writing
> > a single README that would make sense in every case. You'd pretty much
> have
> > to structure each sub directory as a standalone tree and each would need
> > their own READMEs.
>
> Why - you build from the top level.  The only thing that is different
> is that you'd probably have a README_JAVA and a README_C and you'd
> exclude the one you didn't want
>
> -- Rob
>
> > As is, the C tarball basically works on the assumption
> > that the tests directory is a standalone tree which it simply includes to
> > eliminate the extra step of downloading it, so it's similar in that
> respect
> > but avoids the extra depth.
> >
> > --Rafael
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rob Godfrey <ro...@gmail.com>.
On 24 January 2013 15:49, Rafael Schloming <rh...@alum.mit.edu> wrote:
> On Wed, Jan 23, 2013 at 6:44 PM, Rob Godfrey <ro...@gmail.com>wrote:
>
>> Firstly I think it would be helpful if you made clear the requirements you
>> consider to be essential, nice to have,  unimportant and/or detrimental.
>>
>> On 23 January 2013 20:17, Rafael Schloming <rh...@alum.mit.edu> wrote:
>>
>> > On Wed, Jan 23, 2013 at 8:01 AM, Keith W <ke...@gmail.com> wrote:
>> >

[snip]

>> > Given the above workflow, it seems like even with a relatively small
>> change
>> > like adding a getter, the scripted portion of the syncing effort is going
>> > to be vanishingly small compared to the manual process of syncing the
>> > implementations. Perhaps I'm just envisioning a different workflow than
>> > you, or maybe I'm missing some important scenarios. Could you describe
>> what
>> > workflow(s) you envision and how the sync process would impact your
>> > productivity?
>> >
>> >
>> I differ strongly in my opinion here. Every time I need to drop out of my
>> development environment to run some ad-hoc script then there is overhead...
>>
>
> Don't you end up dropping out of your development environment anyways in
> order to build the C code?
>

No - presuming I have built the C at some point in the past and there
is no change to the SWIG part, then I don't need to drop out of my
environment at all to do a build / test.

>
>> Moreover if we are using svn to do this I presume we would be having to
>> check in any change before the sync could be made. This means that every
>> edit to a file now has to be followed by commit and sync (which would
>> obviously be an insane process).  Those of us behind corporate firewalls
>> and proxies experience very degraded response times when updating from the
>> Apache repository.
>>
>
> I wasn't suggesting syncing via the repository, I was suggesting syncing
> the directories in the local checkout, i.e. effectively have a script that
> just does 'cp -r ../proton-j/<blah...>/src bindings/<blah...>/src', but
> obviously with some logic in there to exclude .svn and the like and to work
> regardless of what directory you're in. You could even as you suggest below
> make the sync happen automatically from the build.
>

And which runs on Windows too? :-)

Really this is a non-starter for me.  I haven't seen anything to
change my mind that actively having the source in two places in the
tree is insane.  It is far more sensible IMHO to simply bow to the
simple fact that both the C and the Java builds do depend on a shared
definition of the API.

The same issue actually exists right now in that both depend on the
Python API ... just that that is not actually formally defined
anywhere.
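
For reference, the copy-with-exclusions step quoted above is only a few
lines in a portable scripting language. The paths and function name below
are invented for illustration; this is a sketch of the proposal being
debated, not actual proton tooling.

```python
# Illustrative sketch of the proposed local "sync" step; src/dst paths
# are hypothetical, not the real proton tree layout.
import os
import shutil

def sync_api(src, dst):
    """Copy the shared Java API source tree, skipping .svn metadata."""
    if os.path.exists(dst):
        shutil.rmtree(dst)  # replace any stale copy wholesale
    # ignore_patterns keeps .svn directories out of the copy
    shutil.copytree(src, dst, ignore=shutil.ignore_patterns(".svn"))
```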

>
>>
>> Frankly I have doubts that any such sync script could be produced that
>> would work across all the different environments that developers may work
>> in.  Mostly though I remain totally unconvinced that there is a compelling
>> reason to do this. Rather than having a "sync" script to keep two copies in
>> svn... why not just test if in a svn directory and ../proton-j is present,
>> then copy ../proton-j/proton-api.  Then in the release process just copy
>> the proton-j/proton-api source into the relevant place in the tarball?
>>
>
> That's really not that different from what I'm proposing, modulo whether
> you do the copy automatically or not, the only real difference is whether
> you check in the copy or whether it just sits in your checkout during
> development and the release script does the copying. My issue with the
> release script doing the copying is that it makes the mapping from release
> tarball back to svn significantly less transparent. I know as a user of
> other open source projects I sometimes find myself wanting to find the svn
> source for a file included in a given release, I usually do this via google
> and look at the thing via viewvc. It would be quite confusing to be
> browsing the release branch and have it be that different from what is in
> the source tarball. I'd also worry about someone accidentally checking in a
> copy, something that happens fairly frequently with other files that are
> perpetually locally modified. As you can imagine that would make kind of a
> mess.

So instead of doing a copy let's just have the source code in
one place in the repo, and have the source tarball include that
directory (both the C and the Java tarballs) the same as they have to
do with the tests.  Moreover let's not fake a new directory structure,
let's just tar up from a higher level but exclude the bits that we
don't want for the C / Java tar respectively.
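
The "tar up from a higher level with excludes" idea can be sketched as
below. The miniature tree and the exclude pattern are illustrative
assumptions about how a real C-only tarball might be cut, using GNU tar's
--exclude option.

```shell
# Illustrative only: lay out a miniature tree, then cut a C-only tarball
# as a strict subset of it by excluding the Java-only directory.
mkdir -p proton/proton-c proton/proton-j proton/tests
touch proton/proton-c/README proton/proton-j/Api.java proton/tests/proton_tests.py
tar czf proton-c.tar.gz --exclude='proton/proton-j' proton
```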

>
>
>
>>
>>
>>
>> >
>> > > 4. To switch to a particular SVN revision, simple SVN commands are run
>> > > (e.g. svn switch or svn update)
>> > > - Developer productivity
>> > >
>> > > 5. proton-c can be built, excluding its JNI binding, without requiring
>> > > non-standard tools*
>> > > 6. proton-c can be built, excluding its JNI binding, from a standalone
>> > > checkout of the proton-c directory
>> > > - Developer productivity / tool familiarity
>> > >
>> > > Neutral
>> > >
>> > > 1. A "tarball" source release of proton-c can be built by a user
>> > > without an external dependency on any other part of proton, e.g.
>> > > proton-api.
>> > > 2. The aforementioned proton-c tarball release can be produced by
>> > > performing a simple "svn export" of proton-c.
>> > > - If I were building proton-c for my platform for tarball, I would
>> > > also want to run the tests to be sure proton-c functions correctly.
>> > > For this reason I question the usefulness of a proton-c tarball.  I
>> > > would want a tarball that included the whole tree including the tests.
>> > >
>> >
>> > The proton-c tarball does include the tests directory. The tests
>> directory
>> > is just pure python code, so once you've installed proton-c onto your
>> > system, you can run any of the proton tests just like you would run any
>> > normal python script. As I mentioned in another post, the inclusion of
>> > tests under both proton-c and proton-j is the one deviation in directory
>> > structure from a pure svn export, and even this much is kind of a pain as
>> > there is no way for the README to actually describe things properly
>> without
>> > being broken in either the svn tree or in the release artifact.
>> >
>> >
>> So, if we want to keep the source and the svn the same, wouldn't it make
>> more sense for the release tarballs to actually just be strict subsets of
>> the proton tree?  that is the proton-j tarball would be
>>
>> proton
>>   |
>>   +-- proton-j
>>   |
>>   +-- tests
>>
>> and the proton-c tarball would be
>>
>> proton
>>   |
>>   +-- proton-c
>>   |
>>   +-- tests
>>
>>
>> If we wanted to avoid a copy of the java API in the build then we could
>> then actually just reference ../proton-j and include that subset of
>> proton-j in the release source tarball?
>>
>
>  I'm not sure that helps with the README situation. You now effectively
> have 3 views of the same tree, and it would be difficult to imagine writing
> a single README that would make sense in every case. You'd pretty much have
> to structure each sub directory as a standalone tree and each would need
> their own READMEs.

Why - you build from the top level.  The only thing that is different
is that you'd probably have a README_JAVA and a README_C and you'd
exclude the one you didn't want

-- Rob

>As is, the C tarball basically works on the assumption
> that the tests directory is a standalone tree which it simply includes to
> eliminate the extra step of downloading it, so it's similar in that respect
> but avoids the extra depth.
>
> --Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Wed, Jan 23, 2013 at 6:44 PM, Rob Godfrey <ro...@gmail.com>wrote:

> Firstly I think it would be helpful if you made clear the requirements you
> consider to be essential, nice to have,  unimportant and/or detrimental.
>
> On 23 January 2013 20:17, Rafael Schloming <rh...@alum.mit.edu> wrote:
>
> > On Wed, Jan 23, 2013 at 8:01 AM, Keith W <ke...@gmail.com> wrote:
> >
> > > Essential
> > >
> > > 3. To change proton-api, all that is required is to edit a Java file.
> > > - Developer productivity
> > >
> >
> > This seems to be kind of a leading requirement so to speak, or at least
> > it's phrased a little bit oddly. That said I would never argue with it
> for
> > most of the Java files, however in the case of the API files I don't see
> > how you're ever going to be able to stop after just editing the API.
> > Because we have two implementations, we're fundamentally stuck with
> > manually syncing the implementations themselves whenever a change to the
> > interface occurs. By comparison the highly automatable task of syncing
> the
> > API files themselves seems quite small. I'm imagining most changes would
> go
> > something like this, say we want to add a getter to the Message
> interface,
> > we would need to:
> >
> >
> I think it's worth considering two different cases
>
> 1) The API change is purely on the Java side... there is no corresponding
> change to the C API.  This may be to add some sort of convenience method,
> or simply a refactoring.
>
> In this case the developer making the change needs only to work in Java,
> there will be two implementations of the interface to change (in two
> different source locations) but it is all rather trivial.
>

Is this actually possible? Wouldn't you at least need to build/run the C so
that you know there is actually no impact on the C impl? Even if you're
just calling it differently it could tickle a bug.

> 2) The API change affects both C and Java.
>
> In this case either a single developer has to commit to making the change
> in both the C and the Java, or the API change has to have been discussed
> before work commences and Java and C developers will need to work
> together.  If there is a single developer or developers working very
> closely together then I would suggest that the steps would in fact be:
>
>   1. edit the Message interface /  edit the message.h file
>   2. write and/or modify a test (and Python binding if necessary)
>   3. edit the JNI binding to use the SWIG generated API
>   4. edit the C / Pure Java
>   5. run the tests against the C / Java
>   (6. modify other bindings if necessary)
>
>   repeat steps 4 and 5 until they pass.
>
> In the case where the C and Java developers are separated by time/distance
> then the build / tests on one side will be broken until the implementation
> catches up.  For the sake of politeness it is probably better to ensure
> that at all points the checked in code compiles even if the tests do not
> pass.  For cases where the changes to the API are additions then it should
> be relatively easy to make the changes in such a way as to simply have any
> tests relating to the new API be skipped. For cases where the C leads the
> Java, the Java implementation can simply throw
> UnsupportedOperationException or some such.  Where the Java leads the C we
> can throw said exception from the JNI binding code and leave the .h file
> unchanged until the C developer is ready to do the work.
>
> Only for cases where there is modification to existing APIs does it seem
> that there may be occasions where we could not have a consistent build
> across components, and I would strongly recommend that any change where the
> Java and C are being worked on in such a fashion should take place on a
> branch, with a merge to trunk only occurring when all tests are passing
> against all implementations.
>
>
> >   1. edit the Message interface
> >   2. write and/or possibly modify a test
> >   3. edit the java Message implementation
> >   4. run the tests against java, if they don't pass go to step 2
> >   5. now that the java impl passes the tests, run the tests against the C
> > impl
> >   6. if the sync check fails on the C build, run the sync script
> >   7. edit the message.h file
> >   8. edit the message.c implementation
> >   9. edit the adapter layer between the C API and the Java interfaces
> >   10. run the tests against the C, if they don't pass go to step 8
> >   11. run the tests against both, just to be sure
> >   12. check in
> >
> > Given the above workflow, it seems like even with a relatively small
> change
> > like adding a getter, the scripted portion of the syncing effort is going
> > to be vanishingly small compared to the manual process of syncing the
> > implementations. Perhaps I'm just envisioning a different workflow than
> > you, or maybe I'm missing some important scenarios. Could you describe
> what
> > workflow(s) you envision and how the sync process would impact your
> > productivity?
> >
> >
> I differ strongly in my opinion here. Every time I need to drop out of my
> development environment to run some ad-hoc script then there is overhead...
>

Don't you end up dropping out of your development environment anyways in
order to build the C code?


> Moreover if we are using svn to do this I presume we would be having to
> check in any change before the sync could be made. This means that every
> edit to a file now has to be followed by commit and sync (which would
> obviously be an insane process).  Those of us behind corporate firewalls
> and proxies experience very degraded response times when updating from the
> Apache repository.
>

I wasn't suggesting syncing via the repository, I was suggesting syncing
the directories in the local checkout, i.e. effectively have a script that
just does 'cp -r ../proton-j/<blah...>/src bindings/<blah...>/src', but
obviously with some logic in there to exclude .svn and the like and to work
regardless of what directory you're in. You could even as you suggest below
make the sync happen automatically from the build.
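As a rough illustration of that local-checkout sync (the real source and destination paths under proton-j and the bindings tree were elided above, so they are taken as arguments here; this is a sketch, not the proposed script):

```shell
#!/bin/sh
# Sketch of the local-checkout sync described above: mirror the shared
# Java API source from the sibling proton-j tree into the bindings tree,
# keeping svn metadata out of the copy. Paths are caller-supplied
# placeholders since the thread leaves the exact layout unspecified.
sync_api() {
    src="$1"   # e.g. the proton-api source dir under ../proton-j
    dst="$2"   # the matching dir under the bindings tree
    # Recreating the destination gives rsync-style --delete semantics:
    # files removed upstream do not linger in the copy.
    rm -rf "$dst"
    mkdir -p "$dst"
    cp -R "$src"/. "$dst"/
    # Drop any svn metadata that came along with the copy.
    find "$dst" -name '.svn' -type d -prune -exec rm -rf {} +
}
```

Hooking a function like this into the build, as suggested, would make the sync invisible in day-to-day development.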


>
> Frankly I have doubts that any such sync script could be produced that
> would work across all the different environments that developers may work
> in.  Mostly though I remain totally unconvinced that there is a compelling
> reason to do this. Rather than having a "sync" script to keep two copies in
> svn... why not just test if in a svn directory and ../proton-j is present,
> then copy ../proton-j/proton-api.  Then in the release process just copy
> the proton-j/proton-api source into the relevant place in the tarball?
>

That's really not that different from what I'm proposing, modulo whether
you do the copy automatically or not; the only real difference is whether
you check in the copy or whether it just sits in your checkout during
development and the release script does the copying. My issue with the
release script doing the copying is that it makes the mapping from release
tarball back to svn significantly less transparent. I know as a user of
other open source projects I sometimes find myself wanting to find the svn
source for a file included in a given release, I usually do this via google
and look at the thing via viewvc. It would be quite confusing to be
browsing the release branch and have it be that different from what is in
the source tarball. I'd also worry about someone accidentally checking in a
copy, something that happens fairly frequently with other files that are
perpetually locally modified. As you can imagine that would make kind of a
mess.



>
>
>
> >
> > > 4. To switch to a particular SVN revision, simple SVN commands are run
> > > (e.g. svn switch or svn update)
> > > - Developer productivity
> > >
> > > 5. proton-c can be built, excluding its JNI binding, without requiring
> > > non-standard tools*
> > > 6. proton-c can be built, excluding its JNI binding, from a standalone
> > > checkout of the proton-c directory
> > > - Developer productivity / tool familiarity
> > >
> > > Neutral
> > >
> > > 1. A "tarball" source release of proton-c can be built by a user
> > > without an external dependency on any other part of proton, e.g.
> > > proton-api.
> > > 2. The aforementioned proton-c tarball release can be produced by
> > > performing a simple "svn export" of proton-c.
> > > - If I were building proton-c for my platform for tarball, I would
> > > also want to run the tests to be sure proton-c functions correctly.
> > > For this reason I question the usefulness of a proton-c tarball.  I
> > > would want a tarball that included the whole tree including the tests.
> > >
> >
> > The proton-c tarball does include the tests directory. The tests
> directory
> > is just pure python code, so once you've installed proton-c onto your
> > system, you can run any of the proton tests just like you would run any
> > normal python script. As I mentioned in another post, the inclusion of
> > tests under both proton-c and proton-j is the one deviation in directory
> > structure from a pure svn export, and even this much is kind of a pain as
> > there is no way for the README to actually describe things properly
> without
> > being broken in either the svn tree or in the release artifact.
> >
> >
> So, if we want to keep the source and the svn the same, wouldn't it make
> more sense for the release tarballs to actually just be strict subsets of
> the proton tree?  that is the proton-j tarball would be
>
> proton
>   |
>   +-- proton-j
>   |
>   +-- tests
>
> and the proton-c tarball would be
>
> proton
>   |
>   +-- proton-c
>   |
>   +-- tests
>
>
> If we wanted to avoid a copy of the java API in the build then we could
> then actually just reference ../proton-j and include that subset of
> proton-j in the release source tarball?
>

 I'm not sure that helps with the README situation. You now effectively
have 3 views of the same tree, and it would be difficult to imagine writing
a single README that would make sense in every case. You'd pretty much have
to structure each sub directory as a standalone tree and each would need
their own READMEs. As is, the C tarball basically works on the assumption
that the tests directory is a standalone tree which it simply includes to
eliminate the extra step of downloading it, so it's similar in that respect
but avoids the extra depth.

--Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rob Godfrey <ro...@gmail.com>.
Firstly I think it would be helpful if you made clear the requirements you
consider to be essential, nice to have,  unimportant and/or detrimental.

On 23 January 2013 20:17, Rafael Schloming <rh...@alum.mit.edu> wrote:

> On Wed, Jan 23, 2013 at 8:01 AM, Keith W <ke...@gmail.com> wrote:
>
> > Essential
> >
> > 3. To change proton-api, all that is required is to edit a Java file.
> > - Developer productivity
> >
>
> This seems to be kind of a leading requirement so to speak, or at least
> it's phrased a little bit oddly. That said I would never argue with it for
> most of the Java files, however in the case of the API files I don't see
> how you're ever going to be able to stop after just editing the API.
> Because we have two implementations, we're fundamentally stuck with
> manually syncing the implementations themselves whenever a change to the
> interface occurs. By comparison the highly automatable task of syncing the
> API files themselves seems quite small. I'm imagining most changes would go
> something like this, say we want to add a getter to the Message interface,
> we would need to:
>
>
I think it's worth considering two different cases

1) The API change is purely on the Java side... there is no corresponding
change to the C API.  This may be to add some sort of convenience method,
or simply a refactoring.

In this case the developer making the change needs only to work in Java,
there will be two implementations of the interface to change (in two
different source locations) but it is all rather trivial.

2) The API change affects both C and Java.

In this case either a single developer has to commit to making the change
in both the C and the Java, or the API change has to have been discussed
before work commences and Java and C developers will need to work
together.  If there is a single developer or developers working very
closely together then I would suggest that the steps would in fact be:

  1. edit the Message interface /  edit the message.h file
  2. write and/or modify a test (and Python binding if necessary)
  3. edit the JNI binding to use the SWIG generated API
  4. edit the C / Pure Java
  5. run the tests against the C / Java
  (6. modify other bindings if necessary)

  repeat steps 4 and 5 until they pass.

In the case where the C and Java developers are separated by time/distance
then the build / tests on one side will be broken until the implementation
catches up.  For the sake of politeness it is probably better to ensure
that at all points the checked in code compiles even if the tests do not
pass.  For cases where the changes to the API are additions then it should
be relatively easy to make the changes in such a way as to simply have any
tests relating to the new API be skipped. For cases where the C leads the
Java, the Java implementation can simply throw
UnsupportedOperationException or some such.  Where the Java leads the C we
can throw said exception from the JNI binding code and leave the .h file
unchanged until the C developer is ready to do the work.

Only for cases where there is modification to existing APIs does it seem
that there may be occasions where we could not have a consistent build
across components, and I would strongly recommend that any change where the
Java and C are being worked on in such a fashion should take place on a
branch, with a merge to trunk only occurring when all tests are passing
against all implementations.


>   1. edit the Message interface
>   2. write and/or possibly modify a test
>   3. edit the java Message implementation
>   4. run the tests against java, if they don't pass go to step 2
>   5. now that the java impl passes the tests, run the tests against the C
> impl
>   6. if the sync check fails on the C build, run the sync script
>   7. edit the message.h file
>   8. edit the message.c implementation
>   9. edit the adapter layer between the C API and the Java interfaces
>   10. run the tests against the C, if they don't pass go to step 8
>   11. run the tests against both, just to be sure
>   12. check in
>
> Given the above workflow, it seems like even with a relatively small change
> like adding a getter, the scripted portion of the syncing effort is going
> to be vanishingly small compared to the manual process of syncing the
> implementations. Perhaps I'm just envisioning a different workflow than
> you, or maybe I'm missing some important scenarios. Could you describe what
> workflow(s) you envision and how the sync process would impact your
> productivity?
>
>
I differ strongly in my opinion here. Every time I need to drop out of my
development environment to run some ad-hoc script then there is overhead...
Moreover if we are using svn to do this I presume we would have to
check in any change before the sync could be made. This means that every
edit to a file now has to be followed by commit and sync (which would
obviously be an insane process).  Those of us behind corporate firewalls
and proxies experience very degraded response times when updating from the
Apache repository.

Frankly I have doubts that any such sync script could be produced that
would work across all the different environments that developers may work
in.  Mostly though I remain totally unconvinced that there is a compelling
reason to do this. Rather than having a "sync" script to keep two copies in
svn... why not just test if we are in an svn directory and ../proton-j is present,
then copy ../proton-j/proton-api.  Then in the release process just copy
the proton-j/proton-api source into the relevant place in the tarball?
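That conditional copy might look something like the sketch below. The thread only names ../proton-j/proton-api, so the destination directory is a hypothetical argument, and the top-level-checkout framing is an assumption:

```shell
#!/bin/sh
# Sketch of the suggestion above: only when building from an svn checkout
# that has a sibling proton-j tree, copy proton-api into place; a release
# tarball would already ship the source where the build expects it.
maybe_copy_api() {
    root="$1"   # top-level checkout directory
    dst="$2"    # where the java binding build expects proton-api
    if [ -d "$root/.svn" ] && [ -d "$root/proton-j/proton-api" ]; then
        mkdir -p "$dst"
        cp -R "$root/proton-j/proton-api/." "$dst/"
        echo copied
    else
        # Not an svn checkout (or no proton-j): assume the release
        # process has already laid the source down.
        echo skipped
    fi
}
```

The "skipped" branch is what keeps a standalone tarball build working without any dependency on the rest of the tree.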




>
> > 4. To switch to a particular SVN revision, simple SVN commands are run
> > (e.g. svn switch or svn update)
> > - Developer productivity
> >
> > 5. proton-c can be built, excluding its JNI binding, without requiring
> > non-standard tools*
> > 6. proton-c can be built, excluding its JNI binding, from a standalone
> > checkout of the proton-c directory
> > - Developer productivity / tool familiarity
> >
> > Neutral
> >
> > 1. A "tarball" source release of proton-c can be built by a user
> > without an external dependency on any other part of proton, e.g.
> > proton-api.
> > 2. The aforementioned proton-c tarball release can be produced by
> > performing a simple "svn export" of proton-c.
> > - If I were building proton-c for my platform for tarball, I would
> > also want to run the tests to be sure proton-c functions correctly.
> > For this reason I question the usefulness of a proton-c tarball.  I
> > would want a tarball that included the whole tree including the tests.
> >
>
> The proton-c tarball does include the tests directory. The tests directory
> is just pure python code, so once you've installed proton-c onto your
> system, you can run any of the proton tests just like you would run any
> normal python script. As I mentioned in another post, the inclusion of
> tests under both proton-c and proton-j is the one deviation in directory
> structure from a pure svn export, and even this much is kind of a pain as
> there is no way for the README to actually describe things properly without
> being broken in either the svn tree or in the release artifact.
>
>
So, if we want to keep the source and the svn the same, wouldn't it make
more sense for the release tarballs to actually just be strict subsets of
the proton tree?  that is the proton-j tarball would be

proton
  |
  +-- proton-j
  |
  +-- tests

and the proton-c tarball would be

proton
  |
  +-- proton-c
  |
  +-- tests


If we wanted to avoid a copy of the java API in the build then we could
then actually just reference ../proton-j and include that subset of
proton-j in the release source tarball?

-- Rob



>
> >
> > 7. Proton-c can be built without requiring non-standard tools*
> > 9. Proton-c can be tested without requiring non-standard tools*
> >  - If we can achieve this without introducing too much complexity,
> > reinventing too many wheels and the result is portable across all
> > target platforms.
> >
> > Detrimental
> >
> > 8. proton-c can be built from a standalone checkout of the proton-c
> > directory
> >  - I think that all proton developers who are changing either the C or
> > Java implementations should be running the system tests before each
> > commit.  If they are changing system tests then they need to run
> > against both implementations before each commit.
> >
>
> Doesn't this conflict pretty directly with 6?
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Wed, Jan 23, 2013 at 8:01 AM, Keith W <ke...@gmail.com> wrote:

> Essential
>
> 3. To change proton-api, all that is required is to edit a Java file.
> - Developer productivity
>

This seems to be kind of a leading requirement so to speak, or at least
it's phrased a little bit oddly. That said I would never argue with it for
most of the Java files, however in the case of the API files I don't see
how you're ever going to be able to stop after just editing the API.
Because we have two implementations, we're fundamentally stuck with
manually syncing the implementations themselves whenever a change to the
interface occurs. By comparison the highly automatable task of syncing the
API files themselves seems quite small. I'm imagining most changes would go
something like this, say we want to add a getter to the Message interface,
we would need to:

  1. edit the Message interface
  2. write and/or possibly modify a test
  3. edit the java Message implementation
  4. run the tests against java, if they don't pass go to step 2
  5. now that the java impl passes the tests, run the tests against the C
impl
  6. if the sync check fails on the C build, run the sync script
  7. edit the message.h file
  8. edit the message.c implementation
  9. edit the adapter layer between the C API and the Java interfaces
  10. run the tests against the C, if they don't pass go to step 8
  11. run the tests against both, just to be sure
  12. check in

Given the above workflow, it seems like even with a relatively small change
like adding a getter, the scripted portion of the syncing effort is going
to be vanishingly small compared to the manual process of syncing the
implementations. Perhaps I'm just envisioning a different workflow than
you, or maybe I'm missing some important scenarios. Could you describe what
workflow(s) you envision and how the sync process would impact your
productivity?


> 4. To switch to a particular SVN revision, simple SVN commands are run
> (e.g. svn switch or svn update)
> - Developer productivity
>
> 5. proton-c can be built, excluding its JNI binding, without requiring
> non-standard tools*
> 6. proton-c can be built, excluding its JNI binding, from a standalone
> checkout of the proton-c directory
> - Developer productivity / tool familiarity
>
> Neutral
>
> 1. A "tarball" source release of proton-c can be built by a user
> without an external dependency on any other part of proton, e.g.
> proton-api.
> 2. The aforementioned proton-c tarball release can be produced by
> performing a simple "svn export" of proton-c.
> - If I were building proton-c for my platform for tarball, I would
> also want to run the tests to be sure proton-c functions correctly.
> For this reason I question the usefulness of a proton-c tarball.  I
> would want a tarball that included the whole tree including the tests.
>

The proton-c tarball does include the tests directory. The tests directory
is just pure python code, so once you've installed proton-c onto your
system, you can run any of the proton tests just like you would run any
normal python script. As I mentioned in another post, the inclusion of
tests under both proton-c and proton-j is the one deviation in directory
structure from a pure svn export, and even this much is kind of a pain as
there is no way for the README to actually describe things properly without
being broken in either the svn tree or in the release artifact.


>
> 7. Proton-c can be built without requiring non-standard tools*
> 9. Proton-c can be tested without requiring non-standard tools*
>  - If we can achieve this without introducing too much complexity,
> reinventing too many wheels and the result is portable across all
> target platforms.
>
> Detrimental
>
> 8. proton-c can be built from a standalone checkout of the proton-c
> directory
>  - I think that all proton developers who are changing either the C or
> Java implementations should be running the system tests before each
> commit.  If they are changing system tests then they need to run
> against both implementations before each commit.
>

Doesn't this conflict pretty directly with 6?

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rob Godfrey <ro...@gmail.com>.
On 24 January 2013 14:43, Rafael Schloming <rh...@alum.mit.edu> wrote:
> On Wed, Jan 23, 2013 at 6:10 PM, Rob Godfrey <ro...@gmail.com>wrote:
>
>> On 23 January 2013 19:09, Rafael Schloming <rh...@alum.mit.edu> wrote:
>>
>> > I've added another wiki page that documents the proton release steps as
>> > best I can remember. I'll updated it more during the 0.4 release:
>> > https://cwiki.apache.org/confluence/display/qpid/Proton+Release+Steps
>> >
>> > I think it's important to understand the overall release and testing
>> > process as it is a significant and perhaps underrepresented factor
>> against
>> > which to measure any proposals. I believe the build system requirements
>> > documented below are inherently incomplete as they don't recognize the
>> fact
>> > that the C build system is not just a developer productivity tool, it is
>> > also the installer for our end users. And before anyone says our end
>> users
>> > will just use yum or equivalents, all those packaging tools *also* depend
>> > on our build system both directly, and because we can't even supply a
>> > release for packagers to consume without a reasonable amount of direct
>> > install testing. To a good extent a standard looking C source tarball is
>> > pretty much the equivalent of a jar or jar + pom file in the Java world,
>> > it's really the only platform independent means of distribution we have.
>> >
>> >
>> It would be helpful if you could enumerate requirements which you believe
>> to be missing and add the to the existing wiki page.  I don't think anyone
>> is suggesting that the make install step should be broken in the source
>> tarball, so it's a little unclear to me the problem you are trying to
>> highlight above.
>>
>
> I believe it was suggested at one point that we not have a C source tarball
> but just export the entire tree as a single source tarball. This strictly
> speaking would not break the make install step, however it would have a
> serious impact on our ability to leverage others to test the C impl. Anyone
> downloading this would need to understand a great deal about the dual
> nature of proton and how it is structured just in order to know that they
> can ignore half the tree. Compare that with a standard C source tarball
> where I can hand it off to someone who knows nothing about proton and
> simply tell them to do a make install and then run one test script. Given
> the latter structure to our release artifacts there are *significantly*
> more resources we have access to in order to perform the testing necessary
> to do a quality release.
>

I'm not sure requiring "the ability to read a README file" is really
going to have a huge impact on our ability to leverage others. I'm not
sure how widespread CMake use is (it's certainly less familiar to me
than autotools) - certainly I expect people unfamiliar with cmake will
have to read the README anyway.



> I'll take a stab at distilling some requirements out of the above scenario
> and sticking them onto the wiki page, but I actually think the scenario
> itself is more important than the requirements. There's no disagreement
> that it would be nice to have a very standard looking C source tarball with
> minimal dependencies and so forth that can be used in the above manner,
> it's simply the relative priority of the requirement when it conflicts with
> developer convenience that is a source of contention.
>
>
>>
>> > It's also probably worth noting that perhaps the biggest issue with
>> system
>> > tests in Java is not so much imposing maven on proton-c developers, but
>> > the fact that Java may not be available on all the platforms that proton-c
>> > needs to be tested on. My primary concern here would be iOS. I'm not an
>> > expert, but my brief googling seems to suggest there would be significant
>> > issues.
>> >
>> >
>> So, I think we probably need to consider what sort of tests are required,
>> and which languages it is appropriate to write any particular type of test
>> in.  For me tests in Java have some advantages over Python tests. Firstly
>> they allow interop tests between the two implementations within the same
>> process
>
>
> Can you elaborate on the benefits of this? It seems to me when it comes to
> interop testing that, to the extent you can get away with it, over the wire
> tests would be preferred. For example you could run proton-c on iOS via
> Java tests running on another system, to say nothing of testing against non
> proton implementations which would necessarily need to be over-the-wire.
>

The same arguments could be made about our existing tests. However
being able to run in process makes things much easier to automate - no
need to randomly assign ports, communicate randomly assigned ports to
the other process, etc.

>
>> and secondly they will also be able to be used against any future
>> pure JavaScript Proton implementation (something we have planned to do but
>> not yet embarked upon).
>
>
> This is also true of python tests. In fact the whole point of a python test
> suite is that you can run python, java, and javascript all within the same
> JVM using purely Java tooling.

Except then we would have two levels of API mangling between the test
and the implementation.

Back when the tests were written against the non-idiomatic Python API
this wasn't so bad as there was very little code actually in the
Python "binding".  Now there is a Python object model, and then a Java
object model.  Keith has already pointed out at least one instance
where the Python "binding" is doing things in its object
initialization that are not replicated in other bindings. In general
we are now in the situation where it's very hard to work out if a bug
is in the implementation or the binding.

Frankly my experience of the current Python tests has been very poor.
Partly this is because of the nature of the tests which tend to be
monolithic and unclear in what they are testing and why the expected
outcomes are "correct"; partly it is because attempting to debug
python through jython to Java turns out to be unpleasant and slow.
Moreover the test framework we have for Python is bespoke and has no
other tooling support, which makes integration hard.

Now obviously the answer to not liking the tests is to write some of
my own.  That is what we are planning to do, but we are planning to do
so in Java.  These tests will be able to be run against the C through
the JNI binding so rather than making these tests "java only" anyone
who has access to a JVM will be able to run them against any
implementation which has a binding to the Java API.

>
>  A third issue for me is that when we start to
>> attempt more granular testing of things such as error handling, I will want
>> to ensure that the user experience is identical between the pure Java and
>> JNI binding implementations of the Java Proton API... if the tests are
>> being run through a second translation into the Python API then this is not
>> easily verifiable.
>>
>>
> Why is this not verifiable? The proton API in both python and Java have
> largely identical object models, and both languages support nearly
> identical exception models.
>
> I would expect that testing for specific
> exceptions from the Python API would quite directly force you to translate
> those exceptions with high granularity from the Java API, and then you
> could run the negative test suite against both pure Java and Java/JNI.
>
>
>> As a final aside, on the standard development environment many of us have
>> to work with, the installed version of Python is too old to support the
>> current Python client (lack of UUID, etc).
>>
>
> It's true there are a few pieces we depend on that aren't available in
> python 2.4, however the qpid python client has a compat layer that provides
> these pieces if they are not there and makes the 2.4 vs new python
> experience identical. Pulling this into proton would be quite trivial.
>
>
>>
>> Personally I think the more tests we have the better, and it's more
>> important to encourage people to write tests than to force the use of a
>> particular language to write them in.  I'd also suggest that we should be
>> writing at least some tests for each of the idiomatic bindings.
>>
>
> I never said we shouldn't write tests, however as you say above it's
> important to understand what kind of tests need to be written where.
> Writing engine/protocol tests in every binding has very limited utility
> since exercising the engine from one binding will run the exact same code
> as exercising it from another, however writing tests for a binding that
> verify that types are decoded properly is quite useful as those tests would
> actually be exercising the binding code itself which is unique to the
> binding language.

Actually I think we need some way to verify that for each binding
every call to the API presented through the binding actually works.
If a change was made to the .h file tomorrow with no corresponding
change to the PHP / Ruby / Perl binding... how would we know that
those bindings had been broken?
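One lightweight way to catch the kind of drift described above would be to extract the declared function names from the C header and compare them against what a binding wraps. A sketch only: the declarations and the `wrapped` set below are made-up examples, not the real proton header or any actual binding.

```python
import re

# Illustrative fragment of a C API header; in practice the real .h file
# would be read from disk.
header = """
pn_connection_t *pn_connection(void);
void pn_connection_free(pn_connection_t *connection);
int pn_transport_bind(pn_transport_t *transport, pn_connection_t *connection);
"""

# Names a (hypothetical) binding actually wraps.
wrapped = {"pn_connection", "pn_connection_free"}

# Match identifiers immediately followed by '(' so typedef names such as
# pn_connection_t are not picked up.
declared = set(re.findall(r"\b(pn_\w+)\s*\(", header))
missing = declared - wrapped
print("unwrapped C functions:", sorted(missing))  # -> ['pn_transport_bind']
```

A check like this could run as part of each binding's test suite, so an addition to the header with no corresponding binding change fails loudly.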

> As you point out, checking that exceptions are mapped
> into a specific binding in a particular way is also binding specific in the
> general case, however I think in the specific case of the Java binding a
> test suite written in python could easily be run against both the Java and
> JNI implementations and provide us with a high degree of confidence that
> they behave identically. The only case where behaviour could differ is if
> the python binding were to merge two error conditions into one, but this
> would presumably just be a bug as the python binding itself would be losing
> relevant information also.
>
> In general I'm quite interested in what you think fundamentally can't be
> tested from python vs what you think could be but is cumbersome or simply
> blocked for reasons X, Y, and Z. I think a clear statement of this would
> help identify areas we could improve (e.g. python 2.4 support) and also
> help inform us what kind of Java tests should be encouraged.
>

I'm not saying that things can't be tested from Python (with enough
thunking layers in between).  I think the same is also true of Ruby.
Possibly we could even write all our tests in C and then write a sort
of reverse JNI mapping where instead we wrap the Java code within a C
library.  I'm just stating that I think there is value in adding tests
in Java that can be run against both the pure Java and C
implementations, and that in doing so we will improve the testing
coverage and better define the API.

My personal experience of the current set up has been (very) poor.
Attempting to implement proton-j by reference to the tests was a
deeply frustrating and depressing experience, and the move from the
straight SWIG binding to the idiomatic Python API made the experience
much worse.  We are now attempting to test the Java implementation
through a Python binding to Java which we do not purport to support,
using a Python API that is barely documented.  Testing using Java will
be testing against a Java-C binding we do support and a Java
implementation we do support.  Given the lack of JVMs on some
platforms, clearly testing only in Java would not be sufficient...
however I don't believe anyone is suggesting that we remove or
deprecate the Python tests.

-- Rob

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Wed, Jan 23, 2013 at 6:10 PM, Rob Godfrey <ro...@gmail.com>wrote:

> On 23 January 2013 19:09, Rafael Schloming <rh...@alum.mit.edu> wrote:
>
> > I've added another wiki page that documents the proton release steps as
> > best I can remember. I'll update it more during the 0.4 release:
> > https://cwiki.apache.org/confluence/display/qpid/Proton+Release+Steps
> >
> > I think it's important to understand the overall release and testing
> > process as it is a significant and perhaps underrepresented factor
> against
> > which to measure any proposals. I believe the build system requirements
> > documented below are inherently incomplete as they don't recognize the
> fact
> > that the C build system is not just a developer productivity tool, it is
> > also the installer for our end users. And before anyone says our end
> users
> > will just use yum or equivalents, all those packaging tools *also* depend
> > on our build system both directly, and because we can't even supply a
> > release for packagers to consume without a reasonable amount of direct
> > install testing. To a good extent a standard looking C source tarball is
> > pretty much the equivalent of a jar or jar + pom file in the Java world,
> > it's really the only platform independent means of distribution we have.
> >
> >
> It would be helpful if you could enumerate requirements which you believe
> to be missing and add them to the existing wiki page.  I don't think anyone
> is suggesting that the make install step should be broken in the source
> tarball, so it's a little unclear to me the problem you are trying to
> highlight above.
>

I believe it was suggested at one point that we not have a C source tarball
but just export the entire tree as a single source tarball. This strictly
speaking would not break the make install step, however it would have a
serious impact on our ability to leverage others to test the C impl. Anyone
downloading this would need to understand a great deal about the dual
nature of proton and how it is structured just in order to know that they
can ignore half the tree. Compare that with a standard C source tarball
where I can hand it off to someone who knows nothing about proton and
simply tell them to do a make install and then run one test script. Given
the latter structure to our release artifacts there are *significantly*
more resources we have access to in order to perform the testing necessary
to do a quality release.

I'll take a stab at distilling some requirements out of the above scenario
and sticking them onto the wiki page, but I actually think the scenario
itself is more important than the requirements. There's no disagreement
that it would be nice to have a very standard looking C source tarball with
minimal dependencies and so forth that can be used in the above manner,
it's simply the relative priority of the requirement when it conflicts with
developer convenience that is a source of contention.


>
> > It's also probably worth noting that perhaps the biggest issue with
> system
> > tests in Java is not so much imposing maven on proton-c developers, but
> > the
>
> > fact that Java may not be available on all the platforms that proton-c
> > needs to be tested on. My primary concern here would be iOS. I'm not an
> > expert, but my brief googling seems to suggest there would be significant
> > issues.
> >
> >
> So, I think we probably need to consider what sort of tests are required,
> and which languages it is appropriate to write any particular type of test
> in.  For me tests in Java have some advantages over Python tests. Firstly
> they allow interop tests between the two implementations within the same
> process


Can you elaborate on the benefits of this? It seems to me when it comes to
interop testing that, to the extent you can get away with it, over the wire
tests would be preferred. For example you could run proton-c on iOS via
Java tests running on another system, to say nothing of testing against non
proton implementations which would necessarily need to be over-the-wire.


> and secondly they will also be able to be used against any future
> pure JavaScript Proton implementation (something we have planned to do but
> not yet embarked upon).


This is also true of python tests. In fact the whole point of a python test
suite is that you can run python, java, and javascript all within the same
JVM using purely Java tooling.

 A third issue for me is that when we start to
> attempt more granular testing of things such as error handling, I will want
> to ensure that the user experience is identical between the pure Java and
> JNI binding implementations of the Java Proton API... if the tests are
> being run through a second translation into the Python API then this is not
> easily verifiable.
>
>
Why is this not verifiable? The proton API in both python and Java have
largely identical object models, and both languages support nearly
identical exception models. I would expect that testing for specific
exceptions from the Python API would quite directly force you to translate
those exceptions with high granularity from the Java API, and then you
could run the negative test suite against both pure Java and Java/JNI.


> As a final aside, on the standard development environment many of us have
> to work with, the installed version of Python is too old to support the
> current Python client (lack of UUID, etc).
>

It's true there are a few pieces we depend on that aren't available in
python 2.4, however the qpid python client has a compat layer that provides
these pieces if they are not there and makes the 2.4 vs new python
experience identical. Pulling this into proton would be quite trivial.
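The shape of such a compat layer can be sketched as follows: prefer the stdlib module and fall back to a stand-in with the same interface when it is absent. This fallback is illustrative only, an assumption about the approach, not the actual qpid compat code.

```python
# Prefer the stdlib uuid module (absent in python 2.4); otherwise fall
# back to a minimal random-based stand-in with the same textual format.
try:
    from uuid import uuid4
except ImportError:
    import random

    def uuid4():
        # 32 random hex digits in the 8-4-4-4-12 layout used by UUIDs.
        hexdigits = "".join(random.choice("0123456789abcdef")
                            for _ in range(32))
        return "%s-%s-%s-%s-%s" % (hexdigits[:8], hexdigits[8:12],
                                   hexdigits[12:16], hexdigits[16:20],
                                   hexdigits[20:])

print(len(str(uuid4())))  # a UUID string is 36 characters
```

Callers then import `uuid4` from the compat module and behave identically on old and new interpreters.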


>
> Personally I think the more tests we have the better, and it's more
> important to encourage people to write tests than to force the use of a
> particular language to write them in.  I'd also suggest that we should be
> writing at least some tests for each of the idiomatic bindings.
>

I never said we shouldn't write tests, however as you say above it's
important to understand what kind of tests need to be written where.
Writing engine/protocol tests in every binding has very limited utility
since exercising the engine from one binding will run the exact same code
as exercising it from another, however writing tests for a binding that
verify that types are decoded properly is quite useful as those tests would
actually be exercising the binding code itself which is unique to the
binding language. As you point out, checking that exceptions are mapped
into a specific binding in a particular way is also binding specific in the
general case, however I think in the specific case of the Java binding a
test suite written in python could easily be run against both the Java and
JNI implementations and provide us with a high degree of confidence that
they behave identically. The only case where behaviour could differ is if
the python binding were to merge two error conditions into one, but this
would presumably just be a bug as the python binding itself would be losing
relevant information also.

In general I'm quite interested in what you think fundamentally can't be
tested from python vs what you think could be but is cumbersome or simply
blocked for reasons X, Y, and Z. I think a clear statement of this would
help identify areas we could improve (e.g. python 2.4 support) and also
help inform us what kind of Java tests should be encouraged.

--Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rob Godfrey <ro...@gmail.com>.
On 23 January 2013 19:09, Rafael Schloming <rh...@alum.mit.edu> wrote:

> I've added another wiki page that documents the proton release steps as
> best I can remember. I'll update it more during the 0.4 release:
> https://cwiki.apache.org/confluence/display/qpid/Proton+Release+Steps
>
> I think it's important to understand the overall release and testing
> process as it is a significant and perhaps underrepresented factor against
> which to measure any proposals. I believe the build system requirements
> documented below are inherently incomplete as they don't recognize the fact
> that the C build system is not just a developer productivity tool, it is
> also the installer for our end users. And before anyone says our end users
> will just use yum or equivalents, all those packaging tools *also* depend
> on our build system both directly, and because we can't even supply a
> release for packagers to consume without a reasonable amount of direct
> install testing. To a good extent a standard looking C source tarball is
> pretty much the equivalent of a jar or jar + pom file in the Java world,
> it's really the only platform independent means of distribution we have.
>
>
It would be helpful if you could enumerate requirements which you believe
to be missing and add them to the existing wiki page.  I don't think anyone
is suggesting that the make install step should be broken in the source
tarball, so it's a little unclear to me the problem you are trying to
highlight above.


> It's also probably worth noting that perhaps the biggest issue with system
> tests in Java is not so much imposing maven on proton-c developers, but the
> fact that Java may not be available on all the platforms that proton-c
> needs to be tested on. My primary concern here would be iOS. I'm not an
> expert, but my brief googling seems to suggest there would be significant
> issues.
>
>
So, I think we probably need to consider what sort of tests are required,
and which languages it is appropriate to write any particular type of test
in.  For me tests in Java have some advantages over Python tests. Firstly
they allow interop tests between the two implementations within the same
process and secondly they will also be able to be used against any future
pure JavaScript Proton implementation (something we have planned to do but
not yet embarked upon).  A third issue for me is that when we start to
attempt more granular testing of things such as error handling, I will want
to ensure that the user experience is identical between the pure Java and
JNI binding implementations of the Java Proton API... if the tests are
being run through a second translation into the Python API then this is not
easily verifiable.

As a final aside, on the standard development environment many of us have
to work with, the installed version of Python is too old to support the
current Python client (lack of UUID, etc).

Personally I think the more tests we have the better, and it's more
important to encourage people to write tests than to force the use of a
particular language to write them in.  I'd also suggest that we should be
writing at least some tests for each of the idiomatic bindings.

-- Rob

--Rafael
>
> On Wed, Jan 23, 2013 at 12:45 PM, Phil Harvey <phil@philharveyonline.com
> >wrote:
>
> > In case anyone has missed it, note that Gordon has added some relevant
> > comments directly on the wiki pages:
> >
> >
> >
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
> >
> >
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+proposals
> >
> > Phil
> >
> >
> > On 23 January 2013 13:01, Keith W <ke...@gmail.com> wrote:
> >
> > > >> What are people's views on the relative priority of these
> > requirements?
> > > >> Are there any I've missed?  I think answering these questions is a
> > > >> prerequisite for agreeing the technical solution.
> > >
> > > With the aim of stimulating discussion regarding our requirements and
> > > to reach a consensus, I've classified each of the proposed
> > > requirements into whether I believe each is essential, neutral or
> > > detrimental to the smooth development of Proton.
> > >
> > > (proposed requirement numbers from
> > >
> > >
> >
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
> > > )
> > >
> > > Essential
> > >
> > > 3. To change proton-api, all that is required is to edit a Java file.
> > > - Developer productivity
> > >
> > > 4. To switch to a particular SVN revision, simple SVN commands are run
> > > (e.g. svn switch or svn update)
> > > - Developer productivity
> > >
> > > 5. proton-c can be built, excluding its JNI binding, without requiring
> > > non-standard tools*
> > > 6. proton-c can be built, excluding its JNI binding, from a standalone
> > > checkout of the proton-c directory
> > > - Developer productivity / tool familiarity
> > >
> > > Neutral
> > >
> > > 1. A "tarball" source release of proton-c can be built by a user
> > > without an external dependency on any other part of proton, e.g.
> > > proton-api.
> > > 2. The aforementioned proton-c tarball release can be produced by
> > > performing a simple "svn export" of proton-c.
> > > - If I were building proton-c for my platform from a tarball, I would
> > > also want to run the tests to be sure proton-c functions correctly.
> > > For this reason I question the usefulness of a proton-c tarball.  I
> > > would want a tarball that included the whole tree including the tests.
> > >
> > > 7. Proton-c can be built without requiring non-standard tools*
> > > 9. Proton-c can be tested without requiring non-standard tools*
> > >  - If we can achieve this without introducing too much complexity,
> > > reinventing too many wheels and the result is portable across all
> > > target platforms.
> > >
> > > Detrimental
> > >
> > > 8. proton-c can be built from a standalone checkout of the proton-c
> > > directory
> > >  - I think that all proton developers who are changing either the C or
> > > Java implementations should be running the system tests before each
> > > commit.  If they are changing system tests then they need to run
> > > against both implementations before each commit.
> > >
> > > On 22 January 2013 17:09, Rafael Schloming <rh...@alum.mit.edu> wrote:
> > > > Thanks for posting this, I think it's a very useful step. I'd suggest
> > > > adding another Stakeholder -- someone testing a release artifact. Rob
> > > makes
> > > > a good point that the release manager is a distinct view, but I think
> > the
> > > > desire to minimize deltas between the svn tree and the release
> > artifacts
> > > is
> > > > most directly motivated by my experience *testing* release
> artifacts. I
> > > > remember going through qpid releases in the old days and having the
> > very
> > > > unpleasant experience of trying to remember from 8 or 10 months ago
> how
> > > > exactly stuff worked in the release artifact as compared to the build
> > > tree.
> > > > I very much like the fact that with a simple export I can be highly
> > > > confident that my experience of stuff working in my checkout
> translates
> > > > well to the release artifacts and testing them is a very familiar,
> > quick,
> > > > and easy process.
> > > >
> > > > Strictly speaking I think the requirement from a release management
> > > > perspective is purely that we can produce releases at the rate we
> need,
> > > so
> > > > it has to be quick and easy and robust to different environments,
> but I
> > > > wouldn't say the export thing is a requirement of the release manager
> > > per se. As many have pointed out we already use a script for this and
> > it
> > > > can remap things quite easily.
> > > >
> > > > I have more thoughts on the release process, especially as it is
> > somewhat
> > > > expanded now to produce java binaries and will need to expand more to
> > > > include windows stuff, however I need to run an errand at the moment.
> > > I'll
> > > > post and/or comment on the page later though.
> > > >
> > > > --Rafael
> > > >
> > > > I very much like the fact that our current release artifacts are
> > trivial
> > > >
> > > > On Tue, Jan 22, 2013 at 11:43 AM, Phil Harvey <
> > phil@philharveyonline.com
> > > >wrote:
> > > >
> > > >> It sounds like we're still a little way away from reaching a
> > consensus.
> > >  As
> > > >> a step towards this, I would like to clarify the relative priority
> of
> > > the
> > > >> various requirements that have come up.  I've therefore created a
> page
> > > on
> > > >> the wiki that lists them, with a child page briefly describing the
> > > various
> > > >> proposals.
> > > >>
> > > >>
> > > >>
> > >
> >
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
> > > >>
> > > >> What are people's views on the relative priority of these
> > requirements?
> > > >> Are there any I've missed?  I think answering these questions is a
> > > >> prerequisite for agreeing the technical solution.
> > > >>
> > > >> Phil
> > >
> >
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
I've added another wiki page that documents the proton release steps as
best I can remember. I'll update it more during the 0.4 release:
https://cwiki.apache.org/confluence/display/qpid/Proton+Release+Steps

I think it's important to understand the overall release and testing
process as it is a significant and perhaps underrepresented factor against
which to measure any proposals. I believe the build system requirements
documented below are inherently incomplete as they don't recognize the fact
that the C build system is not just a developer productivity tool, it is
also the installer for our end users. And before anyone says our end users
will just use yum or equivalents, all those packaging tools *also* depend
on our build system both directly, and because we can't even supply a
release for packagers to consume without a reasonable amount of direct
install testing. To a good extent a standard looking C source tarball is
pretty much the equivalent of a jar or jar + pom file in the Java world,
it's really the only platform independent means of distribution we have.

It's also probably worth noting that perhaps the biggest issue with system
tests in Java is not so much imposing maven on proton-c developers, but the
fact that Java may not be available on all the platforms that proton-c
needs to be tested on. My primary concern here would be iOS. I'm not an
expert, but my brief googling seems to suggest there would be significant
issues.

--Rafael

On Wed, Jan 23, 2013 at 12:45 PM, Phil Harvey <ph...@philharveyonline.com>wrote:

> In case anyone has missed it, note that Gordon has added some relevant
> comments directly on the wiki pages:
>
>
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
>
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+proposals
>
> Phil
>
>
> On 23 January 2013 13:01, Keith W <ke...@gmail.com> wrote:
>
> > >> What are people's views on the relative priority of these
> requirements?
> > >> Are there any I've missed?  I think answering these questions is a
> > >> prerequisite for agreeing the technical solution.
> >
> > With the aim of stimulating discussion regarding our requirements and
> > to reach a consensus, I've classified each of the proposed
> > requirements into whether I believe each is essential, neutral or
> > detrimental to the smooth development of Proton.
> >
> > (proposed requirement numbers from
> >
> >
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
> > )
> >
> > Essential
> >
> > 3. To change proton-api, all that is required is to edit a Java file.
> > - Developer productivity
> >
> > 4. To switch to a particular SVN revision, simple SVN commands are run
> > (e.g. svn switch or svn update)
> > - Developer productivity
> >
> > 5. proton-c can be built, excluding its JNI binding, without requiring
> > non-standard tools*
> > 6. proton-c can be built, excluding its JNI binding, from a standalone
> > checkout of the proton-c directory
> > - Developer productivity / tool familiarity
> >
> > Neutral
> >
> > 1. A "tarball" source release of proton-c can be built by a user
> > without an external dependency on any other part of proton, e.g.
> > proton-api.
> > 2. The aforementioned proton-c tarball release can be produced by
> > performing a simple "svn export" of proton-c.
> > - If I were building proton-c for my platform from a tarball, I would
> > also want to run the tests to be sure proton-c functions correctly.
> > For this reason I question the usefulness of a proton-c tarball.  I
> > would want a tarball that included the whole tree including the tests.
> >
> > 7. Proton-c can be built without requiring non-standard tools*
> > 9. Proton-c can be tested without requiring non-standard tools*
> >  - If we can achieve this without introducing too much complexity,
> > reinventing too many wheels and the result is portable across all
> > target platforms.
> >
> > Detrimental
> >
> > 8. proton-c can be built from a standalone checkout of the proton-c
> > directory
> >  - I think that all proton developers who are changing either the C or
> > Java implementations should be running the system tests before each
> > commit.  If they are changing system tests then they need to run
> > against both implementations before each commit.
> >
> > On 22 January 2013 17:09, Rafael Schloming <rh...@alum.mit.edu> wrote:
> > > Thanks for posting this, I think it's a very useful step. I'd suggest
> > > adding another Stakeholder -- someone testing a release artifact. Rob
> > makes
> > > a good point that the release manager is a distinct view, but I think
> the
> > > desire to minimize deltas between the svn tree and the release
> artifacts
> > is
> > > most directly motivated by my experience *testing* release artifacts. I
> > > remember going through qpid releases in the old days and having the
> very
> > > unpleasant experience of trying to remember from 8 or 10 months ago how
> > > exactly stuff worked in the release artifact as compared to the build
> > tree.
> > > I very much like the fact that with a simple export I can be highly
> > > confident that my experience of stuff working in my checkout translates
> > > well to the release artifacts and testing them is a very familiar,
> quick,
> > > and easy process.
> > >
> > > Strictly speaking I think the requirement from a release management
> > > perspective is purely that we can produce releases at the rate we need,
> > so
> > > it has to be quick and easy and robust to different environments, but I
> > > wouldn't say the export thing is a requirement of the release manager
> > > per se. As many have pointed out we already use a script for this and
> it
> > > can remap things quite easily.
> > >
> > > I have more thoughts on the release process, especially as it is
> somewhat
> > > expanded now to produce java binaries and will need to expand more to
> > > include windows stuff, however I need to run an errand at the moment.
> > I'll
> > > post and/or comment on the page later though.
> > >
> > > --Rafael
> > >
> > > I very much like the fact that our current release artifacts are
> trivial
> > >
> > > On Tue, Jan 22, 2013 at 11:43 AM, Phil Harvey <
> phil@philharveyonline.com
> > >wrote:
> > >
> > >> It sounds like we're still a little way away from reaching a
> consensus.
> >  As
> > >> a step towards this, I would like to clarify the relative priority of
> > the
> > >> various requirements that have come up.  I've therefore created a page
> > on
> > >> the wiki that lists them, with a child page briefly describing the
> > various
> > >> proposals.
> > >>
> > >>
> > >>
> >
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
> > >>
> > >> What are people's views on the relative priority of these
> requirements?
> > >> Are there any I've missed?  I think answering these questions is a
> > >> prerequisite for agreeing the technical solution.
> > >>
> > >> Phil
> >
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Phil Harvey <ph...@philharveyonline.com>.
In case anyone has missed it, note that Gordon has added some relevant
comments directly on the wiki pages:

https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+proposals

Phil


On 23 January 2013 13:01, Keith W <ke...@gmail.com> wrote:

> >> What are people's views on the relative priority of these requirements?
> >> Are there any I've missed?  I think answering these questions is a
> >> prerequisite for agreeing the technical solution.
>
> With the aim of stimulating discussion regarding our requirements and
> to reach a consensus, I've classified each of the proposed
> requirements into whether I believe each is essential, neutral or
> detrimental to the smooth development of Proton.
>
> (proposed requirement numbers from
>
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
> )
>
> Essential
>
> 3. To change proton-api, all that is required is to edit a Java file.
> - Developer productivity
>
> 4. To switch to a particular SVN revision, simple SVN commands are run
> (e.g. svn switch or svn update)
> - Developer productivity
>
> 5. proton-c can be built, excluding its JNI binding, without requiring
> non-standard tools*
> 6. proton-c can be built, excluding its JNI binding, from a standalone
> checkout of the proton-c directory
> - Developer productivity / tool familiarity
>
> Neutral
>
> 1. A "tarball" source release of proton-c can be built by a user
> without an external dependency on any other part of proton, e.g.
> proton-api.
> 2. The aforementioned proton-c tarball release can be produced by
> performing a simple "svn export" of proton-c.
> - If I were building proton-c for my platform from a tarball, I would
> also want to run the tests to be sure proton-c functions correctly.
> For this reason I question the usefulness of a proton-c tarball.  I
> would want a tarball that included the whole tree including the tests.
>
> 7. Proton-c can be built without requiring non-standard tools*
> 9. Proton-c can be tested without requiring non-standard tools*
>  - Acceptable, provided we can achieve this without introducing too much
> complexity or reinventing too many wheels, and the result is portable
> across all target platforms.
>
> Detrimental
>
> 8. proton-c can be built from a standalone checkout of the proton-c
> directory
>  - I think that all proton developers who are changing either the C or
> Java implementations should be running the system tests before each
> commit.  If they are changing system tests then they need to run
> against both implementations before each commit.
>
> On 22 January 2013 17:09, Rafael Schloming <rh...@alum.mit.edu> wrote:
> > Thanks for posting this, I think it's a very useful step. I'd suggest
> > adding another Stakeholder -- someone testing a release artifact. Rob
> makes
> > a good point that the release manager is a distinct view, but I think the
> > desire to minimize deltas between the svn tree and the release artifacts
> is
> > most directly motivated by my experience *testing* release artifacts. I
> > remember going through qpid releases in the old days and having the very
> > unpleasant experience of trying to remember from 8 or 10 months ago how
> > exactly stuff worked in the release artifact as compared to the build
> tree.
> > I very much like the fact that with a simple export I can be highly
> > confident that my experience of stuff working in my checkout translates
> > well to the release artifacts and testing them is a very familiar, quick,
> > and easy process.
> >
> > Strictly speaking I think the requirement from a release management
> > perspective is purely that we can produce releases at the rate we need,
> so
> > it has to be quick and easy and robust to different environments, but I
> > wouldn't say the export thing is a requirement of the release manager
> > per se. As many have pointed out, we already use a script for this and it
> > can remap things quite easily.
> >
> > I have more thoughts on the release process, especially as it is somewhat
> > expanded now to produce java binaries and will need to expand more to
> > include windows stuff, however I need to run an errand at the moment.
> I'll
> > post and/or comment on the page later though.
> >
> > --Rafael
> >
> >
> > On Tue, Jan 22, 2013 at 11:43 AM, Phil Harvey <phil@philharveyonline.com> wrote:
> >
> >> It sounds like we're still a little way away from reaching a consensus.
>  As
> >> a step towards this, I would like to clarify the relative priority of
> the
> >> various requirements that have come up.  I've therefore created a page
> on
> >> the wiki that lists them, with a child page briefly describing the
> various
> >> proposals.
> >>
> >>
> >>
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
> >>
> >> What are people's views on the relative priority of these requirements?
> >> Are there any I've missed?  I think answering these questions is a
> >> prerequisite for agreeing the technical solution.
> >>
> >> Phil
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Keith W <ke...@gmail.com>.
>> What are people's views on the relative priority of these requirements?
>> Are there any I've missed?  I think answering these questions is a
>> prerequisite for agreeing the technical solution.

With the aim of stimulating discussion regarding our requirements and
to reach a consensus, I've classified each of the proposed
requirements into whether I believe each is essential, neutral or
detrimental to the smooth development of Proton.

(proposed requirement numbers from
https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
)

Essential

3. To change proton-api, all that is required is to edit a Java file.
- Developer productivity

4. To switch to a particular SVN revision, simple SVN commands are run
(e.g. svn switch or svn update)
- Developer productivity

5. proton-c can be built, excluding its JNI binding, without requiring
non-standard tools*
6. proton-c can be built, excluding its JNI binding, from a standalone
checkout of the proton-c directory
- Developer productivity / tool familiarity

Neutral

1. A "tarball" source release of proton-c can be built by a user
without an external dependency on any other part of proton, e.g.
proton-api.
2. The aforementioned proton-c tarball release can be produced by
performing a simple "svn export" of proton-c.
- If I were building proton-c for my platform from a tarball, I would
also want to run the tests to be sure proton-c functions correctly.
For this reason I question the usefulness of a proton-c tarball.  I
would want a tarball that included the whole tree including the tests.

7. Proton-c can be built without requiring non-standard tools*
9. Proton-c can be tested without requiring non-standard tools*
 - Acceptable, provided we can achieve this without introducing too much
complexity or reinventing too many wheels, and the result is portable
across all target platforms.

Detrimental

8. proton-c can be built from a standalone checkout of the proton-c
directory
 - I think that all proton developers who are changing either the C or
Java implementations should be running the system tests before each
commit.  If they are changing system tests then they need to run
against both implementations before each commit.
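
For concreteness, the "simple SVN commands" of requirement 4 and the "svn
export" of requirement 2 amount to something like the following (the
revision number and repository paths here are illustrative only, not the
actual proton repository layout):

```shell
# Illustrative sketch: revision numbers and repository paths are made up.

# Requirement 4: move an existing checkout to a particular revision or branch
svn update -r 1435000
svn switch ^/proton/branches/jni-binding

# Requirement 2: produce a proton-c source tarball with a simple export
svn export ^/proton/trunk/proton-c proton-c
tar -czf proton-c.tar.gz proton-c
```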

On 22 January 2013 17:09, Rafael Schloming <rh...@alum.mit.edu> wrote:
> Thanks for posting this, I think it's a very useful step. I'd suggest
> adding another Stakeholder -- someone testing a release artifact. Rob makes
> a good point that the release manager is a distinct view, but I think the
> desire to minimize deltas between the svn tree and the release artifacts is
> most directly motivated by my experience *testing* release artifacts. I
> remember going through qpid releases in the old days and having the very
> unpleasant experience of trying to remember from 8 or 10 months ago how
> exactly stuff worked in the release artifact as compared to the build tree.
> I very much like the fact that with a simple export I can be highly
> confident that my experience of stuff working in my checkout translates
> well to the release artifacts and testing them is a very familiar, quick,
> and easy process.
>
> Strictly speaking I think the requirement from a release management
> perspective is purely that we can produce releases at the rate we need, so
> it has to be quick and easy and robust to different environments, but I
> wouldn't say the export thing is a requirement of the release manager
> per se. As many have pointed out, we already use a script for this and it
> can remap things quite easily.
>
> I have more thoughts on the release process, especially as it is somewhat
> expanded now to produce java binaries and will need to expand more to
> include windows stuff, however I need to run an errand at the moment. I'll
> post and/or comment on the page later though.
>
> --Rafael
>
>
> On Tue, Jan 22, 2013 at 11:43 AM, Phil Harvey <ph...@philharveyonline.com> wrote:
>
>> It sounds like we're still a little way away from reaching a consensus.  As
>> a step towards this, I would like to clarify the relative priority of the
>> various requirements that have come up.  I've therefore created a page on
>> the wiki that lists them, with a child page briefly describing the various
>> proposals.
>>
>>
>> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
>>
>> What are people's views on the relative priority of these requirements?
>> Are there any I've missed?  I think answering these questions is a
>> prerequisite for agreeing the technical solution.
>>
>> Phil

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
Thanks for posting this, I think it's a very useful step. I'd suggest
adding another Stakeholder -- someone testing a release artifact. Rob makes
a good point that the release manager is a distinct view, but I think the
desire to minimize deltas between the svn tree and the release artifacts is
most directly motivated by my experience *testing* release artifacts. I
remember going through qpid releases in the old days and having the very
unpleasant experience of trying to remember from 8 or 10 months ago how
exactly stuff worked in the release artifact as compared to the build tree.
I very much like the fact that with a simple export I can be highly
confident that my experience of stuff working in my checkout translates
well to the release artifacts and testing them is a very familiar, quick,
and easy process.

Strictly speaking I think the requirement from a release management
perspective is purely that we can produce releases at the rate we need, so
it has to be quick and easy and robust to different environments, but I
wouldn't say the export thing is a requirement of the release manager
per se. As many have pointed out, we already use a script for this and it
can remap things quite easily.

I have more thoughts on the release process, especially as it is somewhat
expanded now to produce java binaries and will need to expand more to
include windows stuff, however I need to run an errand at the moment. I'll
post and/or comment on the page later though.

--Rafael


On Tue, Jan 22, 2013 at 11:43 AM, Phil Harvey <ph...@philharveyonline.com> wrote:

> It sounds like we're still a little way away from reaching a consensus.  As
> a step towards this, I would like to clarify the relative priority of the
> various requirements that have come up.  I've therefore created a page on
> the wiki that lists them, with a child page briefly describing the various
> proposals.
>
>
> https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements
>
> What are people's views on the relative priority of these requirements?
> Are there any I've missed?  I think answering these questions is a
> prerequisite for agreeing the technical solution.
>
> Phil
>
>
> On 22 January 2013 13:34, Rob Godfrey <ro...@gmail.com> wrote:
>
> > On 22 January 2013 13:47, Rafael Schloming <rh...@alum.mit.edu> wrote:
> >
> > > On Tue, Jan 22, 2013 at 4:22 AM, Rob Godfrey <rob.j.godfrey@gmail.com> wrote:
> > >
> > > > On 21 January 2013 18:05, Rafael Schloming <rh...@alum.mit.edu> wrote:
> > > >
> > > > > On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey <rob.j.godfrey@gmail.com> wrote:
> > > > >
> > > > > > Ummm... it's a dependency... you're familiar with those, yeah?
> > > > > >
> > > > > > The same way that the Qpid JMS clients depend on a JMS API jar,
> for
> > > > which
> > > > > > the source is readily available from another source. The JNI
> > binding
> > > > > would
> > > > > > build if the dependency was installed.  The same way I believe
> the
> > > SSL
> > > > > code
> > > > > > in the core of proton-c builds if the dependency for it is
> > installed.
> > > > > >
> > > > >
> > > > > That's not really a proper analogy. Again the JMS interfaces are
> > > defined
> > > > > outside of qpid. We don't release them, and we depend only on a
> well
> > > > > defined version of them, we don't share a release cycle with them.
> If
> > > the
> > > > > JMS API was something that we developed/defined right alongside the
> > > impl
> > > > > and was part of the same release process, we would certainly not be
> > > > allowed
> > > > > to release without the source.
> > > > >
> > > > >
> > > > This "releasing without the source" is a complete red herring and you
> > > know
> > > > it.  The source is released in whichever scheme we settle upon.
> > > >
> > > > If you want an example of dependencies within the qpid project, how
> did
> > > the
> > > > AMQP 1.0 work on the C++ broker get released for 0.20?  Did all the
> > > proton
> > > > source get released with the C++ Broker / client?  In the future are
> > you
> > > > expecting every part of the Qpid project which depends on proton to
> > > include
> > > > its full source?  If yes then how is the source tree going to work -
> is
> > > > everything to be a subdirectory of proton-c?
> > > >
> > >
> > > Again that's not really the same. If the Java API were on a separate
> > > (staggered) release cycle and the dependency was on a specific version,
> > > then that would be the same, but for what we're discussing, it really
> > > isn't. Proton and the cpp broker live under different trunks and
> > > branch/release separately, as far as I know this is not what you're
> > > proposing for the Java API, it is to live under the same trunk and
> > > branch/release together.
> > >
> > >
> > The point was that the source code doesn't need to be in the same tarball
> > let alone the same subdirectory in source control. If one considers that
> > the Java API is a dependency then whether it is released concurrently or
> > not with the JNI binding is moot.
> >
> > I've already said that it is preferable to have the source within the
> same
> > tarball for the source release, but if needs be then I can live with the
> > strict dependency view of things.
> >
> >
> > >
> > > >
> > > > I agree that having the source for the version of the Java API
> included
> > > in
> > > > the source release bundle is advantageous. But if the collective
> > decision
> > > > is that we have taken a religious position that the source tarballs
> can
> > > > only be svn exports of subdirectories of our source tree, then my
> > > > preference would be to use separated dependencies over duplication in
> > the
> > > > repository.  Personally I would think that having a more flexible
> > policy
> > > on
> > > > constructing the release source tarballs would make a lot more sense.
> > > >
> > >
> > > You can call it religious if you like, but I don't think there is
> > anything
> > > invalid about wanting to keep a simple mapping between release
> artifacts
> > > and proton developer environment. In the past we have had quite direct
> > > experience of exactly this factor contributing to very poor out of the
> > box
> > > experience for users. Correct me if I'm wrong, but I believe you
> yourself
> > > have actually advocated (or at least agreed with) this position in the
> > > past.
> > >
> > > That said, I don't think I'm asking for us to be entirely inflexible in
> > > that regard. There really are two opposing concerns here, one being the
> > > user experience for our release artifacts, and the other being the
> > > convenience of the development process for proton developers.
> >
> >
> > I actually think there are three perspectives here.  The user experience of
> > our release artefacts, the committer experience of working on the
> > checked-out codebase, and the release manager view of preparing the release
> > artefacts from the source control.
> >
> >
> > > All I'm
> > > asking is that we recognize that there is a real tradeoff and be
> willing
> > to
> > > explore options that might preserve user experience albeit at a
> hopefully
> > > minor cost to developer convenience.
> >
> >
> > I think the issue here is that if you take a rigid view of how the
> release
> > artefacts correspond to the source repo directories then you impose
> > different trade-offs compared to not imposing those constraints.
> >
> > I'd be very happy with a solution that provided source tarballs with
> > exactly what the users need to build, and the source tree containing
> > everything exactly once, if the cost was that the release process had
> a
> > little more processing in it.  Frankly if the source tarballs were
> > constructed by taking a cut at the top level of proton (rather than at -j
> > or -c) and irrelevant parts were simply pruned, then I don't (personally)
> > see a big issue.
> >
> >
> > > For any other aspect of software
> > > engineering this would be a no-brainer, you start from the user
> > > requirements/user experience and work your way backwards to the
> simplest
> > > solution that achieves this, however this proposal and the related
> > > requirements JIRA make zero mention of any *user* requirements, merely
> > > developer requirements. This may be ok for Java where all your users
> will
> > > get stuff via jars and the source tarball is mostly a formality, but
> for
> > C
> > > the situation is different.
> > >
> > >
> > The proposed solution in the JIRA implies that in order to build the JNI
> > binding then the builder would need to "install" the Java API (in the
> same
> > way that they would need to "install" Java, Python, etc).  This is a
> > slightly greater burden on the end user who is building the binding, and
> > one which I would wish to avoid. However the fact of having a dependency
> > for a certain part of the build does not seem out of line with the rest
> of
> > the process which a user would have to go through.  If the Java API were
> > not present then the JNI binding would simply be skipped in the build.
> >
> > As above, I personally prefer having the API code available in the source
> > tarball, but if the only way to do this would be to have two copies of
> the
> > source code checked in (even if there were some scripts to try to keep
> the
> > copies in sync), I would choose to omit the source from the tar and go
> with
> > the strict dependency.
> >
> > -- Rob
> >
> >
> > > --Rafael
> > >
> >
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Phil Harvey <ph...@philharveyonline.com>.
It sounds like we're still a little way away from reaching a consensus.  As
a step towards this, I would like to clarify the relative priority of the
various requirements that have come up.  I've therefore created a page on
the wiki that lists them, with a child page briefly describing the various
proposals.

https://cwiki.apache.org/confluence/display/qpid/Proton+build+system+requirements

What are people's views on the relative priority of these requirements?
Are there any I've missed?  I think answering these questions is a
prerequisite for agreeing the technical solution.

Phil


On 22 January 2013 13:34, Rob Godfrey <ro...@gmail.com> wrote:

> On 22 January 2013 13:47, Rafael Schloming <rh...@alum.mit.edu> wrote:
>
> > On Tue, Jan 22, 2013 at 4:22 AM, Rob Godfrey <rob.j.godfrey@gmail.com> wrote:
> >
> > > On 21 January 2013 18:05, Rafael Schloming <rh...@alum.mit.edu> wrote:
> > >
> > > > On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey <rob.j.godfrey@gmail.com> wrote:
> > > >
> > > > > Ummm... it's a dependency... you're familiar with those, yeah?
> > > > >
> > > > > The same way that the Qpid JMS clients depend on a JMS API jar, for
> > > which
> > > > > the source is readily available from another source. The JNI
> binding
> > > > would
> > > > > build if the dependency was installed.  The same way I believe the
> > SSL
> > > > code
> > > > > in the core of proton-c builds if the dependency for it is
> installed.
> > > > >
> > > >
> > > > That's not really a proper analogy. Again the JMS interfaces are
> > defined
> > > > outside of qpid. We don't release them, and we depend only on a well
> > > > defined version of them, we don't share a release cycle with them. If
> > the
> > > > JMS API was something that we developed/defined right alongside the
> > impl
> > > > and was part of the same release process, we would certainly not be
> > > allowed
> > > > to release without the source.
> > > >
> > > >
> > > This "releasing without the source" is a complete red herring and you
> > know
> > > it.  The source is released in whichever scheme we settle upon.
> > >
> > > If you want an example of dependencies within the qpid project, how did
> > the
> > > AMQP 1.0 work on the C++ broker get released for 0.20?  Did all the
> > proton
> > > source get released with the C++ Broker / client?  In the future are
> you
> > > expecting every part of the Qpid project which depends on proton to
> > include
> > > its full source?  If yes then how is the source tree going to work - is
> > > everything to be a subdirectory of proton-c?
> > >
> >
> > Again that's not really the same. If the Java API were on a separate
> > (staggered) release cycle and the dependency was on a specific version,
> > then that would be the same, but for what we're discussing, it really
> > isn't. Proton and the cpp broker live under different trunks and
> > branch/release separately, as far as I know this is not what you're
> > proposing for the Java API, it is to live under the same trunk and
> > branch/release together.
> >
> >
> The point was that the source code doesn't need to be in the same tarball
> let alone the same subdirectory in source control. If one considers that
> the Java API is a dependency then whether it is released concurrently or
> not with the JNI binding is moot.
>
> I've already said that it is preferable to have the source within the same
> tarball for the source release, but if needs be then I can live with the
> strict dependency view of things.
>
>
> >
> > >
> > > I agree that having the source for the version of the Java API included
> > in
> > > the source release bundle is advantageous. But if the collective
> decision
> > > is that we have taken a religious position that the source tarballs can
> > > only be svn exports of subdirectories of our source tree, then my
> > > preference would be to use separated dependencies over duplication in
> the
> > > repository.  Personally I would think that having a more flexible
> policy
> > on
> > > constructing the release source tarballs would make a lot more sense.
> > >
> >
> > You can call it religious if you like, but I don't think there is
> anything
> > invalid about wanting to keep a simple mapping between release artifacts
> > and proton developer environment. In the past we have had quite direct
> > experience of exactly this factor contributing to very poor out of the
> box
> > experience for users. Correct me if I'm wrong, but I believe you yourself
> > have actually advocated (or at least agreed with) this position in the
> > past.
> >
> > That said, I don't think I'm asking for us to be entirely inflexible in
> > that regard. There really are two opposing concerns here, one being the
> > user experience for our release artifacts, and the other being the
> > convenience of the development process for proton developers.
>
>
> I actually think there are three perspectives here.  The user experience of
> our release artefacts, the committer experience of working on the
> checked-out codebase, and the release manager view of preparing the release
> artefacts from the source control.
>
>
> > All I'm
> > asking is that we recognize that there is a real tradeoff and be willing
> to
> > explore options that might preserve user experience albeit at a hopefully
> > minor cost to developer convenience.
>
>
> I think the issue here is that if you take a rigid view of how the release
> artefacts correspond to the source repo directories then you impose
> different trade-offs compared to not imposing those constraints.
>
> I'd be very happy with a solution that provided source tarballs with
> exactly what the users need to build, and the source tree containing
> everything exactly once, if the cost was that the release process had a
> little more processing in it.  Frankly if the source tarballs were
> constructed by taking a cut at the top level of proton (rather than at -j
> or -c) and irrelevant parts were simply pruned, then I don't (personally)
> see a big issue.
>
>
> > For any other aspect of software
> > engineering this would be a no-brainer, you start from the user
> > requirements/user experience and work your way backwards to the simplest
> > solution that achieves this, however this proposal and the related
> > requirements JIRA make zero mention of any *user* requirements, merely
> > developer requirements. This may be ok for Java where all your users will
> > get stuff via jars and the source tarball is mostly a formality, but for
> C
> > the situation is different.
> >
> >
> The proposed solution in the JIRA implies that in order to build the JNI
> binding then the builder would need to "install" the Java API (in the same
> way that they would need to "install" Java, Python, etc).  This is a
> slightly greater burden on the end user who is building the binding, and
> one which I would wish to avoid. However the fact of having a dependency
> for a certain part of the build does not seem out of line with the rest of
> the process which a user would have to go through.  If the Java API were
> not present then the JNI binding would simply be skipped in the build.
>
> As above, I personally prefer having the API code available in the source
> tarball, but if the only way to do this would be to have two copies of the
> source code checked in (even if there were some scripts to try to keep the
> copies in sync), I would choose to omit the source from the tar and go with
> the strict dependency.
>
> -- Rob
>
>
> > --Rafael
> >
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rob Godfrey <ro...@gmail.com>.
On 22 January 2013 13:47, Rafael Schloming <rh...@alum.mit.edu> wrote:

> On Tue, Jan 22, 2013 at 4:22 AM, Rob Godfrey <rob.j.godfrey@gmail.com> wrote:
>
> > On 21 January 2013 18:05, Rafael Schloming <rh...@alum.mit.edu> wrote:
> >
> > > On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey <rob.j.godfrey@gmail.com> wrote:
> > >
> > > > Ummm... it's a dependency... you're familiar with those, yeah?
> > > >
> > > > The same way that the Qpid JMS clients depend on a JMS API jar, for
> > which
> > > > the source is readily available from another source. The JNI binding
> > > would
> > > > build if the dependency was installed.  The same way I believe the
> SSL
> > > code
> > > > in the core of proton-c builds if the dependency for it is installed.
> > > >
> > >
> > > That's not really a proper analogy. Again the JMS interfaces are
> defined
> > > outside of qpid. We don't release them, and we depend only on a well
> > > defined version of them, we don't share a release cycle with them. If
> the
> > > JMS API was something that we developed/defined right alongside the
> impl
> > > and was part of the same release process, we would certainly not be
> > allowed
> > > to release without the source.
> > >
> > >
> > This "releasing without the source" is a complete red herring and you
> know
> > it.  The source is released in whichever scheme we settle upon.
> >
> > If you want an example of dependencies within the qpid project, how did
> the
> > AMQP 1.0 work on the C++ broker get released for 0.20?  Did all the
> proton
> > source get released with the C++ Broker / client?  In the future are you
> > expecting every part of the Qpid project which depends on proton to
> include
> > its full source?  If yes then how is the source tree going to work - is
> > everything to be a subdirectory of proton-c?
> >
>
> Again that's not really the same. If the Java API were on a separate
> (staggered) release cycle and the dependency was on a specific version,
> then that would be the same, but for what we're discussing, it really
> isn't. Proton and the cpp broker live under different trunks and
> branch/release separately, as far as I know this is not what you're
> proposing for the Java API, it is to live under the same trunk and
> branch/release together.
>
>
The point was that the source code doesn't need to be in the same tarball
let alone the same subdirectory in source control. If one considers that
the Java API is a dependency then whether it is released concurrently or
not with the JNI binding is moot.

I've already said that it is preferable to have the source within the same
tarball for the source release, but if needs be then I can live with the
strict dependency view of things.


>
> >
> > I agree that having the source for the version of the Java API included
> in
> > the source release bundle is advantageous. But if the collective decision
> > is that we have taken a religious position that the source tarballs can
> > only be svn exports of subdirectories of our source tree, then my
> > preference would be to use separated dependencies over duplication in the
> > repository.  Personally I would think that having a more flexible policy
> on
> > constructing the release source tarballs would make a lot more sense.
> >
>
> You can call it religious if you like, but I don't think there is anything
> invalid about wanting to keep a simple mapping between release artifacts
> and proton developer environment. In the past we have had quite direct
> experience of exactly this factor contributing to very poor out of the box
> experience for users. Correct me if I'm wrong, but I believe you yourself
> have actually advocated (or at least agreed with) this position in the
> past.
>
> That said, I don't think I'm asking for us to be entirely inflexible in
> that regard. There really are two opposing concerns here, one being the
> user experience for our release artifacts, and the other being the
> convenience of the development process for proton developers.


I actually think there are three perspectives here.  The user experience of
our release artefacts, the committer experience of working on the
checked-out codebase, and the release manager view of preparing the release
artefacts from the source control.


> All I'm
> asking is that we recognize that there is a real tradeoff and be willing to
> explore options that might preserve user experience albeit at a hopefully
> minor cost to developer convenience.


I think the issue here is that if you take a rigid view of how the release
artefacts correspond to the source repo directories then you impose
different trade-offs compared to not imposing those constraints.

I'd be very happy with a solution that provided source tarballs with
exactly what the users need to build, and the source tree containing
everything exactly once, if the cost was that the release process had a
little more processing in it.  Frankly if the source tarballs were
constructed by taking a cut at the top level of proton (rather than at -j
or -c) and irrelevant parts were simply pruned, then I don't (personally)
see a big issue.


> For any other aspect of software
> engineering this would be a no-brainer, you start from the user
> requirements/user experience and work your way backwards to the simplest
> solution that achieves this, however this proposal and the related
> requirements JIRA make zero mention of any *user* requirements, merely
> developer requirements. This may be ok for Java where all your users will
> get stuff via jars and the source tarball is mostly a formality, but for C
> the situation is different.
>
>
The proposed solution in the JIRA implies that in order to build the JNI
binding then the builder would need to "install" the Java API (in the same
way that they would need to "install" Java, Python, etc).  This is a
slightly greater burden on the end user who is building the binding, and
one which I would wish to avoid. However the fact of having a dependency
for a certain part of the build does not seem out of line with the rest of
the process which a user would have to go through.  If the Java API were
not present then the JNI binding would simply be skipped in the build.
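
As a hedged sketch of that "skipped in the build" behaviour (the jar name,
search path, and subdirectory layout below are assumptions for illustration,
not the actual proton CMake code), the optional binding could look like:

```cmake
# Hypothetical sketch only: names and paths are illustrative.
include(UseJava)

# Look for a previously built proton-api jar; the PATHS entry is an assumption
# about where a Maven build would have left it.
find_jar(PROTON_API_JAR NAMES proton-api
         PATHS ${CMAKE_SOURCE_DIR}/proton-api/target)

if (PROTON_API_JAR)
  # Jar found: build the JNI binding against it.
  add_subdirectory(bindings/java)
else ()
  message(STATUS "proton-api jar not found: skipping the JNI binding")
endif ()
```

With this shape a plain `cmake` plus `make` still succeeds on a machine with
no Java toolchain at all; the binding only joins the build when the jar is
found.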

As above, I personally prefer having the API code available in the source
tarball, but if the only way to do this would be to have two copies of the
source code checked in (even if there were some scripts to try to keep the
copies in sync), I would choose to omit the source from the tar and go with
the strict dependency.

-- Rob


> --Rafael
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Tue, Jan 22, 2013 at 4:22 AM, Rob Godfrey <ro...@gmail.com> wrote:

> On 21 January 2013 18:05, Rafael Schloming <rh...@alum.mit.edu> wrote:
>
> > On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey <rob.j.godfrey@gmail.com> wrote:
> >
> > > Ummm... it's a dependency... you're familiar with those, yeah?
> > >
> > > The same way that the Qpid JMS clients depend on a JMS API jar, for
> which
> > > the source is readily available from another source. The JNI binding
> > would
> > > build if the dependency was installed.  The same way I believe the SSL
> > code
> > > in the core of proton-c builds if the dependency for it is installed.
> > >
> >
> > That's not really a proper analogy. Again the JMS interfaces are defined
> > outside of qpid. We don't release them, and we depend only on a well
> > defined version of them, we don't share a release cycle with them. If the
> > JMS API was something that we developed/defined right alongside the impl
> > and was part of the same release process, we would certainly not be
> allowed
> > to release without the source.
> >
> >
> This "releasing without the source" is a complete red herring and you know
> it.  The source is released in whichever scheme we settle upon.
>
> If you want an example of dependencies within the qpid project, how did the
> AMQP 1.0 work on the C++ broker get released for 0.20?  Did all the proton
> source get released with the C++ Broker / client?  In the future are you
> expecting every part of the Qpid project which depends on proton to include
> its full source?  If yes then how is the source tree going to work - is
> everything to be a subdirectory of proton-c?
>

Again that's not really the same. If the Java API were on a separate
(staggered) release cycle and the dependency was on a specific version,
then that would be the same, but for what we're discussing, it really
isn't. Proton and the cpp broker live under different trunks and
branch/release separately; as far as I know this is not what you're
proposing for the Java API, which is to live under the same trunk and
branch/release together.


>
> I agree that having the source for the version of the Java API included in
> the source release bundle is advantageous. But if the collective decision
> is that we have taken a religious position that the source tarballs can
> only be svn exports of subdirectories of our source tree, then my
> preference would be to use separated dependencies over duplication in the
> repository.  Personally I would think that having a more flexible policy on
> constructing the release source tarballs would make a lot more sense.
>

You can call it religious if you like, but I don't think there is anything
invalid about wanting to keep a simple mapping between release artifacts
and proton developer environment. In the past we have had quite direct
experience of exactly this factor contributing to very poor out of the box
experience for users. Correct me if I'm wrong, but I believe you yourself
have actually advocated (or at least agreed with) this position in the past.

That said, I don't think I'm asking for us to be entirely inflexible in
that regard. There really are two opposing concerns here, one being the
user experience for our release artifacts, and the other being the
convenience of the development process for proton developers. All I'm
asking is that we recognize that there is a real tradeoff and be willing to
explore options that might preserve user experience albeit at a hopefully
minor cost to developer convenience. For any other aspect of software
engineering this would be a no-brainer, you start from the user
requirements/user experience and work your way backwards to the simplest
solution that achieves this, however this proposal and the related
requirements JIRA make zero mention of any *user* requirements, merely
developer requirements. This may be ok for Java where all your users will
get stuff via jars and the source tarball is mostly a formality, but for C
the situation is different.

--Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rob Godfrey <ro...@gmail.com>.
On 21 January 2013 18:05, Rafael Schloming <rh...@alum.mit.edu> wrote:

> On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey <rob.j.godfrey@gmail.com
> >wrote:
>
> > Ummm... it's a dependency... you're familiar with those, yeah?
> >
> > The same way that the Qpid JMS clients depend on a JMS API jar, for which
> > the source is readily available from another source. The JNI binding
> would
> > build if the dependency was installed.  The same way I believe the SSL
> code
> > in the core of proton-c builds if the dependency for it is installed.
> >
>
> That's not really a proper analogy. Again the JMS interfaces are defined
> outside of qpid. We don't release them, and we depend only on a well
> defined version of them, we don't share a release cycle with them. If the
> JMS API was something that we developed/defined right alongside the impl
> and was part of the same release process, we would certainly not be allowed
> to release without the source.
>
>
This "releasing without the source" is a complete red herring and you know
it.  The source is released in whichever scheme we settle upon.

If you want an example of dependencies within the qpid project, how did the
AMQP 1.0 work on the C++ broker get released for 0.20?  Did all the proton
source get released with the C++ Broker / client?  In the future are you
expecting every part of the Qpid project which depends on proton to include
its full source?  If yes then how is the source tree going to work - is
everything to be a subdirectory of proton-c?

I agree that having the source for the version of the Java API included in
the source release bundle is advantageous. But if the collective decision
is that we have taken a religious position that the source tarballs can
only be svn exports of subdirectories of our source tree, then my
preference would be to use separated dependencies over duplication in the
repository.  Personally I would think that having a more flexible policy on
constructing the release source tarballs would make a lot more sense.

-- Rob


> --Rafael
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Mon, Jan 21, 2013 at 9:33 AM, Rob Godfrey <ro...@gmail.com>wrote:

> Ummm... it's a dependency... you're familiar with those, yeah?
>
> The same way that the Qpid JMS clients depend on a JMS API jar, for which
> the source is readily available from another source. The JNI binding would
> build if the dependency was installed.  The same way I believe the SSL code
> in the core of proton-c builds if the dependency for it is installed.
>

That's not really a proper analogy. Again the JMS interfaces are defined
outside of qpid. We don't release them, and we depend only on a well
defined version of them, we don't share a release cycle with them. If the
JMS API was something that we developed/defined right alongside the impl
and was part of the same release process, we would certainly not be allowed
to release without the source.

--Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rob Godfrey <ro...@gmail.com>.
On 21 January 2013 15:11, Rafael Schloming <rh...@alum.mit.edu> wrote:

> On Mon, Jan 21, 2013 at 7:10 AM, Rob Godfrey <rob.j.godfrey@gmail.com
> >wrote:
>
> > >
> >
> > > > This results in something that is quite awkward for the C build. For
> > one
> > > > thing I'm not sure an svn export of the proton-c directory would be
> > > > considered releasable under this scheme as it would include the java
> > > > binding, but not the source code necessary to build it, and apache
> > policy
> > > > requires releases to include full source code. Regardless it would no
> > > > longer be a useful/sensible artifact to end-users since they couldn't
> > > > actually build the java binding.
> > > >
> > >
> > >
> > This seems a slightly odd position to take. The artefact doesn't include
> > the entire source to python, ruby, openssl, etc.  If the dependencies for
> > these are not present then the relevant parts of the tree are not built.
> >  The same is true in this proposal with respect to the java binding...
> > there is a dependency on the Java API being installed in order to build
> the
> > JNI bindings within the C build.
> >
>
> The problem isn't with not including the source code to external
> dependencies (i.e. Java in your analogy), the problem is with the fact that
> all of the Java binding (the API and the JNI implementation of it) is
> developed within the qpid project, and the artifact would not include all
> of it. The apache release policy is quite clear on this front:
>
> "The Apache Software Foundation produces open source software. All releases
> are in the form of the source materials needed to make changes to the
> software being released. In some cases, binary/bytecode packages are also
> produced as a convenience to users that might not have the appropriate
> tools to build a compiled version of the source. In all such cases, the
> binary/bytecode package must have the same version number as the source
> release and may only add binary/bytecode files that are the result of
> compiling that version of the source code release."
>
> Producing an artifact that has source code for impls, but not source for
> the interfaces would quite clearly constitute an artifact that didn't
> include all the source materials needed to make changes.
>
>

Ummm... it's a dependency... you're familiar with those, yeah?

The same way that the Qpid JMS clients depend on a JMS API jar, for which
the source is readily available from another source. The JNI binding would
build if the dependency was installed.  The same way I believe the SSL code
in the core of proton-c builds if the dependency for it is installed.


> I must admit I remain bemused by the idea that trying to maintain two
> > copies of the Java API in the source tree makes any kind of sense.
> >
> > I think we are contorting ourselves and adding potentially huge
> > complication to our build/development process in order to try to satisfy
> a
> > number of somewhat arbitrary "requirements" that are being imposed on the
> > directory structure.
> >
>
> You're arguing against a straw man here. Nobody has proposed "copying" the
> API the way you keep describing it. The original solution implemented on
> the JNI branch was to have the API in two places at once via svn externals.
>

This isn't in "two places"... it's very clearly in "one place" in the
repository, with another place "linking" to it, in a rather inelegant
manner.

Having said that, the externals solution is not a particularly pleasant
solution and was only put in place because of the "requirement" to be able
to check out from a subdirectory of proton.  Having further considered the
matter, my feeling is that it is better to re-examine the need to be able
to check out just a single subdirectory of the proton tree.


> This however does violate one of the "fundamental tenets of source
> control" as you put it since it fundamentally loses track of what version
> of the API source goes with what version of the implementation source.
>


Umm... no it doesn't.  Again... I'm not pushing for svn:externals but if
you insist on the "each subdirectory must be able to be
independently checked out" then I think svn:externals is a better solution
than the copy.  The original svn:externals proposal makes it very clear
that the version of the Java API code that the JNI binding works with must
be the same as that which the Java impl works with.  The externals is to a
sibling directory within the same project.  So long as you consider the
proton project as a whole then it is never unclear as to which version you
should be using.  Only in a world where the Java and C versions are not
progressed with a common API does this become a problem.  If you do not
believe the two should have a common API then I think we need to have a
wider discussion (since we've been working pretty hard until now to keep
the APIs in sync).
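[Editorial sketch] For readers unfamiliar with the mechanism being debated, an svn:externals link to a sibling directory in the same repository might look something like the following. The paths are illustrative only; they are not the actual layout used on the JNI branch.

```shell
# Hypothetical sketch: in a proton working copy, make the C tree's java
# binding directory pull in the shared Java API via an external that
# points back into the same repository (repo-root-relative ^/ syntax,
# available since Subversion 1.5). All paths are invented for the example.
svn propset svn:externals \
    '^/proton/trunk/proton-j/proton-api proton-api' \
    proton-c/bindings/java

# Inspect the resulting definition.
svn propget svn:externals proton-c/bindings/java
```

Because the external is same-repository and relative, a checkout of trunk always pairs the binding with the API revision from that same checkout; the disagreement in this thread is about what happens when only a subdirectory is checked out or exported.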


> Branching the API into two places and putting the necessary scripts in
> place to enforce that the C version of that branch is a read only copy of
> the Java version is simply another way to achieve exactly what is currently
>

It's not a branch though... not in the way in which the term is normally
recognised. There is no concurrent development going on here.  There should
be only one API.  In the same way that for a given release you don't have a
separate set of .h files for the Ruby binding and for the Python binding.


> proposed on the JNI branch without the problems associated with svn
> externals. I'm honestly somewhat befuddled why there is such vehement
> resistance to even exploring it.
>
>
> > Personally I don't perceive there to be an actual need to allow checking
> > out of only part of the Proton tree.  Indeed I would wish to strongly
> > discourage the sort of siloed attitude that checking out only Java or only
> > C would imply.
> >
> > Moreover, while I see that it is advantageous to be able to release
> > "source" packages directly as svn exports from points in the tree... I
> > don't find this so compelling that I would break fundamental tenets of how
> > source control is expected to be used.
> >
> > Personally, given that our current plan is to release all of Proton at
> the
> > same time, I'm not sure what would be wrong with simply shipping a single
> > source tarball of the entire directory structure.  People who wish to
> build
> > from source would thus be able to build whatever they so desired.
> >
>
> We've been quite clear from the beginning of the project on why exactly we
> want to produce C and Java release tarballs that are simple, familiar, and
> easy to consume within their respective development communities.
>
>
These can be produced, per Robbie's comment... Having these be produced in
a way that is slightly more involved than a simple svn export seems like a
reasonable step to me if we are trading this off against having a
development process which requires running scripts and merging every time
we make a change to the Java API.

So... how do we move forward? Other than myself, Rafi, Keith, Robbie and
Phil, is there anyone else who wants to volunteer an opinion?

-- Rob

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Mon, Jan 21, 2013 at 7:10 AM, Rob Godfrey <ro...@gmail.com>wrote:

> >
>
> > > This results in something that is quite awkward for the C build. For
> one
> > > thing I'm not sure an svn export of the proton-c directory would be
> > > considered releasable under this scheme as it would include the java
> > > binding, but not the source code necessary to build it, and apache
> policy
> > > requires releases to include full source code. Regardless it would no
> > > longer be a useful/sensible artifact to end-users since they couldn't
> > > actually build the java binding.
> > >
> >
> >
> This seems a slightly odd position to take. The artefact doesn't include
> the entire source to python, ruby, openssl, etc.  If the dependencies for
> these are not present then the relevant parts of the tree are not built.
>  The same is true in this proposal with respect to the java binding...
> there is a dependency on the Java API being installed in order to build the
> JNI bindings within the C build.
>

The problem isn't with not including the source code to external
dependencies (i.e. Java in your analogy), the problem is with the fact that
all of the Java binding (the API and the JNI implementation of it) is
developed within the qpid project, and the artifact would not include all
of it. The apache release policy is quite clear on this front:

"The Apache Software Foundation produces open source software. All releases
are in the form of the source materials needed to make changes to the
software being released. In some cases, binary/bytecode packages are also
produced as a convenience to users that might not have the appropriate
tools to build a compiled version of the source. In all such cases, the
binary/bytecode package must have the same version number as the source
release and may only add binary/bytecode files that are the result of
compiling that version of the source code release."

Producing an artifact that has source code for impls, but not source for
the interfaces would quite clearly constitute an artifact that didn't
include all the source materials needed to make changes.

I must admit I remain bemused by the idea that trying to maintain two
> copies of the Java API in the source tree makes any kind of sense.
>
> I think we are contorting ourselves and adding potentially huge
> complication to our build/development process in order to try to satisfy a
> number of somewhat arbitrary "requirements" that are being imposed on the
> directory structure.
>

You're arguing against a straw man here. Nobody has proposed "copying" the
API the way you keep describing it. The original solution implemented on
the JNI branch was to have the API in two places at once via svn externals.
This however does violate one of the "fundamental tenets of source
control" as you put it since it fundamentally loses track of what version
of the API source goes with what version of the implementation source.
Branching the API into two places and putting the necessary scripts in
place to enforce that the C version of that branch is a read only copy of
the Java version is simply another way to achieve exactly what is currently
proposed on the JNI branch without the problems associated with svn
externals. I'm honestly somewhat befuddled why there is such vehement
resistance to even exploring it.


> Personally I don't perceive there to be an actual need to allow checking
> out of only part of the Proton tree.  Indeed I would wish to strongly
> discourage the sort of siloed attitude that checking out only Java or only
> C would imply.
>
> Moreover, while I see that it is advantageous to be able to release
> "source" packages directly as svn exports from points in the tree... I
> don't find this so compelling that I would break fundamental tenets of how
> source control is expected to be used.
>
> Personally, given that our current plan is to release all of Proton at the
> same time, I'm not sure what would be wrong with simply shipping a single
> source tarball of the entire directory structure.  People who wish to build
> from source would thus be able to build whatever they so desired.
>

We've been quite clear from the beginning of the project on why exactly we
want to produce C and Java release tarballs that are simple, familiar, and
easy to consume within their respective development communities.

--Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Tue, Jan 22, 2013 at 6:13 AM, Robbie Gemmell <ro...@gmail.com>wrote:

> Most of the times I download and compile a release (something even us Java
> developers still do on occasion :P), the first thing that is required is
> changing into a directory, which is true with the proton-c releases.
> Changing down 2 directories doesn't sound overly taxing to me, and proton-c
> seems like a fairly obvious directory to change into if I want to compile
> the C version of proton and don't find a makefile/src dir in the initial
> version directory.
>
> That said, looking at the 0.3 proton-c source tar I notice that it is
> already more complex than a simple export of the proton-c subdir in the
> repo, since the contents of the two don't match directly. As there is
> already some processing going on somewhere (release.sh), is it really that
> complex to export two extra directories from the top level into the release
> tar instead of the one extra that is being done already?
>

The only processing that goes on now for the C tarball is to include the
tests, and this has no impact on the behaviour of the build system. Doing
this with the java api directory would be more complex as I suspect there
would need to be two different paths through the build system depending on
whether you're building from SVN or building from a release artifact.
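[Editorial sketch] To make the tradeoff concrete, the kind of export-and-repackage processing being discussed might look roughly like this. The URL, version, and directory names are invented for illustration; the real release.sh may differ.

```shell
#!/bin/sh
# Hypothetical release sketch: export the C subtree, add the shared
# tests (roughly what happens today), and optionally the Java API
# directory (the extra step under debate), then tar the result.
# All names and the URL are illustrative.
VERSION="0.x"
SVN_URL="https://svn.apache.org/repos/asf/qpid/proton/trunk"

svn export "$SVN_URL/proton-c" "qpid-proton-c-$VERSION"
svn export "$SVN_URL/tests"    "qpid-proton-c-$VERSION/tests"
# The contested extra export: ship the Java API source so the JNI
# binding in the tarball is buildable.
svn export "$SVN_URL/proton-j/proton-api" \
    "qpid-proton-c-$VERSION/proton-api"

tar -czf "qpid-proton-c-$VERSION.tar.gz" "qpid-proton-c-$VERSION"
```

The build system would then need to locate the API either at this repackaged path or at its normal in-tree location, which is the "two different paths" concern raised above.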

Even this little level of processing isn't great though, and there used to
be even more than this. When we had the examples at the top level they were
also copied into the C source tarball. I eliminated this processing and
moved them in there directly after getting bug reports on the README. It
turns out the README was written/modified to reflect the svn layout and as
this was different from the release artifact layout the READMEs ended up
referencing locations that didn't exist.

--Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Robbie Gemmell <ro...@gmail.com>.
Most of the times I download and compile a release (something even us Java
developers still do on occasion :P), the first thing that is required is
changing into a directory, which is true with the proton-c releases.
Changing down 2 directories doesn't sound overly taxing to me, and proton-c
seems like a fairly obvious directory to change into if I want to compile
the C version of proton and don't find a makefile/src dir in the initial
version directory.

That said, looking at the 0.3 proton-c source tar I notice that it is
already more complex than a simple export of the proton-c subdir in the
repo, since the contents of the two don't match directly. As there is
already some processing going on somewhere (release.sh), is it really that
complex to export two extra directories from the top level into the release
tar instead of the one extra that is being done already?

Robbie

On 21 January 2013 17:55, Rafael Schloming <rh...@alum.mit.edu> wrote:

> On Mon, Jan 21, 2013 at 8:03 AM, Robbie Gemmell <robbie.gemmell@gmail.com
> >wrote:
>
> > I would echo some of Rob's points (since he beat me to saying them myself
> :)
> > ) and add some of my own.
> >
> > I also don't see a need to check out proton-c or proton-j in isolation; if
> > the tests for both of them sit a level up then that's what people should be
> > grabbing in my mind.
> >
> > Duplicating code sounds fishy to start with, but doing so given the
> > apparent real need to check out the common parent directory seems easily
> > questionable.
> >
> > One possible adjustment I might suggest (but don't personally see the need
> > for) would be that if the compile requirement for maven to generate the
> > proton-api jar used by the C tree to build the JNI bindings is considered
> > unworkable for some, if its just a simple jar it could also be built with
> > CMake for the C build, leaving Maven to do it for the Java build. I'm not
> > sure how such developers would be planning to run the common test suite
> > that still needed Maven though.
> >
> > If we are releasing the C and Java components at the same time, and the
> > tests sit at the top level, why does there need to be two source tars? We
> > had this discussion with regards to the various Qpid clients and brokers
> > some time ago and the agreed (but never fully implemented, we still have
> > subset source tars) outcome was that we should do away with the
> > component-specific source tars and only have the one main source tar
> which
> > is actually 'the release' in terms of the project, with e.g Java binaries
> > being separate complementary things.
> >
>
> I'm not sure I can answer this in a way that will be satisfying to you as
> the answer is based a lot on C development standards where source tarballs
> play a much more active role as a means to distribute software than in the
> Java world where everything is distributed via binaries. But I'll try by
> saying that having a C project where you can't simply untar it and do one
> of <src>/configure && make or cmake <src> && make is a bit like having a
> Java project that doesn't use maven or ant. I'm aware we could have cmake
> at the top level alongside a pom.xml, and some third entry script that
> invokes both for system tests and the like, and while I would encourage
> that for proton developers, it is imposing a very complex set of entry
> points onto our users. I can see that this might impact Java users less as
> they may care less about src distros, but it is far from an ideal release
> artifact for C users.
>
> As for producing a C tarball by post processing a large source tarball,
> it's simply something I would prefer to avoid given that there are
> alternatives as having a complex mapping from source control to release
> artifact is in my experience quite bad for the health of a project. It
> means developers are more detached from what their users experience "out of
> the box".
>
> --Rafael
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Mon, Jan 21, 2013 at 8:03 AM, Robbie Gemmell <ro...@gmail.com>wrote:

> I would echo some of Rob's points (since he beat me to saying them myself :)
> ) and add some of my own.
>
> I also don't see a need to check out proton-c or proton-j in isolation; if
> the tests for both of them sit a level up then that's what people should be
> grabbing in my mind.
>
> Duplicating code sounds fishy to start with, but doing so given the
> apparent real need to check out the common parent directory seems easily
> questionable.
>
> One possible adjustment I might suggest (but don't personally see the need
> for) would be that if the compile requirement for maven to generate the
> proton-api jar used by the C tree to build the JNI bindings is considered
> unworkable for some, if its just a simple jar it could also be built with
> CMake for the C build, leaving Maven to do it for the Java build. I'm not
> sure how such developers would be planning to run the common test suite
> that still needed Maven though.
>
> If we are releasing the C and Java components at the same time, and the
> tests sit at the top level, why does there need to be two source tars? We
> had this discussion with regards to the various Qpid clients and brokers
> some time ago and the agreed (but never fully implemented, we still have
> subset source tars) outcome was that we should do away with the
> component-specific source tars and only have the one main source tar which
> is actually 'the release' in terms of the project, with e.g Java binaries
> being separate complementary things.
>

I'm not sure I can answer this in a way that will be satisfying to you as
the answer is based a lot on C development standards where source tarballs
play a much more active role as a means to distribute software than in the
Java world where everything is distributed via binaries. But I'll try by
saying that having a C project where you can't simply untar it and do one
of <src>/configure && make or cmake <src> && make is a bit like having a
Java project that doesn't use maven or ant. I'm aware we could have cmake
at the top level alongside a pom.xml, and some third entry script that
invokes both for system tests and the like, and while I would encourage
that for proton developers, it is imposing a very complex set of entry
points onto our users. I can see that this might impact Java users less as
they may care less about src distros, but it is far from an ideal release
artifact for C users.

As for producing a C tarball by post processing a large source tarball,
it's simply something I would prefer to avoid given that there are
alternatives as having a complex mapping from source control to release
artifact is in my experience quite bad for the health of a project. It
means developers are more detached from what their users experience "out of
the box".

--Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Robbie Gemmell <ro...@gmail.com>.
I would echo some of Rob's points (since he beat me to saying them myself :)
) and add some of my own.

I also don't see a need to check out proton-c or proton-j in isolation; if
the tests for both of them sit a level up then that's what people should be
grabbing in my mind.

Duplicating code sounds fishy to start with, but doing so given the
apparent real need to check out the common parent directory seems easily
questionable.

One possible adjustment I might suggest (but don't personally see the need
for) would be that if the compile requirement for maven to generate the
proton-api jar used by the C tree to build the JNI bindings is considered
unworkable for some, if its just a simple jar it could also be built with
CMake for the C build, leaving Maven to do it for the Java build. I'm not
sure how such developers would be planning to run the common test suite
that still needed Maven though.

If we are releasing the C and Java components at the same time, and the
tests sit at the top level, why does there need to be two source tars? We
had this discussion with regards to the various Qpid clients and brokers
some time ago and the agreed (but never fully implemented, we still have
subset source tars) outcome was that we should do away with the
component-specific source tars and only have the one main source tar which
is actually 'the release' in terms of the project, with e.g Java binaries
being separate complementary things.

If we did just have a single source artifact to constitute the full
release and either we or some third party then wanted to build a c-only
source artifact for some reason, that could of course still be done by
simply processing the contents of the repository or single 'the release'
tar appropriately. E.g. the 'individual component' source releases in Qpid
aren't simple svn exports; they contain different parts of the tree bundled
into a tar, which is I guess ok because they are not actually 'the release'.

Robbie

On 21 January 2013 12:10, Rob Godfrey <ro...@gmail.com> wrote:

> >
>
> > > This results in something that is quite awkward for the C build. For
> one
> > > thing I'm not sure an svn export of the proton-c directory would be
> > > considered releasable under this scheme as it would include the java
> > > binding, but not the source code necessary to build it, and apache
> policy
> > > requires releases to include full source code. Regardless it would no
> > > longer be a useful/sensible artifact to end-users since they couldn't
> > > actually build the java binding.
> > >
> >
> >
> This seems a slightly odd position to take. The artefact doesn't include
> the entire source to python, ruby, openssl, etc.  If the dependencies for
> these are not present then the relevant parts of the tree are not built.
>  The same is true in this proposal with respect to the java binding...
> there is a dependency on the Java API being installed in order to build the
> JNI bindings within the C build.
>
>
> I must admit I remain bemused by the idea that trying to maintain two
> copies of the Java API in the source tree makes any kind of sense.
>
> I think we are contorting ourselves and adding potentially huge
> complication to our build/development process in order to try to satisfy a
> number of somewhat arbitrary "requirements" that are being imposed on the
> directory structure.
>
> Personally I don't perceive there to be an actual need to allow checking
> out of only part of the Proton tree.  Indeed I would wish to strongly
> discourage the sort of siloed attitude that checking out only Java or only
> C would imply.
>
> Moreover, while I see that it is advantageous to be able to release
> "source" packages directly as svn exports from points in the tree... I
> don't find this so compelling that I would break fundamental tenets of how
> source control is expected to be used.
>
> Personally, given that our current plan is to release all of Proton at the
> same time, I'm not sure what would be wrong with simply shipping a single
> source tarball of the entire directory structure.  People who wish to build
> from source would thus be able to build whatever they so desired.
>
> -- Rob
>

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rob Godfrey <ro...@gmail.com>.
>

> > This results in something that is quite awkward for the C build. For one
> > thing I'm not sure an svn export of the proton-c directory would be
> > considered releasable under this scheme as it would include the java
> > binding, but not the source code necessary to build it, and apache policy
> > requires releases to include full source code. Regardless it would no
> > longer be a useful/sensible artifact to end-users since they couldn't
> > actually build the java binding.
> >
This seems a slightly odd position to take. The artefact doesn't include
the entire source to python, ruby, openssl, etc.  If the dependencies for
these are not present then the relevant parts of the tree are not built.
 The same is true in this proposal with respect to the java binding...
there is a dependency on the Java API being installed in order to build the
JNI bindings within the C build.


I must admit I remain bemused by the idea that trying to maintain two
copies of the Java API in the source tree makes any kind of sense.

I think we are contorting ourselves and adding potentially huge
complication to our build/development process in order to try to satisfy a
number of somewhat arbitrary "requirements" that are being imposed on the
directory structure.

Personally I don't perceive there to be an actual need to allow checking
out only part of the Proton tree.  Indeed I would wish to strongly
discourage the sort of siloed attitude that checking out only Java or only
C would imply.

Moreover, while I see that it is advantageous to be able to release
"source" packages directly as svn exports from points in the tree... I
don't find this so compelling that I would break fundamental tenets of how
source control is expected to be used.

Personally, given that our current plan is to release all of Proton at the
same time, I'm not sure what would be wrong with simply shipping a single
source tarball of the entire directory structure.  People who wish to build
from source would thus be able to build whatever they so desired.

-- Rob

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Sat, Jan 19, 2013 at 5:48 PM, Phil Harvey <ph...@philharveyonline.com> wrote:

> I worked with Keith on this proposal so I should state up front that I'm
> not coming to this debate from a neutral standpoint.
>
> Hopefully we can find a solution that is acceptable to everyone.  To this
> end, we listed our understanding of the requirements on
> https://issues.apache.org/jira/browse/PROTON-194.  I'm hoping that this
> discussion will allow us to clarify our requirements, such that the best
> technical solution naturally follows.
>
> I've added some comments in-line below...
>
> On 18 January 2013 19:29, Rafael Schloming <rh...@alum.mit.edu> wrote:
>
> > On Fri, Jan 18, 2013 at 11:17 AM, Keith W <ke...@gmail.com> wrote:
> >
> > > We are currently in the process of implementing the proton-jni binding
> > > for the proton-c library that implements the Java Proton-API, allowing
> > > Java users to choose the C-based proton stack if they wish. This work
> > > is being performed on the jni-branch under PROTON-192 (for the JNI
> > > work) and PROTON-194 (for the build system changes).
> > >
> > > Currently, Proton has two independent build systems: one for the
> > > proton-c and its ruby/perl/python/php bindings (based on Cmake/Make),
> > > and a second, separate build system for proton-j (based on Maven).  As
> > > proton-jni will cut across both technology areas, non-trivial changes
> > > are required to both build systems.
> > >
> > > The nub of the problem is the sharing of the Java Proton-API between
> > > both proton-c and proton-j trees. Solutions based on svn-external and
> > > a simple tree copy have been considered and discussed at length on
> > > conference calls.  We have identified drawbacks in both solutions.
> > >
> >
> > To be honest I don't think we've sufficiently explored the "copy" option.
> > While it's true there were a lot of hypothetical issues thrown around on the
> > calls, many of them have quite reasonable solutions that may well be less
> > work than the alternatives.
> >
> > In my experience, maintaining two copies of any code is usually a bad
> thing.  However, I try to be open minded so I agree that it's worth
> exploring this option.  I'd be interested to hear your opinion on (a) the
> scenarios when it would be acceptable for these two copies to diverge and
> (b) the mechanism you're envisaging for achieving convergence.  I imagine
> there are both technical and "process" dimensions to making this work.
>
>
This is a good question; sorry I missed it in the flurry of other posts.
To answer (a), I think on trunk these two things should probably never (or
very rarely at least) diverge. On very specific feature development
branches I think we've seen it can be convenient to let them diverge a
little, but as the whole point of a feature branch is to be able to break
things, I think that's neither here nor there; either way I would consider
non-matching APIs to be a broken state of things.

The mechanism I'd propose would be to add a check to the C build system
that would cause a build failure if the API as viewed from the JNI binding
was any different from the API as it exists in the Java source tree. I
believe for most developer scenarios this would achieve *almost* the same
thing that svn externals does without the inherent drawbacks. I'll detail
the scenarios I've thought of below:

1. Changing the Java API from the Java tree
  If the Java developer changes only the API, the C build will break due to
a check failure.
  If the Java developer changes both to avoid the check, the C build will
break due to a compile failure.

2. Changing the Java API from the C tree
  If the C developer forgets to change the Java API, then the C build will
break due to check failure.
  If the C developer changes both to avoid the check, the Java build will
break due to compile failure.

I believe the above scenarios enforce pretty much the same thing svn
externals does. The only added step is the need to copy changes to both
places when you are being a good citizen and bringing forward both at the
same time. I would hope that this would become less and less of an issue as
the API should really stabilize and not change, however if that is an issue
for the near term I would propose adding a sync script on the C side to
pull the changes over to a local checkout. This would result in the
following process for someone changing both simultaneously:

  - Make your changes and test them on the Java build. When this works,
transition over to the C build and see what the impact of the changes has
been. The first breakage will be the check failure and running the sync
script will fix this. You can then proceed to see what other build failures
there are and how to fix them.

I would hope overall this would minimize the impact of the syncing as I
would expect API changes to be primarily driven from the Java side. Either
way, I think the only development process difference between this setup and
the svn externals one is that for the C build you fix the check breakage by
running the sync script before proceeding to fix remaining build failures.
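
A rough sketch of this check-and-sync mechanism, written as shell functions;
the tree layout, function names, and error text here are assumptions for
illustration only, not the actual build change:

```shell
#!/bin/sh
# Hypothetical sketch of the proposed check and sync steps.  The real
# versions would live in the C build (cmake) with the true tree paths.

# check_api_sync CANONICAL COPY: succeed only if the two API trees match.
check_api_sync() {
    if ! diff -r "$1" "$2" > /dev/null 2>&1; then
        echo "ERROR: Java Proton-API copies have diverged; run sync_api" >&2
        return 1
    fi
}

# sync_api CANONICAL COPY: overwrite the C-tree copy with the canonical one.
sync_api() {
    rm -rf "$2"
    mkdir -p "$2"
    cp -R "$1/." "$2"
}
```

With the check wired into the C build, scenario 1 above is just the first
function returning non-zero, and the sync script is the second.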

--Rafael

Re: Changing the Proton build system to accommodate jni bindings

Posted by Phil Harvey <ph...@philharveyonline.com>.
I worked with Keith on this proposal so I should state up front that I'm
not coming to this debate from a neutral standpoint.

Hopefully we can find a solution that is acceptable to everyone.  To this
end, we listed our understanding of the requirements on
https://issues.apache.org/jira/browse/PROTON-194.  I'm hoping that this
discussion will allow us to clarify our requirements, such that the best
technical solution naturally follows.

I've added some comments in-line below...

On 18 January 2013 19:29, Rafael Schloming <rh...@alum.mit.edu> wrote:

> On Fri, Jan 18, 2013 at 11:17 AM, Keith W <ke...@gmail.com> wrote:
>
> > We are currently in the process of implementing the proton-jni binding
> > for the proton-c library that implements the Java Proton-API, allowing
> > Java users to choose the C-based proton stack if they wish. This work
> > is being performed on the jni-branch under PROTON-192 (for the JNI
> > work) and PROTON-194 (for the build system changes).
> >
> > Currently, Proton has two independent build systems: one for the
> > proton-c and its ruby/perl/python/php bindings (based on Cmake/Make),
> > and a second, separate build system for proton-j (based on Maven).  As
> > proton-jni will cut across both technology areas, non-trivial changes
> > are required to both build systems.
> >
> > The nub of the problem is the sharing of the Java Proton-API between
> > both proton-c and proton-j trees. Solutions based on svn-external and
> > a simple tree copy have been considered and discussed at length on
> > conference calls.  We have identified drawbacks in both solutions.
> >
>
> To be honest I don't think we've sufficiently explored the "copy" option.
> While it's true there were a lot of hypothetical issues thrown around on the
> calls, many of them have quite reasonable solutions that may well be less
> work than the alternatives.
>
In my experience, maintaining two copies of any code is usually a bad
thing.  However, I try to be open minded so I agree that it's worth
exploring this option.  I'd be interested to hear your opinion on (a) the
scenarios when it would be acceptable for these two copies to diverge and
(b) the mechanism you're envisaging for achieving convergence.  I imagine
there are both technical and "process" dimensions to making this work.

>
> > This email proposes another solution. The hope is that this proposal
> > can be developed on list into a solution that is acceptable to all.
> >
> > Proposal:
> >
> > Move the Java Proton-API to the top level so that it can be shared
> > simply and conveniently by both proton-j and proton-c.
> >
> > * Maven builds the proton-api JAR to a well known location
> > * Cmake/make builds proton-c and all bindings including java. As the
> > building of the java binding requires the Java Proton API, it is
> > optional and only takes place if proton-api has been previously
> > created by Maven (or found by other means).
> > * Maven builds of proton-j
> > * Maven runs the system tests against either proton-c or proton-j. The
> > system tests are currently written in Python but are being augmented
> > with new ones written in Java.
> >
> > Proposed Directory Structure:
> >
> > proton
> > |-- release.sh/bat                  # Builds, tests and packages proton-c and proton-j
> > |-- pom.xml
> > |
> > |-- proton-api                      # Java Proton-API
> > |   |-- pom.xml                     # Will create proton-api.jar at a well known location in tree
> > |       `-- main
> > |
> > |-- proton-c                        # Proton-C and Proton-C bindings
> > |   |-- CMakeLists.txt
> > |   `-- bindings
> > |       |-- CMakeLists.txt
> > |       `-- java
> > |           |-- CMakeLists.txt
> > |           `-- jni
> > |               `-- CMakeLists.txt  # Creates proton-jni.jar using proton-api.jar from a well
> > |                                   # known location in tree, or skips if the jar cannot be found
> > |
> > |-- proton-j                        # Proton-J
> > |   |-- pom.xml                     # Creates proton-j.jar using proton-api.jar (found via Maven)
> > |   `-- src
> > |       `-- main
> > |
> > `-- tests                           # Python and Java based system tests that test
> >     |                               # Proton-C and Proton-J equally
> >     |-- pom.xml
> >     `-- src
> >         `-- test
> >
> > Use cases:
> >
> > usecase #1 - Proton-C Developer exclusively focused on Proton-C
> >
> > This developer may choose to check out the proton-c subtree.  The
> > build tool set remains unchanged from today, i.e. cmake and make.  By
> > default, all bindings will be built except for the java bindings (as
> > Cmake would fail to find the proton-api.jar).  For flexibility, we
> > would include an option to have cmake search another directory, allowing
> > proton-api.jar to be found in non-standard locations.
> >
> > usecase #2 - Proton-C Developer who wishes to run all system tests
> >
> > This developer must check out the complete proton tree.  The build
> > tool set now includes maven in order to build the proton-api and run
> > the complete system test suite.
> >
> > Typical commands used by this developer would be:
> > mvn -f proton-api/pom.xml install  # build and install proton-api to the well known location
> > <build proton-c>
> > mvn test                           # run all the system tests against proton-c
> >
> > usecase #3 - Proton-J Developer
> >
> > This developer must check out the complete proton tree.   The build
> > tool set is maven.
> >
> > mvn test -Pproton-j                # build all and run the system tests against proton-j
> >                                    # (the -P switch specifies the Maven profile to use)
> >
> > usecase #4 - Release manager
> >
> > The release manager must check out the complete proton tree.  The tool
> > set will be maven, cmake, make in order to build and test all
> > artefacts. A release script could be added to make this a single step.
> >
> > release.sh/bat
> >
>
> This results in something that is quite awkward for the C build. For one
> thing I'm not sure an svn export of the proton-c directory would be
> considered releasable under this scheme as it would include the java
> binding, but not the source code necessary to build it, and apache policy
> requires releases to include full source code. Regardless it would no
> longer be a useful/sensible artifact to end-users since they couldn't
> actually build the java binding.
>

This doesn't seem like an insurmountable problem, though admittedly I am
less familiar with the proton-c release process than you.  A couple of
suggestions follow.

1. I had imagined that a user wanting to use proton-c would simply treat
proton-api.jar as an optional build dependency (optional in the sense that
it's only required if you want to build the JNI binding).  I think it is
reasonable for them to retrieve this separately, e.g. by downloading it
from the Apache Maven repo, or building it if they wish.  Java developers
are unlikely to balk at installing libraries in their local Maven repo.

2. You seem to be implying that an svn export of the proton-c sub-folder
needs to be completely self-contained, i.e. it would necessarily contain
the proton-api code.  Is this really true?  I could be persuaded that a
proton-c *release* needs to be self-contained.  If this really is the case,
could we delay copying the proton-api code into the proton-c tree until
build time?  We wouldn't commit that duplicated code to SVN, but we would
include it in the released source tarball.

Of the two suggestions above I favour the first one, but its feasibility
depends on what our requirements really are.
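
If the second suggestion's "copy at build time" route were taken, the
packaging step might look roughly like this; the function name and all
paths are hypothetical, for illustration only:

```shell
#!/bin/sh
# Hypothetical sketch of producing a self-contained proton-c source tarball
# by copying the proton-api source in at packaging time, without ever
# committing the duplicate to SVN.

# bundle_proton_c API_SRC PROTON_C_TREE OUT_TARBALL
bundle_proton_c() {
    api_src="$1"; c_tree="$2"; out="$3"
    staging=$(mktemp -d)
    cp -R "$c_tree/." "$staging"              # start from the proton-c export
    mkdir -p "$staging/proton-api"
    cp -R "$api_src/." "$staging/proton-api"  # add the Java Proton-API source
    tar -cf "$out" -C "$staging" .
    rm -rf "$staging"
}
```

The released tarball would then carry the full source, satisfying the
Apache policy concern, while SVN keeps a single copy of the API.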


> It also has the drawback of adding a maven dependency to building the java
> binding. The way this is done on the jni branch now, where the java API code
> is built with cmake, actually has some significant advantages, since the java
> binding would still be buildable on platforms that don't have a
> sufficiently recent maven to build the Java bits (e.g. RHEL and Debian).
>

I believe it would be rare for a team to want to build the JNI bindings but
to be unable to access a reasonably recent version of Maven.  If proton-j
requires an unreasonably recent Maven version then that's a separate
problem that we need to address anyway.
Out of interest, what is the problem on RHEL and Debian?


> --Rafael
>

Phil

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rajith Attapattu <ra...@gmail.com>.
> On Fri, Jan 18, 2013 at 2:29 PM, Rafael Schloming <rh...@alum.mit.edu> wrote:
>> The nub of the problem is the sharing of the Java Proton-API between
>> both proton-c and proton-j trees. Solutions based on svn-external and
>> a simple tree copy have been considered and discussed at length on
>> conference calls.  We have identified drawbacks in both solutions.

It would be great if things were discussed on the mailing lists as
opposed to conference calls.
However I applaud Keith for posting a detailed summary of the proposal
on the list.

Rajith

Re: Changing the Proton build system to accommodate jni bindings

Posted by Rafael Schloming <rh...@alum.mit.edu>.
On Fri, Jan 18, 2013 at 11:17 AM, Keith W <ke...@gmail.com> wrote:

> We are currently in the process of implementing the proton-jni binding
> for the proton-c library that implements the Java Proton-API, allowing
> Java users to choose the C-based proton stack if they wish. This work
> is being performed on the jni-branch under PROTON-192 (for the JNI
> work) and PROTON-194 (for the build system changes).
>
> Currently, Proton has two independent build systems: one for the
> proton-c and its ruby/perl/python/php bindings (based on Cmake/Make),
> and a second, separate build system for proton-j (based on Maven).  As
> proton-jni will cut across both technology areas, non-trivial changes
> are required to both build systems.
>
> The nub of the problem is the sharing of the Java Proton-API between
> both proton-c and proton-j trees. Solutions based on svn-external and
> a simple tree copy have been considered and discussed at length on
> conference calls.  We have identified drawbacks in both solutions.
>

To be honest I don't think we've sufficiently explored the "copy" option.
While it's true there were a lot of hypothetical issues thrown around on the
calls, many of them have quite reasonable solutions that may well be less
work than the alternatives.


> This email proposes another solution. The hope is that this proposal
> can be developed on list into a solution that is acceptable to all.
>
> Proposal:
>
> Move the Java Proton-API to the top level so that it can be shared
> simply and conveniently by both proton-j and proton-c.
>
> * Maven builds the proton-api JAR to a well known location
> * Cmake/make builds proton-c and all bindings including java. As the
> building of the java binding requires the Java Proton API, it is
> optional and only takes place if proton-api has been previously
> created by Maven (or found by other means).
> * Maven builds of proton-j
> * Maven runs the system tests against either proton-c or proton-j. The
> system tests are currently written in Python but are being augmented
> with new ones written in Java.
>
> Proposed Directory Structure:
>
> proton
> |-- release.sh/bat                  # Builds, tests and packages proton-c and proton-j
> |-- pom.xml
> |
> |-- proton-api                      # Java Proton-API
> |   |-- pom.xml                     # Will create proton-api.jar at a well known location in tree
> |       `-- main
> |
> |-- proton-c                        # Proton-C and Proton-C bindings
> |   |-- CMakeLists.txt
> |   `-- bindings
> |       |-- CMakeLists.txt
> |       `-- java
> |           |-- CMakeLists.txt
> |           `-- jni
> |               `-- CMakeLists.txt  # Creates proton-jni.jar using proton-api.jar from a well
> |                                   # known location in tree, or skips if the jar cannot be found
> |
> |-- proton-j                        # Proton-J
> |   |-- pom.xml                     # Creates proton-j.jar using proton-api.jar (found via Maven)
> |   `-- src
> |       `-- main
> |
> `-- tests                           # Python and Java based system tests that test
>     |                               # Proton-C and Proton-J equally
>     |-- pom.xml
>     `-- src
>         `-- test
>
> Use cases:
>
> usecase #1 - Proton-C Developer exclusively focused on Proton-C
>
> This developer may choose to check out the proton-c subtree.  The
> build tool set remains unchanged from today, i.e. cmake and make.  By
> default, all bindings will be built except for the java bindings (as
> Cmake would fail to find the proton-api.jar).  For flexibility, we
> would include an option to have cmake search another directory, allowing
> proton-api.jar to be found in non-standard locations.
>
> usecase #2 - Proton-C Developer who wishes to run all system tests
>
> This developer must check out the complete proton tree.  The build
> tool set now includes maven in order to build the proton-api and run
> the complete system test suite.
>
> Typical commands used by this developer would be:
> mvn -f proton-api/pom.xml install  # build and install proton-api to the well known location
> <build proton-c>
> mvn test                           # run all the system tests against proton-c
>
> usecase #3 - Proton-J Developer
>
> This developer must check out the complete proton tree.   The build
> tool set is maven.
>
> mvn test -Pproton-j                # build all and run the system tests against proton-j
>                                    # (the -P switch specifies the Maven profile to use)
>
> usecase #4 - Release manager
>
> The release manager must check out the complete proton tree.  The tool
> set will be maven, cmake, make in order to build and test all
> artefacts. A release script could be added to make this a single step.
>
> release.sh/bat
>

This results in something that is quite awkward for the C build. For one
thing I'm not sure an svn export of the proton-c directory would be
considered releasable under this scheme as it would include the java
binding, but not the source code necessary to build it, and apache policy
requires releases to include full source code. Regardless it would no
longer be a useful/sensible artifact to end-users since they couldn't
actually build the java binding.

It also has the drawback of adding a maven dependency to building the java
binding. The way this is done on the jni branch now, where the java API code
is built with cmake, actually has some significant advantages, since the java
binding would still be buildable on platforms that don't have a
sufficiently recent maven to build the Java bits (e.g. RHEL and Debian).

--Rafael