Posted to dev@community.apache.org by Ross Gardler <rg...@apache.org> on 2010/04/22 16:24:12 UTC

Reviewing the evaluation process

I'd like to make a few comments about issues that have arisen during the 
evaluation process for GSoC. I'm going to give my opinion on each, 
please treat this as lazy consensus - do speak up if you wish to 
disagree or add more items:

Not enough visibility of the process
====================================

There will always be someone who doesn't read the stuff we send out. 
Where this is the case I don't think we should worry ourselves.

Sending to PMCs (including the incubator PMC) is sufficient to reach 
people. We don't want to send out to committers@ as the project as a 
whole needs to be behind taking on a GSoC student.

Reaching PPMCs is more problematic; I think we should continue to rely 
on incubator mentors taking the message to their projects if they feel 
it is appropriate.

PROPOSAL
--------
Make it explicit that incubator mentors should pass the message on to 
PPMCs if appropriate.

Marking experienced mentors up
==============================

I really don't like the idea of awarding up to 2 points for having been 
a successful mentor before. Firstly, it is error-prone (e.g. both 
Bertrand and Luciano have been mentors *and* admins, yet the admin this 
year was unaware of that). Secondly, just because someone has mentored 
a student in the past doesn't mean they will be better than another 
mentor. Finally, mentoring a failing student is, in many ways, more 
educational than mentoring a successful one.

We already have "Does the mentor show an understanding of how to mentor 
a student? (0-4 points)" - I'm more interested in whether the mentor 
knows what is expected. However, applying this score is difficult.

PROPOSAL
--------
Remove "Has the mentor had a successful student in the past? (0-2 
points)" from the admin rankings.

Add some docs to the ranking process about what admins are looking for 
with respect to "Does the mentor show an understanding of how to mentor 
a student? (0-4 points)". (Noirin's mail with the subject "Admin 
coordination" to alexei.fedotov on code-awards would be a good starting 
point.)

Original Ideas are good
=======================

Past experience has shown that if a student proposes their own idea and 
it is accepted the student is going to be strong.

PROPOSAL
--------

Add the following to the mentor ranking:

Is the project definition and idea originally the mentee's, the
mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
if collaborative, 0 if mentor's)

Re: Reviewing the evaluation process

Posted by Ross Gardler <rg...@apache.org>.
On 02/09/2010 17:53, Rahul Akolkar wrote:
> On Thu, Sep 2, 2010 at 12:37 PM, Kathey Marsden
> <km...@sbcglobal.net>  wrote:
>> On 9/2/2010 9:21 AM, Rahul Akolkar wrote:
>>
>> Original Ideas are good
>>
>> =======================
>>
>> Past experience has shown that if a student proposes their own idea and it
>> is accepted the student is going to be strong.
>>
>> PROPOSAL
>> --------
>>
>> Add the following to the mentor ranking:
>>
>> Is the project definition and idea originally the mentee's, the
>> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
>> if collaborative, 0 if mentor's)
>>
>> I am still not such a big fan of the "original idea" points in a standards
>> base product
> <snip/>
>
> OK, the idea is same as before (my comment on standards elsewhere in
> the thread):
>
>    http://markmail.org/message/j6qkbipevrnphe6v
>
>
>
>> and marking down for collaboration where we want to encourage
>> interaction with the community.
>>
> <snap/>
>
> Yes, I can see how the wording can be perceived that way so it does
> need tweaking.

I'm with Kathey on this one. I think we need wording that emphasizes 
the student's contributions to the idea as a whole. That is, whether 
the idea comes from the community or the student is not relevant; what 
is relevant is whether the student shows originality in expanding the 
idea into a complete proposal.

I'm sure you can come up with some suitable wording.

Ross

Re: Reviewing the evaluation process

Posted by Rahul Akolkar <ra...@gmail.com>.
On Thu, Sep 2, 2010 at 12:37 PM, Kathey Marsden
<km...@sbcglobal.net> wrote:
> On 9/2/2010 9:21 AM, Rahul Akolkar wrote:
>
> Original Ideas are good
>
> =======================
>
> Past experience has shown that if a student proposes their own idea and it
> is accepted the student is going to be strong.
>
> PROPOSAL
> --------
>
> Add the following to the mentor ranking:
>
> Is the project definition and idea originally the mentee's, the
> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
> if collaborative, 0 if mentor's)
>
> I am still not such a big fan of the "original idea" points in a standards
> base product
<snip/>

OK, the idea is the same as before (my comment on standards elsewhere
in the thread):

  http://markmail.org/message/j6qkbipevrnphe6v



> and marking down for collaboration where we want to encourage
> interaction with the community.
>
<snap/>

Yes, I can see how the wording can be perceived that way so it does
need tweaking.


> One  minor clarification I would like to see made is in this item:
>
> How does the mentor rate the student's chances of success, based on an
> in-person (face-to-face, video, audio, email) interview? (0-3 points, or 0-1
> if email interview only)
>
> To add an IRC interview as one of the 0-3 options.  Our students the last
> two summers  have mostly been from Sri Lanka and China and so other options
> hard to coordinate.  Even the IRC interview will require someone getting up
> in the middle of the night.
>
<snip/>

I had a VoIP conversation with one of my eventual mentees from China.
Even with the time difference and a slight language problem, I found
it to be the most useful and expeditious thing I did in that
particular evaluation. I say this to recount my experience, not to
preclude the use of IRC with respect to the question above.

-Rahul

Re: Reviewing the evaluation process

Posted by Kathey Marsden <km...@sbcglobal.net>.
  On 9/2/2010 9:21 AM, Rahul Akolkar wrote:
> Original Ideas are good
>> =======================
>>
>> Past experience has shown that if a student proposes their own idea and it
>> is accepted the student is going to be strong.
>>
>> PROPOSAL
>> --------
>>
>> Add the following to the mentor ranking:
>>
>> Is the project definition and idea originally the mentee's, the
>> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
>> if collaborative, 0 if mentor's)
>>
I am still not such a big fan of the "original idea" points in a 
standards-based product, and of marking down for collaboration where 
we want to encourage interaction with the community.

One minor clarification I would like to see made is in this item:

    * How does the mentor rate the student's chances of success, based
      on an in-person (face-to-face, video, audio, email) interview?
      (0-3 points, or 0-1 if email interview only)

To add an IRC interview as one of the 0-3 options. Our students the 
last two summers have mostly been from Sri Lanka and China, so the 
other options are hard to coordinate. Even an IRC interview will 
require someone getting up in the middle of the night.

Re: Reviewing the evaluation process

Posted by Ross Gardler <rg...@apache.org>.
Thanks Rahul

Sent from my mobile device.

On 5 Sep 2010, at 04:10, Rahul Akolkar <ra...@gmail.com> wrote:

> On Fri, Sep 3, 2010 at 10:36 AM, Ross Gardler <rg...@apache.org> wrote:
>> Added to confluence group
>> 
> <snip/>
> 
> Changes discussed in this thread (other than PPMCs bit) reflected in
> v3 of mentee ranking process page.
> 
> -Rahul
> 
> 
>> On 03/09/2010 15:03, Rahul Akolkar wrote:
>>> 
>>> On Thu, Sep 2, 2010 at 6:54 PM, Ross Gardler<rg...@apache.org>  wrote:
>>>> 
>>>> On 02/09/2010 17:21, Rahul Akolkar wrote:
>>>>> 
>>>>> Now that this year's program is done and there seemed to be consensus
>>>>> on the items below when discussed, can we get the proposals below
>>>>> reflected on the mentee ranking page [1] and other places as
>>>>> appropriate?
>>>> 
>>>> Yes - thanks for picking it up.
>>>> 
>>>>> 
>>>>> I don't have a cwiki account, but that can be fixed if needed :-)
>>>> 
>>>> You are a committer right? As long as you are then please go ahead and
>>>> create a CWiki account and I'll give you the necessary rights.
>>>> 
>>> <snip/>
>>> 
>>> Cool, done, the account username is "ra" (my initials).
>>> 
>>> -Rahul
>>> 
>>> 
>>>> The reason we need you to be a committer is that since we auto-publish
>>>> the
>>>> wiki to the website we need to ensure we have a CLA on file for you.
>>>> 
>>>> Ross
>>>> 
>>> <snap/>
>> 
>> 
>> --
>> rgardler@apache.org
>> @rgardler
>> 

Re: Reviewing the evaluation process

Posted by Rahul Akolkar <ra...@gmail.com>.
On Fri, Sep 3, 2010 at 10:36 AM, Ross Gardler <rg...@apache.org> wrote:
> Added to confluence group
>
<snip/>

Changes discussed in this thread (other than PPMCs bit) reflected in
v3 of mentee ranking process page.

-Rahul


> On 03/09/2010 15:03, Rahul Akolkar wrote:
>>
>> On Thu, Sep 2, 2010 at 6:54 PM, Ross Gardler<rg...@apache.org>  wrote:
>>>
>>> On 02/09/2010 17:21, Rahul Akolkar wrote:
>>>>
>>>> Now that this year's program is done and there seemed to be consensus
>>>> on the items below when discussed, can we get the proposals below
>>>> reflected on the mentee ranking page [1] and other places as
>>>> appropriate?
>>>
>>> Yes - thanks for picking it up.
>>>
>>>>
>>>> I don't have a cwiki account, but that can be fixed if needed :-)
>>>
>>> You are a committer right? As long as you are then please go ahead and
>>> create a CWiki account and I'll give you the necessary rights.
>>>
>> <snip/>
>>
>> Cool, done, the account username is "ra" (my initials).
>>
>> -Rahul
>>
>>
>>> The reason we need you to be a committer is that since we auto-publish
>>> the
>>> wiki to the website we need to ensure we have a CLA on file for you.
>>>
>>> Ross
>>>
>> <snap/>
>
>
> --
> rgardler@apache.org
> @rgardler
>

Re: Reviewing the evaluation process

Posted by Ross Gardler <rg...@apache.org>.
Added to confluence group

On 03/09/2010 15:03, Rahul Akolkar wrote:
> On Thu, Sep 2, 2010 at 6:54 PM, Ross Gardler<rg...@apache.org>  wrote:
>> On 02/09/2010 17:21, Rahul Akolkar wrote:
>>>
>>> Now that this year's program is done and there seemed to be consensus
>>> on the items below when discussed, can we get the proposals below
>>> reflected on the mentee ranking page [1] and other places as
>>> appropriate?
>>
>> Yes - thanks for picking it up.
>>
>>>
>>> I don't have a cwiki account, but that can be fixed if needed :-)
>>
>> You are a committer right? As long as you are then please go ahead and
>> create a CWiki account and I'll give you the necessary rights.
>>
> <snip/>
>
> Cool, done, the account username is "ra" (my initials).
>
> -Rahul
>
>
>> The reason we need you to be a committer is that since we auto-publish the
>> wiki to the website we need to ensure we have a CLA on file for you.
>>
>> Ross
>>
> <snap/>


-- 
rgardler@apache.org
@rgardler

Re: Reviewing the evaluation process

Posted by Rahul Akolkar <ra...@gmail.com>.
On Thu, Sep 2, 2010 at 6:54 PM, Ross Gardler <rg...@apache.org> wrote:
> On 02/09/2010 17:21, Rahul Akolkar wrote:
>>
>> Now that this year's program is done and there seemed to be consensus
>> on the items below when discussed, can we get the proposals below
>> reflected on the mentee ranking page [1] and other places as
>> appropriate?
>
> Yes - thanks for picking it up.
>
>>
>> I don't have a cwiki account, but that can be fixed if needed :-)
>
> You are a committer right? As long as you are then please go ahead and
> create a CWiki account and I'll give you the necessary rights.
>
<snip/>

Cool, done, the account username is "ra" (my initials).

-Rahul


> The reason we need you to be a committer is that since we auto-publish the
> wiki to the website we need to ensure we have a CLA on file for you.
>
> Ross
>
<snap/>

Re: Reviewing the evaluation process

Posted by Ross Gardler <rg...@apache.org>.
On 02/09/2010 17:21, Rahul Akolkar wrote:
> Now that this year's program is done and there seemed to be consensus
> on the items below when discussed, can we get the proposals below
> reflected on the mentee ranking page [1] and other places as
> appropriate?

Yes - thanks for picking it up.

>
> I don't have a cwiki account, but that can be fixed if needed :-)

You are a committer right? As long as you are then please go ahead and 
create a CWiki account and I'll give you the necessary rights.

The reason we need you to be a committer is that since we auto-publish 
the wiki to the website we need to ensure we have a CLA on file for you.

Ross

>
> -Rahul
>
> [1] http://community.apache.org/mentee-ranking-process.html
>
>
> On Thu, Apr 22, 2010 at 10:24 AM, Ross Gardler<rg...@apache.org>  wrote:
>> I'd like to make a few comments about issues that have arisen during the
>> evaluation process for GSoC. I'm going to give my opinion on each, please
>> treat this as lazy consensus - do speak up if you wish to disagree or add
>> more items:
>>
>> Not enough visibility of the process
>> ====================================
>>
>> There will always be someone who doesn't read the stuff we send out. Where
>> this is the case I don't think we should worry ourselves.
>>
>> Sending to PMCs (including the incubator PMC) is sufficient to reach people.
>> We don't want to send out to committers@ as the project as a whole needs to
>> be behind taking on a GSoC student.
>>
>> Reaching PPMCs is more problematic, I think we should continue to rely on
>> incubator mentors taking the message to their projects if they feel it is
>> appropriate.
>>
>> PROPOSAL
>> --------
>> Make it explicit that incubator mentors should pass the message on to PPMCs
>> if appropriate.
>>
>> Marking experience mentors up
>> =============================
>>
>> I really don't like the idea up to 2 points for having been a successful
>> mentor before, firstly it is error prone (e.g. both Bertrand and Luciano
>> have been mentors *and* admins, yet the admin this year was unaware of
>> that). Secondly, just because someone has mentored a student in the past
>> doesn't mean they will be better than another mentor. Finally, mentoring a
>> failing student is, in many ways, more educational than mentoring a
>> successful one.
>>
>> We already have "Does the mentor show an understanding of how to mentor a
>> student? (0-4 points)" - I'm more interested in whether the mentor knows
>> what is expected. However, applying this score is difficult.
>>
>> PROPOSAL
>> --------
>> Remove "Has the mentor had a successful student in the past (0-2 points)"
>> from the admin rankings
>>
>> Add some docs to the ranking process about what admins are looking for with
>> respect to "Does the mentor show an understanding of how to mentor a student?
>> (0-4 points)" (Noirin's mail with the subject Admin coordination to
>> alexei.fedotov on code-awards would be a good starting point)
>>
>> Original Ideas are good
>> =======================
>>
>> Past experience has shown that if a student proposes their own idea and it
>> is accepted the student is going to be strong.
>>
>> PROPOSAL
>> --------
>>
>> Add the following to the mentor ranking:
>>
>> Is the project definition and idea originally the mentee's, the
>> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
>> if collaborative, 0 if mentor's)
>>


Re: Reviewing the evaluation process

Posted by Rahul Akolkar <ra...@gmail.com>.
Now that this year's program is done and there seemed to be consensus
on the items below when discussed, can we get the proposals below
reflected on the mentee ranking page [1] and other places as
appropriate?

I don't have a cwiki account, but that can be fixed if needed :-)

-Rahul

[1] http://community.apache.org/mentee-ranking-process.html


On Thu, Apr 22, 2010 at 10:24 AM, Ross Gardler <rg...@apache.org> wrote:
> I'd like to make a few comments about issues that have arisen during the
> evaluation process for GSoC. I'm going to give my opinion on each, please
> treat this as lazy consensus - do speak up if you wish to disagree or add
> more items:
>
> Not enough visibility of the process
> ====================================
>
> There will always be someone who doesn't read the stuff we send out. Where
> this is the case I don't think we should worry ourselves.
>
> Sending to PMCs (including the incubator PMC) is sufficient to reach people.
> We don't want to send out to committers@ as the project as a whole needs to
> be behind taking on a GSoC student.
>
> Reaching PPMCs is more problematic, I think we should continue to rely on
> incubator mentors taking the message to their projects if they feel it is
> appropriate.
>
> PROPOSAL
> --------
> Make it explicit that incubator mentors should pass the message on to PPMCs
> if appropriate.
>
> Marking experience mentors up
> =============================
>
> I really don't like the idea up to 2 points for having been a successful
> mentor before, firstly it is error prone (e.g. both Bertrand and Luciano
> have been mentors *and* admins, yet the admin this year was unaware of
> that). Secondly, just because someone has mentored a student in the past
> doesn't mean they will be better than another mentor. Finally, mentoring a
> failing student is, in many ways, more educational than mentoring a
> successful one.
>
> We already have "Does the mentor show an understanding of how to mentor a
> student? (0-4 points)" - I'm more interested in whether the mentor knows
> what is expected. However, applying this score is difficult.
>
> PROPOSAL
> --------
> Remove "Has the mentor had a successful student in the past (0-2 points)"
> from the admin rankings
>
> Add some docs to the ranking process about what admins are looking for with
> respect to "Does the mentor show an understanding of how to mentor a student?
> (0-4 points)" (Noirin's mail with the subject Admin coordination to
> alexei.fedotov on code-awards would be a good starting point)
>
> Original Ideas are good
> =======================
>
> Past experience has shown that if a student proposes their own idea and it
> is accepted the student is going to be strong.
>
> PROPOSAL
> --------
>
> Add the following to the mentor ranking:
>
> Is the project definition and idea originally the mentee's, the
> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
> if collaborative, 0 if mentor's)
>

Re: Reviewing the evaluation process

Posted by Grant Ingersoll <gs...@apache.org>.
Instead of redirecting, I ask the author to post the question themselves, so as not to presume to take something they deem private into the public.  Chris Hostetter has a handy URL for this: http://people.apache.org/~hossman/#private_q

-Grant


On Apr 27, 2010, at 2:58 PM, Ted Dunning wrote:

> When mentoring, I also try to provide some gentle help on this by answering
> any privately sent code questions in public by redirecting to the mailing
> list.  I also send a gentle nudge pointing out that my answer is on the
> mailing list and that they would get a faster answer on average if they
> asked there in the first place.
> 
> On Tue, Apr 27, 2010 at 6:05 AM, Bertrand Delacretaz <bdelacretaz@apache.org
>> wrote:
> 
>> On Tue, Apr 27, 2010 at 3:01 PM, Grant Ingersoll <gs...@apache.org>
>> wrote:
>>> ...More or less, here's what I tell them:
>>> 1. I won't answer any development questions privately...
>>> 2. The stuff I will answer privately has to do with you...
>> 
>> I like this - I did work like that when mentoring students, without
>> spelling it out as clearly as you did. Way to go.
>> 



Re: Reviewing the evaluation process

Posted by Ted Dunning <te...@gmail.com>.
When mentoring, I also try to provide some gentle help on this by answering
any privately sent code questions in public by redirecting to the mailing
list.  I also send a gentle nudge pointing out that my answer is on the
mailing list and that they would get a faster answer on average if they
asked there in the first place.

On Tue, Apr 27, 2010 at 6:05 AM, Bertrand Delacretaz <bdelacretaz@apache.org
> wrote:

> On Tue, Apr 27, 2010 at 3:01 PM, Grant Ingersoll <gs...@apache.org>
> wrote:
> > ...More or less, here's what I tell them:
> > 1. I won't answer any development questions privately...
> > 2. The stuff I will answer privately has to do with you...
>
> I like this - I did work like that when mentoring students, without
> spelling it out as clearly as you did. Way to go.
>

Re: Reviewing the evaluation process

Posted by Ross Gardler <rg...@apache.org>.
On 27/04/2010 14:05, Bertrand Delacretaz wrote:
> On Tue, Apr 27, 2010 at 3:01 PM, Grant Ingersoll<gs...@apache.org>  wrote:
>> ...More or less, here's what I tell them:
>> 1. I won't answer any development questions privately...
>> 2. The stuff I will answer privately has to do with you...
>
> I like this - I did work like that when mentoring students, without
> spelling it out as clearly as you did. Way to go.

+1 this can go in the mentor pages.

Ross

Re: Reviewing the evaluation process

Posted by Bertrand Delacretaz <bd...@apache.org>.
On Tue, Apr 27, 2010 at 3:01 PM, Grant Ingersoll <gs...@apache.org> wrote:
> ...More or less, here's what I tell them:
> 1. I won't answer any development questions privately...
> 2. The stuff I will answer privately has to do with you...

I like this - I did work like that when mentoring students, without
spelling it out as clearly as you did. Way to go.

-Bertrand

Re: Reviewing the evaluation process

Posted by Grant Ingersoll <gs...@apache.org>.
Two cents on a slightly different but related subject.  I interviewed 3 different people this year and, as part of that, laid out my basic "mentoring" philosophy.  Each seemed a bit surprised by it at first (though they quickly got why), but to me it is a reflection of the Apache Way.

More or less, here's what I tell them:
1. I won't answer any development questions privately.  I am not your "development" mentor.  All of those types of questions must be on list.  In most cases, the student probably knows more about the algorithm they are implementing than I do.
2. The stuff I will answer privately has to do with you and only you and is more mentoring in terms of: status checking, vacation, life issues, recommendations for jobs, etc.

At any rate, just thought I would share and also see how other people approach this and if they've had similar responses/thoughts.

Cheers,
Grant

Re: Reviewing the evaluation process

Posted by Ted Dunning <te...@gmail.com>.
This is exactly the observation I have from the Mahout students.  I would
rather not have a project re-invent something that we already have, but I am
thrilled to have a student who is self-directed and horrified at the thought
of having a student who needs to be micro-managed.  Even if the self-directed
student doesn't start in the right direction, they are enormously more
likely to wind up somewhere useful for the community.  Conversely, the
micro-managee might start with the best initial idea (mine, of course) but
they will almost certainly wind up nowhere.

Academic-style originality is a nearly orthogonal concept.

On Sat, Apr 24, 2010 at 11:31 AM, Rahul Akolkar <ra...@gmail.com>wrote:

> The motivation for original ideas isn't purely in the academic sense
> of it, but also in the observation that on average, those who come up
> with their own proposals feel more connected with the project at hand
> and are more likely to retain interest and drive the work to a
> successful completion.
>

Re: Reviewing the evaluation process

Posted by Rahul Akolkar <ra...@gmail.com>.
On Fri, Apr 23, 2010 at 8:10 PM, Benson Margulies <bi...@gmail.com> wrote:
> I've been holding off on wading into this, but I think that some sort of an
> idea has jelled.
>
> I wonder about Google's statement of the mission. If Google's statement of
> the mission is: "Get smart students involved in open source," then we have
> one situation. If, on the other hand, it is more like "Get smart students to
> do original work (in the academic sense of the term) in an open source
> context," then we have another.
>
<snip/>

It's up to the participating orgs really to decide which students they
want to accept. We want the best students with the highest chances of
successfully completing their GSoC projects (and who, even beyond that,
remain interested in OSS development).

The motivation for original ideas isn't purely in the academic sense
of it, but also in the observation that on average, those who come up
with their own proposals feel more connected with the project at hand
and are more likely to retain interest and drive the work to a
successful completion.

-Rahul


> In the former, those TLPs that are stable and structured and have, as
> it were, a limited set of available itches, should not be penalized. If
> it's the latter, then those TLPs are at an unavoidable disadvantage.
>
>
>
> On Fri, Apr 23, 2010 at 7:38 PM, Kathey Marsden <kmarsdenderby@sbcglobal.net
>> wrote:
>
>> On 4/23/2010 4:14 PM, Rahul Akolkar wrote:
>>
>>>
>>> <snap/>
>>>
>>> The ability to formulate a GSoC proposal and attract mentor(s) for it
>>> must be rewarded.
>>>
>>
>> If original just means not going for one of the ideas  proposed by a mentor
>> for GSoC and exploring what's needed, including the issues in Jira and
>> attracting a mentor, than this would be fine as a global factor I think.
>> Perhaps I misunderstood  what was meant by original idea.
>>
>>
>>
>

Re: Reviewing the evaluation process

Posted by Benson Margulies <bi...@gmail.com>.
I've been holding off on wading into this, but I think that some sort of an
idea has jelled.

I wonder about Google's statement of the mission. If Google's statement of
the mission is: "Get smart students involved in open source," then we have
one situation. If, on the other hand, it is more like "Get smart students to
do original work (in the academic sense of the term) in an open source
context," then we have another.

In the former, those TLPs that are stable and structured and have, as
it were, a limited set of available itches, should not be penalized. If
it's the latter, then those TLPs are at an unavoidable disadvantage.



On Fri, Apr 23, 2010 at 7:38 PM, Kathey Marsden <kmarsdenderby@sbcglobal.net
> wrote:

> On 4/23/2010 4:14 PM, Rahul Akolkar wrote:
>
>>
>> <snap/>
>>
>> The ability to formulate a GSoC proposal and attract mentor(s) for it
>> must be rewarded.
>>
>
> If original just means not going for one of the ideas  proposed by a mentor
> for GSoC and exploring what's needed, including the issues in Jira and
> attracting a mentor, than this would be fine as a global factor I think.
> Perhaps I misunderstood  what was meant by original idea.
>
>
>

Re: Reviewing the evaluation process

Posted by Kathey Marsden <km...@sbcglobal.net>.
On 4/23/2010 4:45 PM, Rahul Akolkar wrote:
> On Fri, Apr 23, 2010 at 7:38 PM, Kathey Marsden
> <km...@sbcglobal.net>  wrote:
>    
>> On 4/23/2010 4:14 PM, Rahul Akolkar wrote:
>>      
>>> The ability to formulate a GSoC proposal and attract mentor(s) for it
>>> must be rewarded.
>>>        
>> If original just means not going for one of the ideas  proposed by a mentor
>> for GSoC and exploring what's needed, including the issues in Jira and
>> attracting a mentor, than this would be fine as a global factor I think.
>>      
> <snip/>
>
> Thats the intent, yes. Think, identify, design, execute. A ready-made
> proposal from a mentor misses one or more of those steps.
>
>    
I think that is fine then.  Sorry for the confusion.  I think the 
wording should be somewhat different to avoid the same confusion by 
others, or perhaps I just got off track because of the original 
examples.  That said, a better wording escapes me on a Friday 
afternoon.  Have a good one everyone!

Kathey


Re: Reviewing the evaluation process

Posted by Rahul Akolkar <ra...@gmail.com>.
On Fri, Apr 23, 2010 at 7:38 PM, Kathey Marsden
<km...@sbcglobal.net> wrote:
> On 4/23/2010 4:14 PM, Rahul Akolkar wrote:
>>
>> The ability to formulate a GSoC proposal and attract mentor(s) for it
>> must be rewarded.
>
> If original just means not going for one of the ideas  proposed by a mentor
> for GSoC and exploring what's needed, including the issues in Jira and
> attracting a mentor, than this would be fine as a global factor I think.
<snip/>

That's the intent, yes. Think, identify, design, execute. A ready-made
proposal from a mentor misses one or more of those steps.

-Rahul


> Perhaps I misunderstood  what was meant by original idea.
>

Re: Reviewing the evaluation process

Posted by Kathey Marsden <km...@sbcglobal.net>.
On 4/23/2010 4:14 PM, Rahul Akolkar wrote:
>
> <snap/>
>
> The ability to formulate a GSoC proposal and attract mentor(s) for it
> must be rewarded.

If original just means not going for one of the ideas proposed by a 
mentor for GSoC, and instead exploring what's needed, including the 
issues in Jira, and attracting a mentor, then this would be fine as a 
global factor I think.  Perhaps I misunderstood what was meant by 
original idea.



Re: Reviewing the evaluation process

Posted by Rahul Akolkar <ra...@gmail.com>.
On Thu, Apr 22, 2010 at 11:49 AM, Kathey Marsden
<km...@sbcglobal.net> wrote:
> On 4/22/2010 7:24 AM, Ross Gardler wrote:
>>
<snip/>
>> Original Ideas are good
>> =======================
>>
>> Past experience has shown that if a student proposes their own idea and it
>> is accepted the student is going to be strong.
>>
>> PROPOSAL
>> --------
>>
>> Add the following to the mentor ranking:
>>
>> Is the project definition and idea originally the mentee's, the
>> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
>> if collaborative, 0 if mentor's)
>>
> I think this one is tricky. In a standards-based product like Derby there
> is not that much room for creativity in the initial "idea", and projects
> are mostly based on existing Jira entries. Also, really everything is
> collaborative in the community and that should be encouraged. I think I'd
> rather see this one left out.
>
<snap/>

The ability to formulate a GSoC proposal and attract mentor(s) for it
must be rewarded. While I understand it may be easier to be original
with some projects than others, it's possible to be original and
creative anywhere :-) FWIW, in my standards work, I see progress
coming from implementations doing new and interesting things and
pushing boundaries.

Note that, by design, the question carries relatively low weight
(other questions have up to 4 points). Overall, it's 2/21.

It is our failure if we create the impression that we are only looking
for GSoC coders who will deliver defined modules of code for us. Let's
look for the next generation of innovators at the ASF. Let's make that
clear.


> Additional items -
> 1) I think 2 points for a live (non-email) interview is too much.  Opposing
> timezones can make this hard to coordinate, and I think skill in written
> communication is what is key to success.  If we have to keep it, make it
> clear that IRC is also ok for full points.
>
<snip/>

My experience is that the above (the need for those 2 points) is what
persuaded me to have a VoIP conversation with a mentee who is exactly
12 hours apart. Even with the time difference and a slight language
problem, I found it to be the most useful and expeditious thing I did
in that particular evaluation.

Having said that, I'm fine with IRC if that's the only feasible option.


> 2) I think uneven ranking is still something of an issue, but I don't know
> how to address it without putting a lot more work on the admins to review
> the mentor ranking.
<snap/>

Yup, it's uneven, and it's a hard problem unless someone is willing to
put a lot of effort into it every year.

-Rahul


>  I think Alexei's 0 point comments for suggested
> adjustments or more information from students were good, but if we all did
> that, it would mean a lot of comments for the admins to review and
> negotiation that would take more time than we have.
>
> And just a couple doc issues ...
>
> 3) Since the ranking process is public (and I am glad that it is), we should
> advise students and mentors to look at it carefully and make sure their
> proposal meets the criteria to the best of their ability.
>
> 4) Mentors should be encouraged to  monitor their student's application
> during the critical ranking period to spot any mistakes or injustice and
> speak up at the time.
>
> Kathey
>
>
>

Re: Reviewing the evaluation process

Posted by Ted Dunning <te...@gmail.com>.
I should add that our community is very active (occasionally even
aggressive) in providing feedback for ideas whether or not they came in as
part of GSoC.  As we see it, the Summer of Code is just one reason to
contribute among many (a hobby, a full-time job, religious commitment
to a learning algorithm being others).  That tends to correlate with ideas
coming in a little less well formed (but not at all always thus) and with a
slightly higher hand-holding requirement (but not much higher).

All of the students who have become committers for Mahout have been ones who
provided a lot of their own impetus, or as we say in Apache land, scratch
their own itch.

On Thu, Apr 22, 2010 at 9:26 AM, Grant Ingersoll <gs...@apache.org> wrote:

> > Either way, it is a very valuable feature for us.
>
> +1.  I would almost go so far as to say we very actively encourage students
> to submit their own ideas.  This is, after all, how open source works, no?
>  Scratch your own itch.

Re: Reviewing the evaluation process

Posted by Rahul Akolkar <ra...@gmail.com>.
On Fri, Apr 23, 2010 at 1:13 PM, Kathey Marsden
<km...@sbcglobal.net> wrote:
> On 4/22/2010 9:26 AM, Grant Ingersoll wrote:
>>
>> On Apr 22, 2010, at 12:14 PM, Ted Dunning wrote:
>>
>>
>>>
>>> On the other hand, this has been the single criterion that has defined
>>> successful students in Mahout (which is definitely less standards
>>> driven).
>>>
>>> In Derby and similar projects, I think that this can be interpreted
>>> differently, but it still is a useful ranking indicator.  Within the set
>>> of
>>> Derby applicants, this would be very useful.  Perhaps there should be a
>>> countervailing feature that allows Derby to be marked as "project that is
>>> very hard for students to be entirely original in their proposal (+1)"
>>> would
>>> allow a global comparison to be reasonably valid.  Or perhaps gating by
>>> number of mentors first so the ranking is mostly within the project would
>>> solve that.
>>>
>>> Either way, it is a very valuable feature for us.
>>>
>>
>>
>
> I can see that this is an important project specific factor.  I imagine each
> project could identify such a factor that could be given from (0-2) points.
> Perhaps for Derby it might be experience or course work in database or
> something else.  My concern is that factors more specific to weighted some
> projects go into the global ranking.  A finer grained analysis would be
> great where each project had whatever project specific factor they choose
> get a 0-2 ranking but that might  be hard to manage and communicate.
>
<snip/>

Yeah, project-specific seems like a can of worms. If we think it's
uneven now ... ;-)

-Rahul


>
> Thanks
>
> Kathey
>
>

Re: Reviewing the evaluation process

Posted by Kathey Marsden <km...@sbcglobal.net>.
On 4/22/2010 9:26 AM, Grant Ingersoll wrote:
> On Apr 22, 2010, at 12:14 PM, Ted Dunning wrote:
>
>    
>> On the other hand, this has been the single criterion that has defined
>> successful students in Mahout (which is definitely less standards driven).
>>
>> In Derby and similar projects, I think that this can be interpreted
>> differently, but it still is a useful ranking indicator.  Within the set of
>> Derby applicants, this would be very useful.  Perhaps there should be a
>> countervailing feature that allows Derby to be marked as "project that is
>> very hard for students to be entirely original in their proposal (+1)" would
>> allow a global comparison to be reasonably valid.  Or perhaps gating by
>> number of mentors first so the ranking is mostly within the project would
>> solve that.
>>
>> Either way, it is a very valuable feature for us.
>>      
>    
I can see that this is an important project specific factor.  I imagine 
each project could identify such a factor that could be given from (0-2) 
points. Perhaps for Derby it might be experience or course work in 
database or something else.  My concern is that factors specific to 
some projects would go, weighted, into the global ranking.  A 
finer-grained analysis would be great, where each project could give 
whatever project-specific factor it chooses a 0-2 ranking, but that 
might be hard to manage and communicate.


Thanks

Kathey


Re: Reviewing the evaluation process

Posted by Grant Ingersoll <gs...@apache.org>.
On Apr 22, 2010, at 12:14 PM, Ted Dunning wrote:

> On the other hand, this has been the single criterion that has defined
> successful students in Mahout (which is definitely less standards driven).
> 
> In Derby and similar projects, I think that this can be interpreted
> differently, but it still is a useful ranking indicator.  Within the set of
> Derby applicants, this would be very useful.  Perhaps there should be a
> countervailing feature that allows Derby to be marked as "project that is
> very hard for students to be entirely original in their proposal (+1)" would
> allow a global comparison to be reasonably valid.  Or perhaps gating by
> number of mentors first so the ranking is mostly within the project would
> solve that.
> 
> Either way, it is a very valuable feature for us.

+1.  I would almost go so far as to say we very actively encourage students to submit their own ideas.  This is, after all, how open source works, no?  Scratch your own itch.

> 
> On Thu, Apr 22, 2010 at 8:49 AM, Kathey Marsden <kmarsdenderby@sbcglobal.net
>> wrote:
> 
>> Is the project definition and idea originally the mentee's, the
>>> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
>>> if collaborative, 0 if mentor's)
>>> 
>>> I think this one is tricky.  In a standards based product like Derby
>> there is not that much room for creativity in the initial  "idea" and
>> projects  are mostly based on existing Jira entries.  Also really everything
>> is collaborative in the community and that should be encouraged.    I think
>> I'd rather see this one left out.



Re: Reviewing the evaluation process

Posted by Ted Dunning <te...@gmail.com>.
On the other hand, this has been the single criterion that has defined
successful students in Mahout (which is definitely less standards driven).

In Derby and similar projects, I think that this can be interpreted
differently, but it still is a useful ranking indicator.  Within the set of
Derby applicants, this would be very useful.  Perhaps a countervailing
feature that allows Derby to be marked as "a project that is very hard
for students to be entirely original in their proposal" (+1) would
allow a global comparison to be reasonably valid.  Or perhaps gating by
number of mentors first so the ranking is mostly within the project would
solve that.

Either way, it is a very valuable feature for us.

On Thu, Apr 22, 2010 at 8:49 AM, Kathey Marsden <kmarsdenderby@sbcglobal.net
> wrote:

> Is the project definition and idea originally the mentee's, the
>> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
>> if collaborative, 0 if mentor's)
>>
>>  I think this one is tricky.  In a standards based product like Derby
> there is not that much room for creativity in the initial  "idea" and
> projects  are mostly based on existing Jira entries.  Also really everything
> is collaborative in the community and that should be encouraged.    I think
> I'd rather see this one left out.

Re: Reviewing the evaluation process

Posted by Kathey Marsden <km...@sbcglobal.net>.
On 4/22/2010 7:24 AM, Ross Gardler wrote:
> I'd like to make a few comments about issues that have arisen during 
> the evaluation process for GSoC. I'm going to give my opinion on each, 
> please treat this as lazy consensus - do speak up if you wish to 
> disagree or add more items:
Thanks, Ross,
A few comments inline.

>
> Not enough visibility of the process
> ====================================
>
> There will always be someone who doesn't read the stuff we send out. 
> Where this is the case I don't think we should worry ourselves.
>
> Sending to PMCs (including the incubator PMC) is sufficient to reach 
> people. We don't want to send out to committers@ as the project as a 
> whole needs to be behind taking on a GSoC student.
>
> Reaching PPMCs is more problematic, I think we should continue to rely 
> on incubator mentors taking the message to their projects if they feel 
> it is appropriate.
>
> PROPOSAL
> --------
> Make it explicit that incubator mentors should pass the message on to 
> PPMCs if appropriate.
>
> Marking experience mentors up
> =============================
>
> I really don't like the idea up to 2 points for having been a 
> successful mentor before, firstly it is error prone (e.g. both 
> Bertrand and Luciano have been mentors *and* admins, yet the admin 
> this year was unaware of that). Secondly, just because someone has 
> mentored a student in the past doesn't mean they will be better than 
> another mentor. Finally, mentoring a failing student is, in many ways, 
> more educational than mentoring a successful one.
>
> We already have "Does the mentor show an understanding of how to 
> mentor a student? (0-4 points)" - I'm more interested in whether the 
> mentor knows what is expected. However, applying this score is difficult.
>
I think we should also have part of the process where the mentor puts in 
a zero point comment with their evidence that they do understand how to 
mentor a student.  It will give the admins something to reference, as 
they probably have no personal knowledge of the mentor's understanding 
or experience.
> PROPOSAL
> --------
> Remove "Has the mentor had a successful student in the past (0-2 
> points)" from the admin rankings
>
> Add some docs to the ranking process about what admins are looking for 
> with respect to "Does the mentor show an understanding of how to mentor 
> a student? (0-4 points)" (Noirin's mail with the subject Admin 
> coordination to alexei.fedotov on code-awards would be a good 
> starting point)
>
+1 Two points is a lot and might discourage new mentors from engaging 
in the program.

> Original Ideas are good
> =======================
>
> Past experience has shown that if a student proposes their own idea 
> and it is accepted the student is going to be strong.
>
> PROPOSAL
> --------
>
> Add the following to the mentor ranking:
>
> Is the project definition and idea originally the mentee's, the
> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
> if collaborative, 0 if mentor's)
>
I think this one is tricky.  In a standards based product like Derby 
there is not that much room for creativity in the initial  "idea" and 
projects  are mostly based on existing Jira entries.  Also really 
everything is collaborative in the community and that should be 
encouraged.    I think I'd rather see this one left out.

Additional items -
1) I think 2 points for a live (non-email) interview is too much. 
Opposing timezones can make this hard to coordinate, and I think skills 
in written communication are what is key to success.  If we have to keep 
it, make it clear that IRC is also ok for full points.

2) I think uneven ranking is still something of an issue, but I don't 
know how to address it without putting a lot more work on the admins to 
review the mentor ranking.  I think Alexei's 0 point comments for 
suggested adjustments or more information from students were good, but 
if we all did that, it would mean a lot of comments for the admins to 
review and negotiation that would take more time than we have.

And just a couple doc issues ...

3) Since the ranking process is public (and I am glad that it is), we 
should advise students and mentors to look at it carefully and make sure 
their proposal meets the criteria to the best of their ability.

4) Mentors should be encouraged to monitor their student's application 
during the critical ranking period to spot any mistakes or injustice and 
speak up at the time.

Kathey


Re: Reviewing the evaluation process

Posted by Rahul Akolkar <ra...@gmail.com>.
+1 to all below.

-Rahul


On Thu, Apr 22, 2010 at 10:24 AM, Ross Gardler <rg...@apache.org> wrote:
> I'd like to make a few comments about issues that have arisen during the
> evaluation process for GSoC. I'm going to give my opinion on each, please
> treat this as lazy consensus - do speak up if you wish to disagree or add
> more items:
>
> Not enough visibility of the process
> ====================================
>
> There will always be someone who doesn't read the stuff we send out. Where
> this is the case I don't think we should worry ourselves.
>
> Sending to PMCs (including the incubator PMC) is sufficient to reach people.
> We don't want to send out to committers@ as the project as a whole needs to
> be behind taking on a GSoC student.
>
> Reaching PPMCs is more problematic, I think we should continue to rely on
> incubator mentors taking the message to their projects if they feel it is
> appropriate.
>
> PROPOSAL
> --------
> Make it explicit that incubator mentors should pass the message on to PPMCs
> if appropriate.
>
> Marking experience mentors up
> =============================
>
> I really don't like the idea up to 2 points for having been a successful
> mentor before, firstly it is error prone (e.g. both Bertrand and Luciano
> have been mentors *and* admins, yet the admin this year was unaware of
> that). Secondly, just because someone has mentored a student in the past
> doesn't mean they will be better than another mentor. Finally, mentoring a
> failing student is, in many ways, more educational than mentoring a
> successful one.
>
> We already have "Does the mentor show an understanding of how to mentor a
> student? (0-4 points)" - I'm more interested in whether the mentor knows
> what is expected. However, applying this score is difficult.
>
> PROPOSAL
> --------
> Remove "Has the mentor had a successful student in the past (0-2 points)"
> from the admin rankings
>
> Add some docs to the ranking process about what admins are looking for with
> respect to "Does the mentor show an understanding of how to mentor a student?
> (0-4 points)" (Noirin's mail with the subject Admin coordination to
> alexei.fedotov on code-awards would be a good starting point)
>
> Original Ideas are good
> =======================
>
> Past experience has shown that if a student proposes their own idea and it
> is accepted the student is going to be strong.
>
> PROPOSAL
> --------
>
> Add the following to the mentor ranking:
>
> Is the project definition and idea originally the mentee's, the
> mentor's or a collaborative effort? (0-2 points, 2 if mentee's idea, 1
> if collaborative, 0 if mentor's)
>