Posted to dev@spark.apache.org by Sean Owen <so...@cloudera.com> on 2016/06/01 18:36:06 UTC

Spark 2.0.0-preview artifacts still not available in Maven

Just checked and they are still not published this week. Can these be
published ASAP to complete the 2.0.0-preview release?
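(A check like this can be scripted against Central's standard directory layout. The sketch below is illustrative; the coordinates used, org.apache.spark:spark-core_2.11:2.0.0-preview, are the ones this release would be expected to use, and are an assumption rather than something confirmed in this thread.)

```python
# Sketch: probe Maven Central's standard repository layout to see whether
# an artifact version has been published. Coordinates are illustrative.
from urllib.error import HTTPError
from urllib.request import Request, urlopen

CENTRAL = "https://repo1.maven.org/maven2"

def pom_url(group: str, artifact: str, version: str) -> str:
    """Map Maven coordinates to the POM path under the standard layout."""
    return (f"{CENTRAL}/{group.replace('.', '/')}"
            f"/{artifact}/{version}/{artifact}-{version}.pom")

def is_published(group: str, artifact: str, version: str) -> bool:
    """True if the POM answers HTTP 200; False if the server returns an error."""
    try:
        req = Request(pom_url(group, artifact, version), method="HEAD")
        with urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except HTTPError:
        return False

if __name__ == "__main__":
    # Network access is needed to actually probe; here we just show the URL.
    print(pom_url("org.apache.spark", "spark-core_2.11", "2.0.0-preview"))
```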

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Reynold Xin <rx...@databricks.com>.
One thing we can do is monthly milestone releases, similar to other
projects (e.g. Scala).

So we could have Apache Spark 2.1.0-M1, Apache Spark 2.1.0-M2, and so on.




On Thu, Jun 2, 2016 at 12:42 PM, Tom Graves <tg...@yahoo.com> wrote:

> The documentation for the preview release also seems to be missing?
>
> Also, what happens if we want to do a second preview release?  The naming
> doesn't seem to allow that, unless we call it "preview 2".
>
> Tom
>
>
> On Wednesday, June 1, 2016 6:27 PM, Sean Owen <so...@cloudera.com> wrote:

Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Tom Graves <tg...@yahoo.com.INVALID>.
The documentation for the preview release also seems to be missing?
Also, what happens if we want to do a second preview release?  The naming doesn't seem to allow that, unless we call it "preview 2".
Tom 


Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Sean Owen <so...@cloudera.com>.
On Wed, Jun 1, 2016 at 5:58 PM, Reynold Xin <rx...@databricks.com> wrote:
> The preview release is available here:
> http://spark.apache.org/downloads.html (there is an entire section dedicated
> to it and also there is a news link to it on the right).

Oops, it is indeed down there at the bottom, before nightlies. I
honestly missed it below the fold. I'd advocate for making it a
(non-default?) option in the main downloads dropdown, but this then
becomes a minor issue. The core source/binary artifacts _are_ publicly
available.


> "In addition to the distribution directory, project that use Maven or a
> related build tool sometimes place their releases on repository.apache.org
> beside some convenience binaries. The distribution directory is required,
> while the repository system is an optional convenience."

Agreed. The question is: what makes this release special? Other
releases have been published to Maven. I think the argument is that
it's a buggy alpha/beta/preview release, but so were the 0.x releases.
Reasonable people could adopt different policies, so here I'm
appealing to guidance: http://www.apache.org/dev/release.html

"Releases are packages that have been approved for general public
release, with varying degrees of caveat regarding their perceived
quality or potential for change. Releases that are intended for
everyday usage by non-developers are usually referred to as "stable"
or "general availability (GA)" releases. Releases that are believed to
be usable by testers and developers outside the project, but perhaps
not yet stable in terms of features or functionality, are usually
referred to as "beta" or "unstable". Releases that only represent a
project milestone and are intended only for bleeding-edge developers
working outside the project are called "alpha"."

I don't think releases are defined by whether they're stable or buggy,
but by whether they were produced by a sanctioned process that
protects contributors under the ASF umbrella, and so on. Compare a
nightly build, which we don't want everyone to consume: not so much
because it might be buggier, but because these protections don't
apply.

Certainly, it's vital to communicate how to interpret the stability of
the releases, but -preview releases are still normal releases to the
public.

So I don't think bugginess is the question. Any Spark dev knows
that x.y.0 Spark releases have gone out with Critical and, in the
past, even Blocker issues unresolved, and the world failed to fall
apart. (We're better about this now.) I actually think the -preview
release idea is worth repeating for this reason: .0-preview is the new
.0. It'd be more accurate IMHO and better for all.


> I think it'd be pretty bad if preview releases in anyway become "default
> version", because they are unstable and contain a lot of blocker bugs.

Why would this happen? Releases happen roughly every 3 months and
could happen faster if this is a concern. 2.0.0 final is, I'd wager,
coming in under a month.


> 2. On the download page, have two sections. One listing the normal releases,
> and the other listing preview releases.

+1, that puts it above the fold and easily findable to anyone willing
to consume such a thing.


> 3. Everywhere we mention preview releases, include the proper disclaimer
> e.g. "This preview is not a stable release in terms of either API or
> functionality, but it is meant to give the community early access to try the
> code that will become Spark 2.0."

Can't hurt to overcommunicate this for -preview releases in general.


> 4. Publish normal releases to maven central, and preview releases only to
> the staging maven repo. But of course we should include the temporary maven
> repo for preview releases on the download page.

This is the only thing I disagree with. AFAIK other ASF projects
readily publish alpha and beta releases, under varying naming
conventions (alpha, beta, RC1, etc.). It's not something that needs to
be hidden like a nightly.

The audience for Maven artifacts is developers, not admins or users.
Compare the risk of a developer somehow not understanding what they're
getting to the friction caused by making developers add a repo to get
at it.

I get it, that seems minor. But given the recent concern about making
sure "2.0.0 preview" is available as an ASF release, I'd advise us to
make sure this release is not any harder to get at than others, to
really put that to bed.



Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Reynold Xin <rx...@databricks.com>.
Hi Sean,

(writing this email with my Apache hat on only and not Databricks hat)

The preview release is available here:
http://spark.apache.org/downloads.html (there is an entire section
dedicated to it and also there is a news link to it on the right).

Again, I think this is a good opportunity to define what a release should
contain. Based on
http://www.apache.org/dev/release.html#where-do-releases-go

"In addition to the distribution directory, project that use Maven or a
related build tool sometimes place their releases on repository.apache.org
beside some convenience binaries. The distribution directory is required,
while the repository system is an optional convenience."

So I read it as saying that Maven publication is not required. My
understanding is that the general community (beyond those who follow
the dev list) should understand that the preview is not a stable
release, and we as the PMC should set expectations accordingly.
Developers who can test the preview releases tend to be more savvy and
comfortable on the bleeding edge, and it is actually fairly easy for
them to add a Maven repo. Reading the page again, I realized that
nowhere on it do we mention the temporary Maven repo. I will fix that.

I think it'd be pretty bad if preview releases in any way became the
"default version", because they are unstable and contain a lot of blocker
bugs.

So my concrete proposal is:

1. Separate (officially voted) releases into normal and preview.

2. On the download page, have two sections. One listing the normal
releases, and the other listing preview releases.

3. Everywhere we mention preview releases, include the proper disclaimer
e.g. "This preview is not a stable release in terms of either API or
functionality, but it is meant to give the community early access to try
the code that will become Spark 2.0."

4. Publish normal releases to Maven Central, and preview releases only to
the staging Maven repo. But of course we should include the temporary
Maven repo for preview releases on the download page.
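To make item 4 concrete, the consumer-side cost is roughly one extra block in the build file. This is only a sketch: the repository id and URL below are placeholders for whatever temporary repo the download page would list, not real addresses, and the artifact coordinates simply follow Spark's usual naming.

```xml
<!-- Hypothetical pom.xml fragment for testing a preview release from a
     temporary repo. The <url> is a placeholder, not the actual staging
     location. -->
<repositories>
  <repository>
    <id>spark-preview-temp</id>
    <url>https://repository.apache.org/content/repositories/PLACEHOLDER/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0-preview</version>
  </dependency>
</dependencies>
```

With a release published to Maven Central, only the dependency block would be needed; the repositories section is the extra friction being discussed.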






On Wed, Jun 1, 2016 at 3:10 PM, Sean Owen <so...@cloudera.com> wrote:

> I'll be more specific about the issue that I think trumps all this,
> which I realize maybe not everyone was aware of.
>
> There was a long and contentious discussion on the PMC about, among
> other things, advertising a "Spark 2.0 preview" from Databricks, such
> as at
> https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html
>
> That post has already been updated/fixed from an earlier version, but
> part of the resolution was to make a full "2.0.0 preview" release in
> order to continue to be able to advertise it as such. Without it, I
> believe the PMC's conclusion remains that this blog post / product
> announcement is not allowed by ASF policy. Hence, either the product
> announcements need to be taken down and a bunch of wording changed in
> the Databricks product, or, this needs to be a normal release.
>
> Obviously, it seems far easier to just finish the release per usual. I
> actually didn't realize this had not been offered for download at
> http://spark.apache.org/downloads.html either. It needs to be
> accessible there too.
>
>
> We can get back in the weeds about what a "preview" release means,
> but normal voted releases can and even should be alpha/beta
> (http://www.apache.org/dev/release.html). The culture is, in theory, to
> release early and often. I don't buy an argument that it's too old, at
> 2 weeks, when the alternative is having nothing at all to test
> against.
>
> On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust <mi...@databricks.com>
> wrote:
> >> I'd think we want less effort, not more, to let people test it? for
> >> example, right now I can't easily try my product build against
> >> 2.0.0-preview.
> >
> >
> > I don't feel super strongly one way or the other, so if we need to
> > publish it permanently we can.
> >
> > However, either way you can still test against this release.  You just
> > need to add a resolver as well (which is how I have always tested
> > packages against RCs).  One concern with making it permanent is that
> > this preview release is already fairly far behind branch-2.0, so many
> > of the issues that people might report have already been fixed, and
> > that might continue even after the release is made.  I'd rather be
> > able to force upgrades eventually when we vote on the final 2.0
> > release.
> >
>

Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Ovidiu-Cristian MARCU <ov...@inria.fr>.
Hi all

IMHO the preview ‘release’ is good as it is now, so no further changes are required.
For me, the preview was a pointer to what the next Spark 2.0 will be; I really appreciate the effort you made to describe it and market it :)

I'd appreciate it if the Apache Spark team would start a vote for a new alpha/beta release reflecting the current status of the project. Since the preview was released there have been numerous updates.

Best,
Ovidiu 
 
> On 05 Jun 2016, at 00:42, Sean Owen <so...@cloudera.com> wrote:




Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Marcin Tustin <mt...@handybook.com>.
+1, agreed that right now the problem is theoretical, especially if the
preview label is in the version coordinates, as it should be.




Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Sean Owen <so...@cloudera.com>.
Artifacts that are not for public consumption shouldn't be in a public
release; this is instead what nightlies are for. However, this was a
normal public release.

I am not even sure why it's viewed as particularly unsafe, but unsafe
alpha and beta releases are just releases, and their name and
documentation clarify their status for those who care. These are
regularly released by other projects.

That is, the question is not, is this a beta? Everyone agrees it
probably is, and is documented as such.

The question is, can you just not fully release it? I don't think so,
even as a matter of process, and don't see a good reason not to.

To Reynold's quote, I think that's suggesting that not all projects
will release to a repo at all (e.g. OpenOffice?). I don't think it
means you're free to not release some things to Maven, if that's
appropriate and common for the type of project.

Regarding risk, remember that the audience for Maven artifacts is
developers, not admins or end users. I understand that developers can
temporarily change their build to use a different resolver if they
care, but why? (And where would someone figure this out?)

Regardless: the 2.0.0-preview docs aren't published to go along with
the source/binary releases. Those need to be released to the project
site, though probably under a different /preview/ path or something.
If they are, is it weird that someone wouldn't then find the release
in the usual place in Maven?

Given that the driver of this was concern over wide access to
2.0.0-preview, I think it's best to err on the side of openness versus
a theoretical problem.

On Sat, Jun 4, 2016 at 11:24 PM, Matei Zaharia <ma...@gmail.com> wrote:



Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Matei Zaharia <ma...@gmail.com>.
Personally I'd just put them on the staging repo and link to that on the downloads page. It will create less confusion for people browsing Maven Central later and wondering which releases are safe to use.

Matei

> On Jun 3, 2016, at 8:22 AM, Mark Hamstra <ma...@clearstorydata.com> wrote:
> 
> It's not a question of whether the preview artifacts can be made available on Maven central, but rather whether they must be or should be.  I've got no problems leaving these unstable, transitory artifacts out of the more permanent, canonical repository.
> 
> On Fri, Jun 3, 2016 at 1:53 AM, Steve Loughran <stevel@hortonworks.com <ma...@hortonworks.com>> wrote:
> 
> It's been voted on by the project, so it can go up on Central.
>
> There are already some JIRAs being filed against it; that's a metric of success for a pre-beta of the artifacts.
>
> The risk of exercising the Maven Central option is that people may come to expect that they can point their code at the 2.0.0-preview and then, when a release comes out, simply
> update their dependency; that may or may not be the case. But is it harmful if people do start building and testing against the preview? If it finds problems early, it can only be a good thing.


Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Mark Hamstra <ma...@clearstorydata.com>.
It's not a question of whether the preview artifacts can be made available
on Maven central, but rather whether they must be or should be.  I've got
no problems leaving these unstable, transitory artifacts out of the more
permanent, canonical repository.

On Fri, Jun 3, 2016 at 1:53 AM, Steve Loughran <st...@hortonworks.com>
wrote:

>
> It's been voted on by the project, so can go up on central
>
> There's already some JIRAs being filed against it, this is a metric of
> success as pre-beta of the artifacts.
>
> The risk of exercising the m2 central option is that people may get
> expectations that they can point their code at the 2.0.0-preview and then,
> when a release comes out, simply
> update their dependency; this may/may not be the case. But is it harmful
> if people do start building and testing against the preview? If it finds
> problems early, it can only be a good thing
>
>
> > On 1 Jun 2016, at 23:10, Sean Owen <so...@cloudera.com> wrote:
> >
> > I'll be more specific about the issue that I think trumps all this,
> > which I realize maybe not everyone was aware of.
> >
> > There was a long and contentious discussion on the PMC about, among
> > other things, advertising a "Spark 2.0 preview" from Databricks, such
> > as at
> https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html
> >
> > That post has already been updated/fixed from an earlier version, but
> > part of the resolution was to make a full "2.0.0 preview" release in
> > order to continue to be able to advertise it as such. Without it, I
> > believe the PMC's conclusion remains that this blog post / product
> > announcement is not allowed by ASF policy. Hence, either the product
> > announcements need to be taken down and a bunch of wording changed in
> > the Databricks product, or, this needs to be a normal release.
> >
> > Obviously, it seems far easier to just finish the release per usual. I
> > actually didn't realize this had not been offered for download at
> > http://spark.apache.org/downloads.html either. It needs to be
> > accessible there too.
> >
> >
> > We can get back in the weeds about what a "preview" release means,
> > but, normal voted releases can and even should be alpha/beta
> > (http://www.apache.org/dev/release.html) The culture is, in theory, to
> > release early and often. I don't buy an argument that it's too old, at
> > 2 weeks, when the alternative is having nothing at all to test
> > against.
> >
> > On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust <mi...@databricks.com>
> wrote:
> >>> I'd think we want less effort, not more, to let people test it? for
> >>> example, right now I can't easily try my product build against
> >>> 2.0.0-preview.
> >>
> >>
> >> I don't feel super strongly one way or the other, so if we need to
> publish
> >> it permanently we can.
> >>
> >> However, either way you can still test against this release.  You just
> need
> >> to add a resolver as well (which is how I have always tested packages
> >> against RCs).  One concern with making it permanent is that this preview
> release
> >> is already fairly far behind branch-2.0, so many of the issues that
> people
> >> might report have already been fixed and that might continue even after
> the
> >> release is made.  I'd rather be able to force upgrades eventually when
> we
> >> vote on the final 2.0 release.
> >>
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> > For additional commands, e-mail: dev-help@spark.apache.org
> >
> >
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Steve Loughran <st...@hortonworks.com>.
It's been voted on by the project, so it can go up on Central.

There are already some JIRAs being filed against it; that's a metric of success for a pre-beta set of artifacts.

The risk of exercising the Maven Central option is that people may come to expect that they can point their code at 2.0.0-preview and then, when the release comes out, simply
update their dependency; that may or may not be the case. But is it harmful if people do start building and testing against the preview? If it finds problems early, it can only be a good thing.


> On 1 Jun 2016, at 23:10, Sean Owen <so...@cloudera.com> wrote:
> 
> I'll be more specific about the issue that I think trumps all this,
> which I realize maybe not everyone was aware of.
> 
> There was a long and contentious discussion on the PMC about, among
> other things, advertising a "Spark 2.0 preview" from Databricks, such
> as at https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html
> 
> That post has already been updated/fixed from an earlier version, but
> part of the resolution was to make a full "2.0.0 preview" release in
> order to continue to be able to advertise it as such. Without it, I
> believe the PMC's conclusion remains that this blog post / product
> announcement is not allowed by ASF policy. Hence, either the product
> announcements need to be taken down and a bunch of wording changed in
> the Databricks product, or, this needs to be a normal release.
> 
> Obviously, it seems far easier to just finish the release per usual. I
> actually didn't realize this had not been offered for download at
> http://spark.apache.org/downloads.html either. It needs to be
> accessible there too.
> 
> 
> We can get back in the weeds about what a "preview" release means,
> but, normal voted releases can and even should be alpha/beta
> (http://www.apache.org/dev/release.html) The culture is, in theory, to
> release early and often. I don't buy an argument that it's too old, at
> 2 weeks, when the alternative is having nothing at all to test
> against.
> 
> On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust <mi...@databricks.com> wrote:
>>> I'd think we want less effort, not more, to let people test it? for
>>> example, right now I can't easily try my product build against
>>> 2.0.0-preview.
>> 
>> 
>> I don't feel super strongly one way or the other, so if we need to publish
>> it permanently we can.
>> 
>> However, either way you can still test against this release.  You just need
>> to add a resolver as well (which is how I have always tested packages
>> against RCs).  One concern with making it permanent is that this preview release
>> is already fairly far behind branch-2.0, so many of the issues that people
>> might report have already been fixed and that might continue even after the
>> release is made.  I'd rather be able to force upgrades eventually when we
>> vote on the final 2.0 release.
>> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
> 
> 




Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Sean Owen <so...@cloudera.com>.
I'll be more specific about the issue that I think trumps all this,
which I realize maybe not everyone was aware of.

There was a long and contentious discussion on the PMC about, among
other things, advertising a "Spark 2.0 preview" from Databricks, such
as at https://databricks.com/blog/2016/05/11/apache-spark-2-0-technical-preview-easier-faster-and-smarter.html

That post has already been updated/fixed from an earlier version, but
part of the resolution was to make a full "2.0.0 preview" release in
order to continue to be able to advertise it as such. Without it, I
believe the PMC's conclusion remains that this blog post / product
announcement is not allowed by ASF policy. Hence, either the product
announcements need to be taken down and a bunch of wording changed in
the Databricks product, or, this needs to be a normal release.

Obviously, it seems far easier to just finish the release per usual. I
actually didn't realize this had not been offered for download at
http://spark.apache.org/downloads.html either. It needs to be
accessible there too.


We can get back in the weeds about what a "preview" release means,
but, normal voted releases can and even should be alpha/beta
(http://www.apache.org/dev/release.html). The culture is, in theory, to
release early and often. I don't buy an argument that it's too old, at
2 weeks, when the alternative is having nothing at all to test
against.

On Wed, Jun 1, 2016 at 5:02 PM, Michael Armbrust <mi...@databricks.com> wrote:
>> I'd think we want less effort, not more, to let people test it? for
>> example, right now I can't easily try my product build against
>> 2.0.0-preview.
>
>
> I don't feel super strongly one way or the other, so if we need to publish
> it permanently we can.
>
> However, either way you can still test against this release.  You just need
> to add a resolver as well (which is how I have always tested packages
> against RCs).  One concern with making it permanent is that this preview release
> is already fairly far behind branch-2.0, so many of the issues that people
> might report have already been fixed and that might continue even after the
> release is made.  I'd rather be able to force upgrades eventually when we
> vote on the final 2.0 release.
>



Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Michael Armbrust <mi...@databricks.com>.
>
> I'd think we want less effort, not more, to let people test it? for
> example, right now I can't easily try my product build against
> 2.0.0-preview.


I don't feel super strongly one way or the other, so if we need to publish
it permanently we can.

However, either way you can still test against this release.  You just need
to add a resolver as well (which is how I have always tested packages
against RCs).  One concern with making it permanent is that this preview release
is already fairly far behind branch-2.0, so many of the issues that people
might report have already been fixed and that might continue even after the
release is made.  I'd rather be able to force upgrades eventually when we
vote on the final 2.0 release.
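
[To make the resolver suggestion concrete: the change is small in either build tool. A minimal sketch for a Maven project, assuming the orgapachespark-1182 staging URL linked elsewhere in this thread, the 2.0.0-preview version string, and the Scala 2.11 build of spark-core — the repository id and name are arbitrary choices:]

```xml
<!-- pom.xml: add the temporary staging repository alongside Maven Central -->
<repositories>
  <repository>
    <id>apache-spark-staging</id>
    <name>Apache Spark 2.0.0-preview staging</name>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1182/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0-preview</version>
  </dependency>
</dependencies>
```

[sbt users can do the equivalent with a one-line resolver, e.g. `resolvers += "Spark staging" at "https://repository.apache.org/content/repositories/orgapachespark-1182/"`.]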

Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Wed, Jun 1, 2016 at 2:51 PM, Sean Owen <so...@cloudera.com> wrote:
> I'd think we want less effort, not more, to let people test it? for
> example, right now I can't easily try my product build against
> 2.0.0-preview.

While I understand your point of view, I like the extra effort to get
to these artifacts because it prevents people from easily building
their applications on top of what is known to be an unstable release
(either API-wise or quality-wise).

I see this preview release more like a snapshot release that was voted
on for wide testing, instead of a proper release that we want to
encourage people to build on. And like snapshots, I like that using
it in your application requires going out of your way to add a
separate repository, instead of just changing a version string or
command-line argument.

My 2 bits.

-- 
Marcelo



Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Sean Owen <so...@cloudera.com>.
An RC is something that gets voted on, and the final one is turned
into a blessed release. I agree that RCs don't get published to Maven
Central, but releases do of course.

This was certainly to be an official release, right? A beta or alpha
can still be an official, published release. The proximate motivation
was to solve a problem of advertising "Apache Spark 2.0.0 preview" in
a product, when no such release existed from the ASF. Hence the point
was to produce a full regular release, and I think that needs to
include the usual Maven artifacts.

I'd think we want less effort, not more, to let people test it? For
example, right now I can't easily try my product build against
2.0.0-preview.

On Wed, Jun 1, 2016 at 3:53 PM, Marcelo Vanzin <va...@cloudera.com> wrote:
> So are RCs, aren't they?
>
> Personally I'm fine with not releasing to maven central. Any extra
> effort needed by regular users to use a preview / RC is good with me.
>
> On Wed, Jun 1, 2016 at 1:50 PM, Reynold Xin <rx...@databricks.com> wrote:
>> To play devil's advocate, previews are technically not RCs. They are
>> actually voted releases.
>>
>> On Wed, Jun 1, 2016 at 1:46 PM, Michael Armbrust <mi...@databricks.com>
>> wrote:
>>>
>>> Yeah, we don't usually publish RCs to central, right?
>>>
>>> On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin <rx...@databricks.com> wrote:
>>>>
>>>> They are here ain't they?
>>>>
>>>> https://repository.apache.org/content/repositories/orgapachespark-1182/
>>>>
>>>> Did you mean publishing them to maven central? My understanding is that
>>>> publishing to maven central isn't a required step of doing these releases. This
>>>> might be a good opportunity to discuss that. My thought is that it is since
>>>> Maven central is immutable, and the purposes of the preview releases are to
>>>> get people to test it early on in preparation for the actual release, it
>>>> might be better to not publish preview releases to maven central. Users
>>>> testing with preview releases can just use the temporary repository above.
>>>>
>>>>
>>>>
>>>>
>>>> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen <so...@cloudera.com> wrote:
>>>>>
>>>>> Just checked and they are still not published this week. Can these be
>>>>> published ASAP to complete the 2.0.0-preview release?
>>>>
>>>>
>>>
>>
>
>
>
> --
> Marcelo



Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Jonathan Kelly <jo...@gmail.com>.
I think what Reynold probably means is that previews are releases for which
a vote *passed*.

~ Jonathan

On Wed, Jun 1, 2016 at 1:53 PM Marcelo Vanzin <va...@cloudera.com> wrote:

> So are RCs, aren't they?
>
> Personally I'm fine with not releasing to maven central. Any extra
> effort needed by regular users to use a preview / RC is good with me.
>
> On Wed, Jun 1, 2016 at 1:50 PM, Reynold Xin <rx...@databricks.com> wrote:
> > To play devil's advocate, previews are technically not RCs. They are
> > actually voted releases.
> >
> > On Wed, Jun 1, 2016 at 1:46 PM, Michael Armbrust <michael@databricks.com
> >
> > wrote:
> >>
> >> Yeah, we don't usually publish RCs to central, right?
> >>
> >> On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin <rx...@databricks.com>
> wrote:
> >>>
> >>> They are here ain't they?
> >>>
> >>>
> https://repository.apache.org/content/repositories/orgapachespark-1182/
> >>>
> >>> Did you mean publishing them to maven central? My understanding is that
> >>> publishing to maven central isn't a required step of doing these releases. This
> >>> might be a good opportunity to discuss that. My thought is that it is
> since
> >>> Maven central is immutable, and the purposes of the preview releases
> are to
> >>> get people to test it early on in preparation for the actual release,
> it
> >>> might be better to not publish preview releases to maven central. Users
> >>> testing with preview releases can just use the temporary repository
> above.
> >>>
> >>>
> >>>
> >>>
> >>> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen <so...@cloudera.com> wrote:
> >>>>
> >>>> Just checked and they are still not published this week. Can these be
> >>>> published ASAP to complete the 2.0.0-preview release?
> >>>
> >>>
> >>
> >
>
>
>
> --
> Marcelo
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Marcelo Vanzin <va...@cloudera.com>.
So are RCs, aren't they?

Personally I'm fine with not releasing to maven central. Any extra
effort needed by regular users to use a preview / RC is good with me.

On Wed, Jun 1, 2016 at 1:50 PM, Reynold Xin <rx...@databricks.com> wrote:
> To play devil's advocate, previews are technically not RCs. They are
> actually voted releases.
>
> On Wed, Jun 1, 2016 at 1:46 PM, Michael Armbrust <mi...@databricks.com>
> wrote:
>>
>> Yeah, we don't usually publish RCs to central, right?
>>
>> On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin <rx...@databricks.com> wrote:
>>>
>>> They are here ain't they?
>>>
>>> https://repository.apache.org/content/repositories/orgapachespark-1182/
>>>
>>> Did you mean publishing them to maven central? My understanding is that
>>> publishing to maven central isn't a required step of doing these releases. This
>>> might be a good opportunity to discuss that. My thought is that it is since
>>> Maven central is immutable, and the purposes of the preview releases are to
>>> get people to test it early on in preparation for the actual release, it
>>> might be better to not publish preview releases to maven central. Users
>>> testing with preview releases can just use the temporary repository above.
>>>
>>>
>>>
>>>
>>> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen <so...@cloudera.com> wrote:
>>>>
>>>> Just checked and they are still not published this week. Can these be
>>>> published ASAP to complete the 2.0.0-preview release?
>>>
>>>
>>
>



-- 
Marcelo



Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Reynold Xin <rx...@databricks.com>.
To play devil's advocate, previews are technically not RCs. They are
actually voted releases.

On Wed, Jun 1, 2016 at 1:46 PM, Michael Armbrust <mi...@databricks.com>
wrote:

> Yeah, we don't usually publish RCs to central, right?
>
> On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin <rx...@databricks.com> wrote:
>
>> They are here ain't they?
>>
>> https://repository.apache.org/content/repositories/orgapachespark-1182/
>>
>> Did you mean publishing them to maven central? My understanding is that
>> publishing to maven central isn't a required step of doing these releases. This
>> might be a good opportunity to discuss that. My thought is that it is since
>> Maven central is immutable, and the purposes of the preview releases are to
>> get people to test it early on in preparation for the actual release, it
>> might be better to not publish preview releases to maven central. Users
>> testing with preview releases can just use the temporary repository above.
>>
>>
>>
>>
>> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen <so...@cloudera.com> wrote:
>>
>>> Just checked and they are still not published this week. Can these be
>>> published ASAP to complete the 2.0.0-preview release?
>>>
>>
>>
>

Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Michael Armbrust <mi...@databricks.com>.
Yeah, we don't usually publish RCs to central, right?

On Wed, Jun 1, 2016 at 1:06 PM, Reynold Xin <rx...@databricks.com> wrote:

> They are here ain't they?
>
> https://repository.apache.org/content/repositories/orgapachespark-1182/
>
> Did you mean publishing them to maven central? My understanding is that
> publishing to maven central isn't a required step of doing these releases. This
> might be a good opportunity to discuss that. My thought is that it is since
> Maven central is immutable, and the purposes of the preview releases are to
> get people to test it early on in preparation for the actual release, it
> might be better to not publish preview releases to maven central. Users
> testing with preview releases can just use the temporary repository above.
>
>
>
>
> On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen <so...@cloudera.com> wrote:
>
>> Just checked and they are still not published this week. Can these be
>> published ASAP to complete the 2.0.0-preview release?
>>
>
>

Re: Spark 2.0.0-preview artifacts still not available in Maven

Posted by Reynold Xin <rx...@databricks.com>.
They are here ain't they?

https://repository.apache.org/content/repositories/orgapachespark-1182/

Did you mean publishing them to Maven Central? My understanding is that
publishing to Maven Central isn't a required step of doing these releases. This
might be a good opportunity to discuss that. My thought is that, since
Maven Central is immutable and the purpose of the preview releases is to
get people to test early, in preparation for the actual release, it
might be better not to publish preview releases to Maven Central. Users
testing with preview releases can just use the temporary repository above.
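
[Users who want to test against the temporary repository without making it a permanent part of every build can scope it to an opt-in Maven profile. A sketch, assuming the staging URL above; the profile and repository ids are arbitrary:]

```xml
<!-- ~/.m2/settings.xml: opt-in profile for the temporary staging repository -->
<settings>
  <profiles>
    <profile>
      <id>spark-preview</id>
      <repositories>
        <repository>
          <id>spark-2.0.0-preview-staging</id>
          <url>https://repository.apache.org/content/repositories/orgapachespark-1182/</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
</settings>
```

[Then build with `mvn -Pspark-preview package`; without the flag, dependency resolution falls back to the usual repositories, so the preview never leaks into regular builds.]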




On Wed, Jun 1, 2016 at 11:36 AM, Sean Owen <so...@cloudera.com> wrote:

> Just checked and they are still not published this week. Can these be
> published ASAP to complete the 2.0.0-preview release?
>