Posted to dev@spark.apache.org by Reynold Xin <rx...@databricks.com> on 2016/07/06 05:35:50 UTC

[VOTE] Release Apache Spark 2.0.0 (RC2)

Please vote on releasing the following candidate as Apache Spark version
2.0.0. The vote is open until Friday, July 8, 2016 at 23:00 PDT and passes
if a majority of at least 3 +1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 2.0.0
[ ] -1 Do not release this package because ...
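The passing rule above can be sketched as a small function (a hypothetical helper for illustration only, not part of any Spark or ASF tooling; it counts only binding PMC votes, as the rule describes):

```python
def vote_passes(pmc_plus_ones: int, pmc_minus_ones: int) -> bool:
    """Sketch of the release-vote rule: the vote passes with a
    majority of the binding votes cast, of which at least 3 are +1."""
    return pmc_plus_ones >= 3 and pmc_plus_ones > pmc_minus_ones
```

So three +1s with no -1s passes, but two +1s (too few) or a 3-to-3 tie (no majority) does not.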


The tag to be voted on is v2.0.0-rc2
(4a55b2326c8cf50f772907a8b73fd5e7b3d1aa06).

This release candidate resolves ~2500 issues:
https://s.apache.org/spark-2.0.0-jira

The release files, including signatures, digests, etc. can be found at:
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc2-bin/

Release artifacts are signed with the following key:
https://people.apache.org/keys/committer/pwendell.asc
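Checking the digests published alongside the release files can be sketched as below (a generic SHA-512 helper, not Spark-specific; the exact digest types shipped with a given release may differ, and verifying the `.asc` signature additionally requires `gpg --verify` against the key linked above):

```python
import hashlib

def sha512_of(path: str) -> str:
    """Compute the SHA-512 digest of a downloaded release artifact,
    for comparison against the published checksum file."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so large artifacts don't load into memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()
```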

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1189/

The documentation corresponding to this release can be found at:
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc2-docs/


=================================
How can I help test this release?
=================================
If you are a Spark user, you can help us test this release by taking an
existing Spark workload, running it on this release candidate, and
reporting any regressions from 1.x.

==========================================
What justifies a -1 vote for this release?
==========================================
Critical bugs impacting major functionality.

Bugs already present in 1.x, missing features, or bugs related to new
features will not necessarily block this release. Note that historically
Spark documentation has been published on the website separately from the
main release, so we do not need to block the release due to documentation
errors either.

Re: [VOTE] Release Apache Spark 2.0.0 (RC2)

Posted by Reynold Xin <rx...@databricks.com>.
This vote is cancelled in favor of rc4.



Re: [VOTE] Release Apache Spark 2.0.0 (RC2)

Posted by Jonathan Kelly <jo...@gmail.com>.
I see that all blockers targeted for 2.0.0 have either been resolved or
downgraded. Do you have an ETA for the next RC?

Thanks,
Jonathan


Re: [VOTE] Release Apache Spark 2.0.0 (RC2)

Posted by Sean Owen <so...@cloudera.com>.
Yeah there were already other blockers when the RC was released. This
one was already noted in this thread. There will be another RC soon, I'm
sure. I guess it would be ideal if the remaining blockers were
resolved one way or the other before that, to make it possible that
RC3 could be the final release:

SPARK-14808 Spark MLlib, GraphX, SparkR 2.0 QA umbrella
SPARK-14812 ML, Graph 2.0 QA: API: Experimental, DeveloperApi, final,
sealed audit
SPARK-14813 ML 2.0 QA: API: Python API coverage
SPARK-14816 Update MLlib, GraphX, SparkR websites for 2.0
SPARK-14817 ML, Graph, R 2.0 QA: Programming guide update and migration guide
SPARK-15124 R 2.0 QA: New R APIs and API docs
SPARK-15623 2.0 python coverage ml.feature
SPARK-15630 2.0 python coverage ml root module

These are possibly all or mostly resolved already and have been
knocking around a while.

In any event, even a DOA RC3 might be useful if it kept up the testing.

Sean


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [VOTE] Release Apache Spark 2.0.0 (RC2)

Posted by Sun Rui <su...@163.com>.
-1
https://issues.apache.org/jira/browse/SPARK-16379


Re: [VOTE] Release Apache Spark 2.0.0 (RC2)

Posted by Maciej Bryński <ma...@brynski.pl>.
-1
https://issues.apache.org/jira/browse/SPARK-16379
https://issues.apache.org/jira/browse/SPARK-16371




-- 
Maciek Bryński



Re: [VOTE] Release Apache Spark 2.0.0 (RC2)

Posted by Dmitry Zhukov <dz...@transferwise.com>.
Sorry for bringing this topic up. Any updates here?

Really looking forward to the upcoming RC.

Thanks!


Re: [VOTE] Release Apache Spark 2.0.0 (RC2)

Posted by Ted Yu <yu...@gmail.com>.
Running the following command:
build/mvn clean -Phive -Phive-thriftserver -Pyarn -Phadoop-2.6 -Psparkr
-Dhadoop.version=2.7.0 package

The build stopped with this test failure:

- SPARK-9757 Persist Parquet relation with decimal column *** FAILED ***



Re: [VOTE] Release Apache Spark 2.0.0 (RC2)

Posted by Cody Koeninger <co...@koeninger.org>.
I know some usages of the 0.10 Kafka connector will be broken until
https://github.com/apache/spark/pull/14026 is merged, but the 0.10
connector is a new feature, so not blocking.

Sean, I'm assuming the DirectKafkaStreamSuite failure you saw was for
0.8? I'll take another look at it.




Re: [VOTE] Release Apache Spark 2.0.0 (RC2)

Posted by Sean Owen <so...@cloudera.com>.
Yeah we still have some blockers; I agree SPARK-16379 is a blocker
which came up yesterday. We also have 5 existing blockers, all doc
related:

SPARK-14808 Spark MLlib, GraphX, SparkR 2.0 QA umbrella
SPARK-14812 ML, Graph 2.0 QA: API: Experimental, DeveloperApi, final,
sealed audit
SPARK-14816 Update MLlib, GraphX, SparkR websites for 2.0
SPARK-14817 ML, Graph, R 2.0 QA: Programming guide update and migration guide
SPARK-15124 R 2.0 QA: New R APIs and API docs

While we'll almost surely need another RC, this one is well worth
testing. It's much closer than even the last one.

The sigs/hashes check out, and I successfully built with Ubuntu 16 /
Java 8 with -Pyarn -Phadoop-2.7 -Phive. Tests pass except for:

DirectKafkaStreamSuite:
- offset recovery *** FAILED ***
  The code passed to eventually never returned normally. Attempted 196
times over 10.028979855 seconds. Last failure message:
strings.forall({
    ((x$1: Any) => DirectKafkaStreamSuite.collectedData.contains(x$1))
  }) was false. (DirectKafkaStreamSuite.scala:250)
- Direct Kafka stream report input information

I know we've seen this before and tried to fix it but it may need another look.
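For context on that failure mode: ScalaTest's `eventually` polls a condition until a timeout expires, which is where the "Attempted 196 times over 10.028979855 seconds" message comes from. A rough Python sketch of that retry pattern (illustrative only, not the ScalaTest implementation):

```python
import time

def eventually(condition, timeout=10.0, interval=0.05):
    """Poll `condition` until it returns True or `timeout` seconds pass.
    Returns the number of attempts on success; raises AssertionError at
    the deadline, analogous to the suite's failure message above."""
    deadline = time.monotonic() + timeout
    attempts = 0
    while True:
        attempts += 1
        if condition():
            return attempts
        if time.monotonic() >= deadline:
            raise AssertionError(
                f"condition never returned True after {attempts} attempts")
        time.sleep(interval)
```

A flaky suite like this one fails when the polled state (here, collected Kafka records) never appears within the window, so raising the timeout or stabilizing the setup are the usual fixes.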

