Posted to dev@spark.apache.org by Patrick Wendell <pw...@gmail.com> on 2015/06/03 05:53:32 UTC

[VOTE] Release Apache Spark 1.4.0 (RC4)

Please vote on releasing the following candidate as Apache Spark version 1.4.0!

The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
22596c534a38cfdda91aef18aa9037ab101e4251

The release files, including signatures, digests, etc. can be found at:
http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/

Release artifacts are signed with the following key:
https://people.apache.org/keys/committer/pwendell.asc

The staging repository for this release can be found at:
[published as version: 1.4.0]
https://repository.apache.org/content/repositories/orgapachespark-1111/
[published as version: 1.4.0-rc4]
https://repository.apache.org/content/repositories/orgapachespark-1112/

The documentation corresponding to this release can be found at:
http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/

Please vote on releasing this package as Apache Spark 1.4.0!

The vote is open until Saturday, June 06, at 05:00 UTC and passes
if a majority of at least 3 +1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 1.4.0
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see
http://spark.apache.org/

== What has changed since RC3 ==
In addition to many smaller fixes, three blocker issues were fixed:
4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
metadataHive get constructed too early
6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
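
For anyone who wants a quick sanity check of the two PySpark fixes above, a
minimal sketch (not an official test; it assumes a bin/pyspark shell where
sc and sqlContext are already defined, and the sample data is made up):

from pyspark.sql import functions as F
from pyspark.sql.types import DecimalType

# SPARK-8038: when()/otherwise() should build a CASE WHEN expression on a Column.
df = sqlContext.createDataFrame([(1, "a"), (5, "b")], ["age", "name"])
df.select(F.when(df.age > 3, "old").otherwise("young").alias("bucket")).show()

# SPARK-7978: DecimalType is no longer a singleton, so two instances created
# with different precision/scale must be distinct objects.
assert DecimalType(10, 2) is not DecimalType(5, 0)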

== How can I help test this release? ==
If you are a Spark user, you can help us test this release by
taking a Spark 1.3 workload and running it on this release candidate,
then reporting any regressions.
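
As a concrete (if trivial) sketch, a 1.3-era PySpark job like the one below
can be submitted unchanged with this candidate's bin/spark-submit; the input
path and app name here are made up:

# wordcount.py -- submit with something like:
#   ./bin/spark-submit wordcount.py hdfs:///some/input/path
import sys
from pyspark import SparkContext

if __name__ == "__main__":
    sc = SparkContext(appName="RC4SmokeTest")
    counts = (sc.textFile(sys.argv[1])
                .flatMap(lambda line: line.split())
                .map(lambda word: (word, 1))
                .reduceByKey(lambda a, b: a + b))
    print(counts.take(10))
    sc.stop()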

== What justifies a -1 vote for this release? ==
This vote is happening towards the end of the 1.4 QA period,
so -1 votes should only occur for significant regressions from 1.3.1.
Bugs already present in 1.3.X, minor regressions, or bugs related
to new features will not block this release.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Kousuke Saruta <sa...@oss.nttdata.co.jp>.
+1
Built on Mac OS X with -Dhadoop.version=2.4.0 -Pyarn -Phive -Phive-thriftserver.
Tested on YARN (cluster/client) on CentOS 7.
The WebUI, including the DAG and Timeline views, also works.

On 2015/06/05 15:01, Burak Yavuz wrote:
> +1
>
> Tested on Mac OS X
>
> Burak
>
> On Thu, Jun 4, 2015 at 6:35 PM, Calvin Jia <jia.calvin@gmail.com 
> <ma...@gmail.com>> wrote:
>
>     +1
>
>     Tested with input from Tachyon and persist off heap.
>
>     On Thu, Jun 4, 2015 at 6:26 PM, Timothy Chen <tnachen@gmail.com
>     <ma...@gmail.com>> wrote:
>
>         +1
>
>         Been testing cluster mode and client mode with mesos with 6
>         nodes cluster.
>
>         Everything works so far.
>
>         Tim
>
>         On Jun 4, 2015, at 5:47 PM, Andrew Or <andrew@databricks.com
>         <ma...@databricks.com>> wrote:
>
>>         +1 (binding)
>>
>>         Ran the same tests I did for RC3:
>>
>>         Tested the standalone cluster mode REST submission gateway -
>>         submit / status / kill
>>         Tested simple applications on YARN client / cluster modes
>>         with and without --jars
>>         Tested python applications on YARN client / cluster modes
>>         with and without --py-files*
>>         Tested dynamic allocation on YARN client / cluster modes**
>>
>>         All good. A couple of known issues:
>>
>>         *SPARK-8017: YARN cluster python --py-files not working - not
>>         a blocker (new feature)
>>         ** SPARK-8088: noisy output when min executors is set - not a
>>         blocker (output can be disabled)
>>
>>         2015-06-04 13:35 GMT-07:00 Matei Zaharia
>>         <matei.zaharia@gmail.com <ma...@gmail.com>>:
>>
>>             +1
>>
>>             Tested on Mac OS X
>>
>>             > On Jun 4, 2015, at 1:09 PM, Patrick Wendell
>>             <pwendell@gmail.com <ma...@gmail.com>> wrote:
>>             >
>>             > I will give +1 as well.
>>             >
>>             > On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin
>>             <rxin@databricks.com <ma...@databricks.com>> wrote:
>>             >> Let me give you the 1st
>>             >>
>>             >> +1
>>             >>
>>             >>
>>             >>
>>             >> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell
>>             <pwendell@gmail.com <ma...@gmail.com>> wrote:
>>             >>>
>>             >>> He all - a tiny nit from the last e-mail. The tag is
>>             v1.4.0-rc4. The
>>             >>> exact commit and all other information is correct.
>>             (thanks Shivaram
>>             >>> who pointed this out).
>>             >>>
>>             >>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell
>>             <pwendell@gmail.com <ma...@gmail.com>>
>>             >>> wrote:
>>             >>>> Please vote on releasing the following candidate as
>>             Apache Spark version
>>             >>>> 1.4.0!
>>             >>>>
>>             >>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>>             >>>>
>>             https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>>             >>>> 22596c534a38cfdda91aef18aa9037ab101e4251
>>             >>>>
>>             >>>> The release files, including signatures, digests,
>>             etc. can be found at:
>>             >>>>
>>             http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>             <http://people.apache.org/%7Epwendell/spark-releases/spark-1.4.0-rc4-bin/>
>>             >>>>
>>             >>>> Release artifacts are signed with the following key:
>>             >>>> https://people.apache.org/keys/committer/pwendell.asc
>>             >>>>
>>             >>>> The staging repository for this release can be found at:
>>             >>>> [published as version: 1.4.0]
>>             >>>>
>>             https://repository.apache.org/content/repositories/orgapachespark-1111/
>>             >>>> [published as version: 1.4.0-rc4]
>>             >>>>
>>             https://repository.apache.org/content/repositories/orgapachespark-1112/
>>             >>>>
>>             >>>> The documentation corresponding to this release can
>>             be found at:
>>             >>>>
>>             http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>             <http://people.apache.org/%7Epwendell/spark-releases/spark-1.4.0-rc4-docs/>
>>             >>>>
>>             >>>> Please vote on releasing this package as Apache
>>             Spark 1.4.0!
>>             >>>>
>>             >>>> The vote is open until Saturday, June 06, at 05:00
>>             UTC and passes
>>             >>>> if a majority of at least 3 +1 PMC votes are cast.
>>             >>>>
>>             >>>> [ ] +1 Release this package as Apache Spark 1.4.0
>>             >>>> [ ] -1 Do not release this package because ...
>>             >>>>
>>             >>>> To learn more about Apache Spark, please see
>>             >>>> http://spark.apache.org/
>>             >>>>
>>             >>>> == What has changed since RC3 ==
>>             >>>> In addition to may smaller fixes, three blocker
>>             issues were fixed:
>>             >>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in
>>             spark-defaults.conf make
>>             >>>> metadataHive get constructed too early
>>             >>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix
>>             Column.when() and otherwise()
>>             >>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType
>>             should not be singleton
>>             >>>>
>>             >>>> == How can I help test this release? ==
>>             >>>> If you are a Spark user, you can help us test this
>>             release by
>>             >>>> taking a Spark 1.3 workload and running on this
>>             release candidate,
>>             >>>> then reporting any regressions.
>>             >>>>
>>             >>>> == What justifies a -1 vote for this release? ==
>>             >>>> This vote is happening towards the end of the 1.4 QA
>>             period,
>>             >>>> so -1 votes should only occur for significant
>>             regressions from 1.3.1.
>>             >>>> Bugs already present in 1.3.X, minor regressions, or
>>             bugs related
>>             >>>> to new features will not block this release.
>>             >>>
>>             >>>
>>             ---------------------------------------------------------------------
>>             >>> To unsubscribe, e-mail:
>>             dev-unsubscribe@spark.apache.org
>>             <ma...@spark.apache.org>
>>             >>> For additional commands, e-mail:
>>             dev-help@spark.apache.org <ma...@spark.apache.org>
>>             >>>
>>             >>
>>             >
>>             >
>>             ---------------------------------------------------------------------
>>             > To unsubscribe, e-mail:
>>             dev-unsubscribe@spark.apache.org
>>             <ma...@spark.apache.org>
>>             > For additional commands, e-mail:
>>             dev-help@spark.apache.org <ma...@spark.apache.org>
>>             >
>>
>>
>>             ---------------------------------------------------------------------
>>             To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>             <ma...@spark.apache.org>
>>             For additional commands, e-mail:
>>             dev-help@spark.apache.org <ma...@spark.apache.org>
>>
>>
>
>


Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Burak Yavuz <br...@gmail.com>.
+1

Tested on Mac OS X

Burak

On Thu, Jun 4, 2015 at 6:35 PM, Calvin Jia <ji...@gmail.com> wrote:

> +1
>
> Tested with input from Tachyon and persist off heap.
>
> On Thu, Jun 4, 2015 at 6:26 PM, Timothy Chen <tn...@gmail.com> wrote:
>
>> +1
>>
>> Been testing cluster mode and client mode with mesos with 6 nodes cluster.
>>
>> Everything works so far.
>>
>> Tim
>>
>> On Jun 4, 2015, at 5:47 PM, Andrew Or <an...@databricks.com> wrote:
>>
>> +1 (binding)
>>
>> Ran the same tests I did for RC3:
>>
>> Tested the standalone cluster mode REST submission gateway - submit /
>> status / kill
>> Tested simple applications on YARN client / cluster modes with and
>> without --jars
>> Tested python applications on YARN client / cluster modes with and
>> without --py-files*
>> Tested dynamic allocation on YARN client / cluster modes**
>>
>> All good. A couple of known issues:
>>
>> *SPARK-8017: YARN cluster python --py-files not working - not a blocker
>> (new feature)
>> ** SPARK-8088: noisy output when min executors is set - not a blocker
>> (output can be disabled)
>>
>> 2015-06-04 13:35 GMT-07:00 Matei Zaharia <ma...@gmail.com>:
>>
>>> +1
>>>
>>> Tested on Mac OS X
>>>
>>> > On Jun 4, 2015, at 1:09 PM, Patrick Wendell <pw...@gmail.com>
>>> wrote:
>>> >
>>> > I will give +1 as well.
>>> >
>>> > On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin <rx...@databricks.com>
>>> wrote:
>>> >> Let me give you the 1st
>>> >>
>>> >> +1
>>> >>
>>> >>
>>> >>
>>> >> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell <pw...@gmail.com>
>>> wrote:
>>> >>>
>>> >>> He all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
>>> >>> exact commit and all other information is correct. (thanks Shivaram
>>> >>> who pointed this out).
>>> >>>
>>> >>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
>>> >>> wrote:
>>> >>>> Please vote on releasing the following candidate as Apache Spark
>>> version
>>> >>>> 1.4.0!
>>> >>>>
>>> >>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>>> >>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>>> >>>> 22596c534a38cfdda91aef18aa9037ab101e4251
>>> >>>>
>>> >>>> The release files, including signatures, digests, etc. can be found
>>> at:
>>> >>>>
>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>> >>>>
>>> >>>> Release artifacts are signed with the following key:
>>> >>>> https://people.apache.org/keys/committer/pwendell.asc
>>> >>>>
>>> >>>> The staging repository for this release can be found at:
>>> >>>> [published as version: 1.4.0]
>>> >>>>
>>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>>> >>>> [published as version: 1.4.0-rc4]
>>> >>>>
>>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>>> >>>>
>>> >>>> The documentation corresponding to this release can be found at:
>>> >>>>
>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>> >>>>
>>> >>>> Please vote on releasing this package as Apache Spark 1.4.0!
>>> >>>>
>>> >>>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
>>> >>>> if a majority of at least 3 +1 PMC votes are cast.
>>> >>>>
>>> >>>> [ ] +1 Release this package as Apache Spark 1.4.0
>>> >>>> [ ] -1 Do not release this package because ...
>>> >>>>
>>> >>>> To learn more about Apache Spark, please see
>>> >>>> http://spark.apache.org/
>>> >>>>
>>> >>>> == What has changed since RC3 ==
>>> >>>> In addition to may smaller fixes, three blocker issues were fixed:
>>> >>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf
>>> make
>>> >>>> metadataHive get constructed too early
>>> >>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and
>>> otherwise()
>>> >>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be
>>> singleton
>>> >>>>
>>> >>>> == How can I help test this release? ==
>>> >>>> If you are a Spark user, you can help us test this release by
>>> >>>> taking a Spark 1.3 workload and running on this release candidate,
>>> >>>> then reporting any regressions.
>>> >>>>
>>> >>>> == What justifies a -1 vote for this release? ==
>>> >>>> This vote is happening towards the end of the 1.4 QA period,
>>> >>>> so -1 votes should only occur for significant regressions from
>>> 1.3.1.
>>> >>>> Bugs already present in 1.3.X, minor regressions, or bugs related
>>> >>>> to new features will not block this release.
>>> >>>
>>> >>> ---------------------------------------------------------------------
>>> >>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> >>> For additional commands, e-mail: dev-help@spark.apache.org
>>> >>>
>>> >>
>>> >
>>> > ---------------------------------------------------------------------
>>> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> > For additional commands, e-mail: dev-help@spark.apache.org
>>> >
>>>
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>
>>>
>>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Calvin Jia <ji...@gmail.com>.
+1

Tested with input from Tachyon and persist off heap.

On Thu, Jun 4, 2015 at 6:26 PM, Timothy Chen <tn...@gmail.com> wrote:

> +1
>
> Been testing cluster mode and client mode with mesos with 6 nodes cluster.
>
> Everything works so far.
>
> Tim
>
> On Jun 4, 2015, at 5:47 PM, Andrew Or <an...@databricks.com> wrote:
>
> +1 (binding)
>
> Ran the same tests I did for RC3:
>
> Tested the standalone cluster mode REST submission gateway - submit /
> status / kill
> Tested simple applications on YARN client / cluster modes with and without
> --jars
> Tested python applications on YARN client / cluster modes with and without
> --py-files*
> Tested dynamic allocation on YARN client / cluster modes**
>
> All good. A couple of known issues:
>
> *SPARK-8017: YARN cluster python --py-files not working - not a blocker
> (new feature)
> ** SPARK-8088: noisy output when min executors is set - not a blocker
> (output can be disabled)
>
> 2015-06-04 13:35 GMT-07:00 Matei Zaharia <ma...@gmail.com>:
>
>> +1
>>
>> Tested on Mac OS X
>>
>> > On Jun 4, 2015, at 1:09 PM, Patrick Wendell <pw...@gmail.com> wrote:
>> >
>> > I will give +1 as well.
>> >
>> > On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin <rx...@databricks.com>
>> wrote:
>> >> Let me give you the 1st
>> >>
>> >> +1
>> >>
>> >>
>> >>
>> >> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell <pw...@gmail.com>
>> wrote:
>> >>>
>> >>> He all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
>> >>> exact commit and all other information is correct. (thanks Shivaram
>> >>> who pointed this out).
>> >>>
>> >>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
>> >>> wrote:
>> >>>> Please vote on releasing the following candidate as Apache Spark
>> version
>> >>>> 1.4.0!
>> >>>>
>> >>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>> >>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>> >>>> 22596c534a38cfdda91aef18aa9037ab101e4251
>> >>>>
>> >>>> The release files, including signatures, digests, etc. can be found
>> at:
>> >>>>
>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>> >>>>
>> >>>> Release artifacts are signed with the following key:
>> >>>> https://people.apache.org/keys/committer/pwendell.asc
>> >>>>
>> >>>> The staging repository for this release can be found at:
>> >>>> [published as version: 1.4.0]
>> >>>>
>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>> >>>> [published as version: 1.4.0-rc4]
>> >>>>
>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>> >>>>
>> >>>> The documentation corresponding to this release can be found at:
>> >>>>
>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>> >>>>
>> >>>> Please vote on releasing this package as Apache Spark 1.4.0!
>> >>>>
>> >>>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
>> >>>> if a majority of at least 3 +1 PMC votes are cast.
>> >>>>
>> >>>> [ ] +1 Release this package as Apache Spark 1.4.0
>> >>>> [ ] -1 Do not release this package because ...
>> >>>>
>> >>>> To learn more about Apache Spark, please see
>> >>>> http://spark.apache.org/
>> >>>>
>> >>>> == What has changed since RC3 ==
>> >>>> In addition to may smaller fixes, three blocker issues were fixed:
>> >>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>> >>>> metadataHive get constructed too early
>> >>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and
>> otherwise()
>> >>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be
>> singleton
>> >>>>
>> >>>> == How can I help test this release? ==
>> >>>> If you are a Spark user, you can help us test this release by
>> >>>> taking a Spark 1.3 workload and running on this release candidate,
>> >>>> then reporting any regressions.
>> >>>>
>> >>>> == What justifies a -1 vote for this release? ==
>> >>>> This vote is happening towards the end of the 1.4 QA period,
>> >>>> so -1 votes should only occur for significant regressions from 1.3.1.
>> >>>> Bugs already present in 1.3.X, minor regressions, or bugs related
>> >>>> to new features will not block this release.
>> >>>
>> >>> ---------------------------------------------------------------------
>> >>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> >>> For additional commands, e-mail: dev-help@spark.apache.org
>> >>>
>> >>
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> > For additional commands, e-mail: dev-help@spark.apache.org
>> >
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Timothy Chen <tn...@gmail.com>.
+1

Been testing cluster mode and client mode with Mesos on a 6-node cluster.

Everything works so far.

Tim

> On Jun 4, 2015, at 5:47 PM, Andrew Or <an...@databricks.com> wrote:
> 
> +1 (binding)
> 
> Ran the same tests I did for RC3:
> 
> Tested the standalone cluster mode REST submission gateway - submit / status / kill
> Tested simple applications on YARN client / cluster modes with and without --jars
> Tested python applications on YARN client / cluster modes with and without --py-files*
> Tested dynamic allocation on YARN client / cluster modes**
> 
> All good. A couple of known issues:
> 
> *SPARK-8017: YARN cluster python --py-files not working - not a blocker (new feature)
> ** SPARK-8088: noisy output when min executors is set - not a blocker (output can be disabled)
> 
> 2015-06-04 13:35 GMT-07:00 Matei Zaharia <ma...@gmail.com>:
>> +1
>> 
>> Tested on Mac OS X
>> 
>> > On Jun 4, 2015, at 1:09 PM, Patrick Wendell <pw...@gmail.com> wrote:
>> >
>> > I will give +1 as well.
>> >
>> > On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin <rx...@databricks.com> wrote:
>> >> Let me give you the 1st
>> >>
>> >> +1
>> >>
>> >>
>> >>
>> >> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell <pw...@gmail.com> wrote:
>> >>>
>> >>> He all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
>> >>> exact commit and all other information is correct. (thanks Shivaram
>> >>> who pointed this out).
>> >>>
>> >>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
>> >>> wrote:
>> >>>> Please vote on releasing the following candidate as Apache Spark version
>> >>>> 1.4.0!
>> >>>>
>> >>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>> >>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>> >>>> 22596c534a38cfdda91aef18aa9037ab101e4251
>> >>>>
>> >>>> The release files, including signatures, digests, etc. can be found at:
>> >>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>> >>>>
>> >>>> Release artifacts are signed with the following key:
>> >>>> https://people.apache.org/keys/committer/pwendell.asc
>> >>>>
>> >>>> The staging repository for this release can be found at:
>> >>>> [published as version: 1.4.0]
>> >>>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>> >>>> [published as version: 1.4.0-rc4]
>> >>>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>> >>>>
>> >>>> The documentation corresponding to this release can be found at:
>> >>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>> >>>>
>> >>>> Please vote on releasing this package as Apache Spark 1.4.0!
>> >>>>
>> >>>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
>> >>>> if a majority of at least 3 +1 PMC votes are cast.
>> >>>>
>> >>>> [ ] +1 Release this package as Apache Spark 1.4.0
>> >>>> [ ] -1 Do not release this package because ...
>> >>>>
>> >>>> To learn more about Apache Spark, please see
>> >>>> http://spark.apache.org/
>> >>>>
>> >>>> == What has changed since RC3 ==
>> >>>> In addition to may smaller fixes, three blocker issues were fixed:
>> >>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>> >>>> metadataHive get constructed too early
>> >>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>> >>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>> >>>>
>> >>>> == How can I help test this release? ==
>> >>>> If you are a Spark user, you can help us test this release by
>> >>>> taking a Spark 1.3 workload and running on this release candidate,
>> >>>> then reporting any regressions.
>> >>>>
>> >>>> == What justifies a -1 vote for this release? ==
>> >>>> This vote is happening towards the end of the 1.4 QA period,
>> >>>> so -1 votes should only occur for significant regressions from 1.3.1.
>> >>>> Bugs already present in 1.3.X, minor regressions, or bugs related
>> >>>> to new features will not block this release.
>> >>>
>> >>> ---------------------------------------------------------------------
>> >>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> >>> For additional commands, e-mail: dev-help@spark.apache.org
>> >>>
>> >>
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> > For additional commands, e-mail: dev-help@spark.apache.org
>> >
>> 
>> 
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>> 
> 

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Andrew Or <an...@databricks.com>.
+1 (binding)

Ran the same tests I did for RC3:

Tested the standalone cluster mode REST submission gateway - submit /
status / kill
Tested simple applications on YARN client / cluster modes with and without
--jars
Tested python applications on YARN client / cluster modes with and without
--py-files*
Tested dynamic allocation on YARN client / cluster modes**

All good. A couple of known issues:

*SPARK-8017: YARN cluster python --py-files not working - not a blocker
(new feature)
** SPARK-8088: noisy output when min executors is set - not a blocker
(output can be disabled)

2015-06-04 13:35 GMT-07:00 Matei Zaharia <ma...@gmail.com>:

> +1
>
> Tested on Mac OS X
>
> > On Jun 4, 2015, at 1:09 PM, Patrick Wendell <pw...@gmail.com> wrote:
> >
> > I will give +1 as well.
> >
> > On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin <rx...@databricks.com>
> wrote:
> >> Let me give you the 1st
> >>
> >> +1
> >>
> >>
> >>
> >> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell <pw...@gmail.com>
> wrote:
> >>>
> >>> He all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
> >>> exact commit and all other information is correct. (thanks Shivaram
> >>> who pointed this out).
> >>>
> >>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
> >>> wrote:
> >>>> Please vote on releasing the following candidate as Apache Spark
> version
> >>>> 1.4.0!
> >>>>
> >>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> >>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> >>>> 22596c534a38cfdda91aef18aa9037ab101e4251
> >>>>
> >>>> The release files, including signatures, digests, etc. can be found
> at:
> >>>>
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
> >>>>
> >>>> Release artifacts are signed with the following key:
> >>>> https://people.apache.org/keys/committer/pwendell.asc
> >>>>
> >>>> The staging repository for this release can be found at:
> >>>> [published as version: 1.4.0]
> >>>>
> https://repository.apache.org/content/repositories/orgapachespark-1111/
> >>>> [published as version: 1.4.0-rc4]
> >>>>
> https://repository.apache.org/content/repositories/orgapachespark-1112/
> >>>>
> >>>> The documentation corresponding to this release can be found at:
> >>>>
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
> >>>>
> >>>> Please vote on releasing this package as Apache Spark 1.4.0!
> >>>>
> >>>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
> >>>> if a majority of at least 3 +1 PMC votes are cast.
> >>>>
> >>>> [ ] +1 Release this package as Apache Spark 1.4.0
> >>>> [ ] -1 Do not release this package because ...
> >>>>
> >>>> To learn more about Apache Spark, please see
> >>>> http://spark.apache.org/
> >>>>
> >>>> == What has changed since RC3 ==
> >>>> In addition to may smaller fixes, three blocker issues were fixed:
> >>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> >>>> metadataHive get constructed too early
> >>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> >>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be
> singleton
> >>>>
> >>>> == How can I help test this release? ==
> >>>> If you are a Spark user, you can help us test this release by
> >>>> taking a Spark 1.3 workload and running on this release candidate,
> >>>> then reporting any regressions.
> >>>>
> >>>> == What justifies a -1 vote for this release? ==
> >>>> This vote is happening towards the end of the 1.4 QA period,
> >>>> so -1 votes should only occur for significant regressions from 1.3.1.
> >>>> Bugs already present in 1.3.X, minor regressions, or bugs related
> >>>> to new features will not block this release.
> >>>
> >>> ---------------------------------------------------------------------
> >>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> >>> For additional commands, e-mail: dev-help@spark.apache.org
> >>>
> >>
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> > For additional commands, e-mail: dev-help@spark.apache.org
> >
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Matei Zaharia <ma...@gmail.com>.
+1 

Tested on Mac OS X

> On Jun 4, 2015, at 1:09 PM, Patrick Wendell <pw...@gmail.com> wrote:
> 
> I will give +1 as well.
> 
> On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin <rx...@databricks.com> wrote:
>> Let me give you the 1st
>> 
>> +1
>> 
>> 
>> 
>> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell <pw...@gmail.com> wrote:
>>> 
>>> He all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
>>> exact commit and all other information is correct. (thanks Shivaram
>>> who pointed this out).
>>> 
>>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
>>> wrote:
>>>> Please vote on releasing the following candidate as Apache Spark version
>>>> 1.4.0!
>>>> 
>>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>>>> 22596c534a38cfdda91aef18aa9037ab101e4251
>>>> 
>>>> The release files, including signatures, digests, etc. can be found at:
>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>>> 
>>>> Release artifacts are signed with the following key:
>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>> 
>>>> The staging repository for this release can be found at:
>>>> [published as version: 1.4.0]
>>>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>>>> [published as version: 1.4.0-rc4]
>>>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>>>> 
>>>> The documentation corresponding to this release can be found at:
>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>>> 
>>>> Please vote on releasing this package as Apache Spark 1.4.0!
>>>> 
>>>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
>>>> if a majority of at least 3 +1 PMC votes are cast.
>>>> 
>>>> [ ] +1 Release this package as Apache Spark 1.4.0
>>>> [ ] -1 Do not release this package because ...
>>>> 
>>>> To learn more about Apache Spark, please see
>>>> http://spark.apache.org/
>>>> 
>>>> == What has changed since RC3 ==
>>>> In addition to may smaller fixes, three blocker issues were fixed:
>>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>>>> metadataHive get constructed too early
>>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>>>> 
>>>> == How can I help test this release? ==
>>>> If you are a Spark user, you can help us test this release by
>>>> taking a Spark 1.3 workload and running on this release candidate,
>>>> then reporting any regressions.
>>>> 
>>>> == What justifies a -1 vote for this release? ==
>>>> This vote is happening towards the end of the 1.4 QA period,
>>>> so -1 votes should only occur for significant regressions from 1.3.1.
>>>> Bugs already present in 1.3.X, minor regressions, or bugs related
>>>> to new features will not block this release.
>>> 
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>> 
>> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
> 


---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Patrick Wendell <pw...@gmail.com>.
I will give +1 as well.

On Wed, Jun 3, 2015 at 11:59 PM, Reynold Xin <rx...@databricks.com> wrote:
> Let me give you the 1st
>
> +1
>
>
>
> On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell <pw...@gmail.com> wrote:
>>
>> He all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
>> exact commit and all other information is correct. (thanks Shivaram
>> who pointed this out).
>>
>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
>> wrote:
>> > Please vote on releasing the following candidate as Apache Spark version
>> > 1.4.0!
>> >
>> > The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>> > 22596c534a38cfdda91aef18aa9037ab101e4251
>> >
>> > The release files, including signatures, digests, etc. can be found at:
>> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>> >
>> > Release artifacts are signed with the following key:
>> > https://people.apache.org/keys/committer/pwendell.asc
>> >
>> > The staging repository for this release can be found at:
>> > [published as version: 1.4.0]
>> > https://repository.apache.org/content/repositories/orgapachespark-1111/
>> > [published as version: 1.4.0-rc4]
>> > https://repository.apache.org/content/repositories/orgapachespark-1112/
>> >
>> > The documentation corresponding to this release can be found at:
>> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>> >
>> > Please vote on releasing this package as Apache Spark 1.4.0!
>> >
>> > The vote is open until Saturday, June 06, at 05:00 UTC and passes
>> > if a majority of at least 3 +1 PMC votes are cast.
>> >
>> > [ ] +1 Release this package as Apache Spark 1.4.0
>> > [ ] -1 Do not release this package because ...
>> >
>> > To learn more about Apache Spark, please see
>> > http://spark.apache.org/
>> >
>> > == What has changed since RC3 ==
>> > In addition to may smaller fixes, three blocker issues were fixed:
>> > 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>> > metadataHive get constructed too early
>> > 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>> > 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>> >
>> > == How can I help test this release? ==
>> > If you are a Spark user, you can help us test this release by
>> > taking a Spark 1.3 workload and running on this release candidate,
>> > then reporting any regressions.
>> >
>> > == What justifies a -1 vote for this release? ==
>> > This vote is happening towards the end of the 1.4 QA period,
>> > so -1 votes should only occur for significant regressions from 1.3.1.
>> > Bugs already present in 1.3.X, minor regressions, or bugs related
>> > to new features will not block this release.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Reynold Xin <rx...@databricks.com>.
Let me give you the 1st

+1



On Tue, Jun 2, 2015 at 10:47 PM, Patrick Wendell <pw...@gmail.com> wrote:

> He all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
> exact commit and all other information is correct. (thanks Shivaram
> who pointed this out).
>
> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
> wrote:
> > Please vote on releasing the following candidate as Apache Spark version
> 1.4.0!
> >
> > The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> > 22596c534a38cfdda91aef18aa9037ab101e4251
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > [published as version: 1.4.0]
> > https://repository.apache.org/content/repositories/orgapachespark-1111/
> > [published as version: 1.4.0-rc4]
> > https://repository.apache.org/content/repositories/orgapachespark-1112/
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
> >
> > Please vote on releasing this package as Apache Spark 1.4.0!
> >
> > The vote is open until Saturday, June 06, at 05:00 UTC and passes
> > if a majority of at least 3 +1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 1.4.0
> > [ ] -1 Do not release this package because ...
> >
> > To learn more about Apache Spark, please see
> > http://spark.apache.org/
> >
> > == What has changed since RC3 ==
> > In addition to may smaller fixes, three blocker issues were fixed:
> > 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> > metadataHive get constructed too early
> > 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> > 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
> >
> > == How can I help test this release? ==
> > If you are a Spark user, you can help us test this release by
> > taking a Spark 1.3 workload and running on this release candidate,
> > then reporting any regressions.
> >
> > == What justifies a -1 vote for this release? ==
> > This vote is happening towards the end of the 1.4 QA period,
> > so -1 votes should only occur for significant regressions from 1.3.1.
> > Bugs already present in 1.3.X, minor regressions, or bugs related
> > to new features will not block this release.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by saurfang <fo...@outlook.com>.
+1

Built for Hadoop 2.4. Ran a few jobs on YARN and tested spark.sql.unsafe,
whose performance seems great!



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-4-0-RC4-tp12582p12671.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Denny Lee <de...@gmail.com>.
+1

On Mon, Jun 8, 2015 at 17:51 Wang, Daoyuan <da...@intel.com> wrote:

> +1
>
> -----Original Message-----
> From: Patrick Wendell [mailto:pwendell@gmail.com]
> Sent: Wednesday, June 03, 2015 1:47 PM
> To: dev@spark.apache.org
> Subject: Re: [VOTE] Release Apache Spark 1.4.0 (RC4)
>
> He all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The exact
> commit and all other information is correct. (thanks Shivaram who pointed
> this out).
>
> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
> wrote:
> > Please vote on releasing the following candidate as Apache Spark version
> 1.4.0!
> >
> > The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> > 22596c534a38cfdda91aef18aa9037ab101e4251
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > [published as version: 1.4.0]
> > https://repository.apache.org/content/repositories/orgapachespark-1111
> > /
> > [published as version: 1.4.0-rc4]
> > https://repository.apache.org/content/repositories/orgapachespark-1112
> > /
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs
> > /
> >
> > Please vote on releasing this package as Apache Spark 1.4.0!
> >
> > The vote is open until Saturday, June 06, at 05:00 UTC and passes if a
> > majority of at least 3 +1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 1.4.0 [ ] -1 Do not
> > release this package because ...
> >
> > To learn more about Apache Spark, please see http://spark.apache.org/
> >
> > == What has changed since RC3 ==
> > In addition to may smaller fixes, three blocker issues were fixed:
> > 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> > metadataHive get constructed too early
> > 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> > 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be
> > singleton
> >
> > == How can I help test this release? == If you are a Spark user, you
> > can help us test this release by taking a Spark 1.3 workload and
> > running on this release candidate, then reporting any regressions.
> >
> > == What justifies a -1 vote for this release? == This vote is
> > happening towards the end of the 1.4 QA period, so -1 votes should
> > only occur for significant regressions from 1.3.1.
> > Bugs already present in 1.3.X, minor regressions, or bugs related to
> > new features will not block this release.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org For additional
> commands, e-mail: dev-help@spark.apache.org
>
>

RE: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by "Wang, Daoyuan" <da...@intel.com>.
+1

-----Original Message-----
From: Patrick Wendell [mailto:pwendell@gmail.com] 
Sent: Wednesday, June 03, 2015 1:47 PM
To: dev@spark.apache.org
Subject: Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

He all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The exact commit and all other information is correct. (thanks Shivaram who pointed this out).

On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com> wrote:
> Please vote on releasing the following candidate as Apache Spark version 1.4.0!
>
> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> 22596c534a38cfdda91aef18aa9037ab101e4251
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> [published as version: 1.4.0]
> https://repository.apache.org/content/repositories/orgapachespark-1111
> /
> [published as version: 1.4.0-rc4]
> https://repository.apache.org/content/repositories/orgapachespark-1112
> /
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs
> /
>
> Please vote on releasing this package as Apache Spark 1.4.0!
>
> The vote is open until Saturday, June 06, at 05:00 UTC and passes if a 
> majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.4.0 [ ] -1 Do not 
> release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> == What has changed since RC3 ==
> In addition to may smaller fixes, three blocker issues were fixed:
> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make 
> metadataHive get constructed too early
> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be 
> singleton
>
> == How can I help test this release? == If you are a Spark user, you 
> can help us test this release by taking a Spark 1.3 workload and 
> running on this release candidate, then reporting any regressions.
>
> == What justifies a -1 vote for this release? == This vote is 
> happening towards the end of the 1.4 QA period, so -1 votes should 
> only occur for significant regressions from 1.3.1.
> Bugs already present in 1.3.X, minor regressions, or bugs related to 
> new features will not block this release.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org For additional commands, e-mail: dev-help@spark.apache.org


Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Patrick Wendell <pw...@gmail.com>.
Hi all - a tiny nit from the last e-mail. The tag is v1.4.0-rc4. The
exact commit and all other information is correct. (thanks Shivaram
who pointed this out).

On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com> wrote:
> Please vote on releasing the following candidate as Apache Spark version 1.4.0!
>
> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> 22596c534a38cfdda91aef18aa9037ab101e4251
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> [published as version: 1.4.0]
> https://repository.apache.org/content/repositories/orgapachespark-1111/
> [published as version: 1.4.0-rc4]
> https://repository.apache.org/content/repositories/orgapachespark-1112/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>
> Please vote on releasing this package as Apache Spark 1.4.0!
>
> The vote is open until Saturday, June 06, at 05:00 UTC and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.4.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> == What has changed since RC3 ==
> In addition to may smaller fixes, three blocker issues were fixed:
> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> metadataHive get constructed too early
> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>
> == How can I help test this release? ==
> If you are a Spark user, you can help us test this release by
> taking a Spark 1.3 workload and running on this release candidate,
> then reporting any regressions.
>
> == What justifies a -1 vote for this release? ==
> This vote is happening towards the end of the 1.4 QA period,
> so -1 votes should only occur for significant regressions from 1.3.1.
> Bugs already present in 1.3.X, minor regressions, or bugs related
> to new features will not block this release.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


RE: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Tao Wang <wa...@huawei.com>.
+1

Tested by building with Hadoop 2.7.0 and running the following checks:

WordCount in yarn-client/yarn-cluster mode works fine;
Basic SQL queries pass;
“spark.sql.autoBroadcastJoinThreshold” works fine;
Thrift Server is fine;
Running streaming with Kafka is good;
External shuffle in YARN mode is fine;
History Server can automatically clean the event log on HDFS;
Basic PySpark tests are fine;


From: Sean McNamara [via Apache Spark Developers List] [mailto:ml-node+s1001551n12675h77@n3.nabble.com]
Sent: June 9, 2015 23:53
To: wangtao (A)
Subject: Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

+1

tested /w OS X + deployed one of our streaming apps onto a staging yarn cluster.

Sean

> On Jun 2, 2015, at 9:54 PM, Patrick Wendell <[hidden email]> wrote:
>
> Please vote on releasing the following candidate as Apache Spark version 1.4.0!
>
> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> 22596c534a38cfdda91aef18aa9037ab101e4251
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> [published as version: 1.4.0]
> https://repository.apache.org/content/repositories/orgapachespark-1111/
> [published as version: 1.4.0-rc4]
> https://repository.apache.org/content/repositories/orgapachespark-1112/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>
> Please vote on releasing this package as Apache Spark 1.4.0!
>
> The vote is open until Saturday, June 06, at 05:00 UTC and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.4.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> == What has changed since RC3 ==
> In addition to may smaller fixes, three blocker issues were fixed:
> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> metadataHive get constructed too early
> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>
> == How can I help test this release? ==
> If you are a Spark user, you can help us test this release by
> taking a Spark 1.3 workload and running on this release candidate,
> then reporting any regressions.
>
> == What justifies a -1 vote for this release? ==
> This vote is happening towards the end of the 1.4 QA period,
> so -1 votes should only occur for significant regressions from 1.3.1.
> Bugs already present in 1.3.X, minor regressions, or bugs related
> to new features will not block this release.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [hidden email]
> For additional commands, e-mail: [hidden email]
>


---------------------------------------------------------------------
To unsubscribe, e-mail: [hidden email]
For additional commands, e-mail: [hidden email]






--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-4-0-RC4-tp12582p12682.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Sean McNamara <Se...@Webtrends.com>.
+1

tested w/ OS X + deployed one of our streaming apps onto a staging YARN cluster.

Sean

> On Jun 2, 2015, at 9:54 PM, Patrick Wendell <pw...@gmail.com> wrote:
> 
> Please vote on releasing the following candidate as Apache Spark version 1.4.0!
> 
> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> 22596c534a38cfdda91aef18aa9037ab101e4251
> 
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
> 
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
> 
> The staging repository for this release can be found at:
> [published as version: 1.4.0]
> https://repository.apache.org/content/repositories/orgapachespark-1111/
> [published as version: 1.4.0-rc4]
> https://repository.apache.org/content/repositories/orgapachespark-1112/
> 
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
> 
> Please vote on releasing this package as Apache Spark 1.4.0!
> 
> The vote is open until Saturday, June 06, at 05:00 UTC and passes
> if a majority of at least 3 +1 PMC votes are cast.
> 
> [ ] +1 Release this package as Apache Spark 1.4.0
> [ ] -1 Do not release this package because ...
> 
> To learn more about Apache Spark, please see
> http://spark.apache.org/
> 
> == What has changed since RC3 ==
> In addition to may smaller fixes, three blocker issues were fixed:
> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> metadataHive get constructed too early
> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
> 
> == How can I help test this release? ==
> If you are a Spark user, you can help us test this release by
> taking a Spark 1.3 workload and running on this release candidate,
> then reporting any regressions.
> 
> == What justifies a -1 vote for this release? ==
> This vote is happening towards the end of the 1.4 QA period,
> so -1 votes should only occur for significant regressions from 1.3.1.
> Bugs already present in 1.3.X, minor regressions, or bugs related
> to new features will not block this release.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
> 


---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Patrick Wendell <pw...@gmail.com>.
Hi All,

Thanks for the continued voting! I'm going to leave this thread open
for another few days to continue to collect feedback.

- Patrick

On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com> wrote:
> Please vote on releasing the following candidate as Apache Spark version 1.4.0!
>
> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> 22596c534a38cfdda91aef18aa9037ab101e4251
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> [published as version: 1.4.0]
> https://repository.apache.org/content/repositories/orgapachespark-1111/
> [published as version: 1.4.0-rc4]
> https://repository.apache.org/content/repositories/orgapachespark-1112/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>
> Please vote on releasing this package as Apache Spark 1.4.0!
>
> The vote is open until Saturday, June 06, at 05:00 UTC and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.4.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> == What has changed since RC3 ==
> In addition to many smaller fixes, three blocker issues were fixed:
> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> metadataHive get constructed too early
> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>
> == How can I help test this release? ==
> If you are a Spark user, you can help us test this release by
> taking a Spark 1.3 workload and running on this release candidate,
> then reporting any regressions.
>
> == What justifies a -1 vote for this release? ==
> This vote is happening towards the end of the 1.4 QA period,
> so -1 votes should only occur for significant regressions from 1.3.1.
> Bugs already present in 1.3.X, minor regressions, or bugs related
> to new features will not block this release.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Bobby Chowdary <bo...@gmail.com>.
Thanks Yin!

Everything else works great!

+1 (non-binding)

On Fri, Jun 5, 2015 at 2:11 PM, Yin Huai <yh...@databricks.com> wrote:

> Hi Bobby,
>
> sqlContext.table("test.test1") is not officially supported in 1.3. For
> now, please use "use database" as a workaround; we will add support for it.
>
> Thanks,
>
> Yin
>
> On Fri, Jun 5, 2015 at 12:18 PM, Bobby Chowdary <
> bobby.chowdary03@gmail.com> wrote:
>
>> Not sure if it's a blocker, but there might be a minor issue with the Hive
>> context; there is also a workaround.
>>
>> *Works:*
>>
>> from pyspark.sql import HiveContext
>>
>> sqlContext = HiveContext(sc)
>> df = sqlContext.sql("select * from test.test1")
>>
>> *Does not Work:*
>>
>>  df = sqlContext.table("test.test1")
>>
>> Py4JJavaError: An error occurred while calling o260.table. : org.apache.spark.sql.catalyst.analysis.NoSuchTableException     at org.apache.spark.sql.hive.client.ClientInterface$anonfun$getTable$1.apply(ClientInterface.scala:112)     at org.apache.spark.sql.hive.client.ClientInterface$anonfun$getTable$1.apply(ClientInterface.scala:112)     at scala.Option.getOrElse(Option.scala:120)     at org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:112)     at org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:58)     at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:227)     at org.apache.spark.sql.hive.HiveContext$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$super$lookupRelation(HiveContext.scala:370)     at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$anonfun$lookupRelation$3.apply(Catalog.scala:165)     at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$anonfun$lookupRelation$3.apply(Catalog.scala:165)     at scala.Option.getOrElse(Option.scala:120)     at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:165)     at org.apache.spark.sql.hive.HiveContext$anon$2.lookupRelation(HiveContext.scala:370)     at org.apache.spark.sql.SQLContext.table(SQLContext.scala:754)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:497)     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)     at py4j.Gateway.invoke(Gateway.java:259)     at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)     at py4j.commands.CallCommand.execute(CallCommand.java:79)     at py4j.GatewayConnection.run(GatewayConnection.java:207)     at java.lang.Thread.run(Thread.java:745)  (<class 'py4j.protocol.Py4JJavaError'>, Py4JJavaError(u'An error occurred while calling o260.table.\n', JavaObject id=o262), <traceback object at 0x2e248c0>)
>>
>> However, when I switch the db context it works
>>
>> *Works:*
>>
>>  sqlContext.sql("use test")
>>  df = sqlContext.table("test1")
>>
>> Built on Mac OS X with JDK 6 for the MapR distribution, and running on CentOS 7.0 with JDK 8
>>
>> make-distribution.sh --tgz -Pmapr4  -Phive -Pnetlib-lgpl -Phive-thriftserver
>>
>> Didn’t have this issue in RC3, and tried it in Scala as well.
>>
>> Thanks
>> Bobby
>>
>>
>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Yin Huai <yh...@databricks.com>.
Hi Bobby,

sqlContext.table("test.test1") is not officially supported in 1.3. For now,
please use "use database" as a workaround; we will add support for it.

Thanks,

Yin

On Fri, Jun 5, 2015 at 12:18 PM, Bobby Chowdary <bo...@gmail.com>
wrote:

> Not sure if it's a blocker, but there might be a minor issue with the Hive
> context; there is also a workaround.
>
> *Works:*
>
> from pyspark.sql import HiveContext
>
> sqlContext = HiveContext(sc)
> df = sqlContext.sql("select * from test.test1")
>
> *Does not Work:*
>
>  df = sqlContext.table("test.test1")
>
> Py4JJavaError: An error occurred while calling o260.table. : org.apache.spark.sql.catalyst.analysis.NoSuchTableException     at org.apache.spark.sql.hive.client.ClientInterface$anonfun$getTable$1.apply(ClientInterface.scala:112)     at org.apache.spark.sql.hive.client.ClientInterface$anonfun$getTable$1.apply(ClientInterface.scala:112)     at scala.Option.getOrElse(Option.scala:120)     at org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:112)     at org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:58)     at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:227)     at org.apache.spark.sql.hive.HiveContext$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$super$lookupRelation(HiveContext.scala:370)     at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$anonfun$lookupRelation$3.apply(Catalog.scala:165)     at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$anonfun$lookupRelation$3.apply(Catalog.scala:165)     at scala.Option.getOrElse(Option.scala:120)     at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:165)     at org.apache.spark.sql.hive.HiveContext$anon$2.lookupRelation(HiveContext.scala:370)     at org.apache.spark.sql.SQLContext.table(SQLContext.scala:754)     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)     at java.lang.reflect.Method.invoke(Method.java:497)     at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)     at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)     at py4j.Gateway.invoke(Gateway.java:259)     at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)     at py4j.commands.CallCommand.execute(CallCommand.java:79)     at py4j.GatewayConnection.run(GatewayConnection.java:207)     at java.lang.Thread.run(Thread.java:745)  (<class 'py4j.protocol.Py4JJavaError'>, Py4JJavaError(u'An error occurred while calling o260.table.\n', JavaObject id=o262), <traceback object at 0x2e248c0>)
>
> However, when I switch the db context it works
>
> *Works:*
>
>  sqlContext.sql("use test")
>  df = sqlContext.table("test1")
>
> Built on Mac OS X with JDK 6 for the MapR distribution, and running on CentOS 7.0 with JDK 8
>
> make-distribution.sh --tgz -Pmapr4  -Phive -Pnetlib-lgpl -Phive-thriftserver
>
> Didn’t have this issue in RC3, and tried it in Scala as well.
>
> Thanks
> Bobby
>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Bobby Chowdary <bo...@gmail.com>.
Not sure if it's a blocker, but there might be a minor issue with the Hive
context; there is also a workaround.

*Works:*

from pyspark.sql import HiveContext

sqlContext = HiveContext(sc)
df = sqlContext.sql("select * from test.test1")

*Does not Work:*

 df = sqlContext.table("test.test1")

Py4JJavaError: An error occurred while calling o260.table. :
org.apache.spark.sql.catalyst.analysis.NoSuchTableException     at
org.apache.spark.sql.hive.client.ClientInterface$anonfun$getTable$1.apply(ClientInterface.scala:112)
    at org.apache.spark.sql.hive.client.ClientInterface$anonfun$getTable$1.apply(ClientInterface.scala:112)
    at scala.Option.getOrElse(Option.scala:120)     at
org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:112)
    at org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:58)
    at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:227)
    at org.apache.spark.sql.hive.HiveContext$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$super$lookupRelation(HiveContext.scala:370)
    at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$anonfun$lookupRelation$3.apply(Catalog.scala:165)
    at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$anonfun$lookupRelation$3.apply(Catalog.scala:165)
    at scala.Option.getOrElse(Option.scala:120)     at
org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:165)
    at org.apache.spark.sql.hive.HiveContext$anon$2.lookupRelation(HiveContext.scala:370)
    at org.apache.spark.sql.SQLContext.table(SQLContext.scala:754)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)     at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)     at
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)     at
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:259)     at
py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
  at py4j.commands.CallCommand.execute(CallCommand.java:79)     at
py4j.GatewayConnection.run(GatewayConnection.java:207)     at
java.lang.Thread.run(Thread.java:745)  (<class
'py4j.protocol.Py4JJavaError'>, Py4JJavaError(u'An error occurred
while calling o260.table.\n', JavaObject id=o262), <traceback object
at 0x2e248c0>)

However, when I switch the db context it works

*Works:*

 sqlContext.sql("use test")
 df = sqlContext.table("test1")
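
Putting the pieces together, here is a minimal PySpark sketch of the behavior
above and of the workaround (a sketch only; it assumes a Hive-enabled build and
the database test / table test1 from this report):

from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="rc4-hive-table-check")  # hypothetical app name
sqlContext = HiveContext(sc)

# A qualified table name through SQL text works on this RC.
df_ok = sqlContext.sql("select * from test.test1")

# A qualified table name through table() raises NoSuchTableException here:
# df_bad = sqlContext.table("test.test1")

# Workaround: switch the current database first, then use the bare table name.
sqlContext.sql("use test")
df_ok2 = sqlContext.table("test1")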

Built on Mac OS X with JDK 6 for the MapR distribution, and running on CentOS 7.0 with JDK 8

make-distribution.sh --tgz -Pmapr4  -Phive -Pnetlib-lgpl -Phive-thriftserver

Didn’t have this issue in RC3, and tried it in Scala as well.

Thanks
Bobby

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Sandy Ryza <sa...@cloudera.com>.
+1 (non-binding)

Built from source and ran some jobs against a pseudo-distributed YARN
cluster.

-Sandy

On Fri, Jun 5, 2015 at 11:05 AM, Ram Sriharsha <sr...@gmail.com>
wrote:

> +1, tested with Hadoop 2.6/YARN on CentOS 6.5 after building with -Pyarn
> -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver, and ran a
> few SQL tests and the ML examples.
>
> On Fri, Jun 5, 2015 at 10:55 AM, Hari Shreedharan <
> hshreedharan@cloudera.com> wrote:
>
>> +1. Build looks good, ran a couple apps on YARN
>>
>>
>> Thanks,
>> Hari
>>
>> On Fri, Jun 5, 2015 at 10:52 AM, Yin Huai <yh...@databricks.com> wrote:
>>
>>> Sean,
>>>
>>> Can you add "-Phive -Phive-thriftserver" and try those Hive tests?
>>>
>>> Thanks,
>>>
>>> Yin
>>>
>>> On Fri, Jun 5, 2015 at 5:19 AM, Sean Owen <so...@cloudera.com> wrote:
>>>
>>>> Everything checks out again, and the tests pass for me on Ubuntu +
>>>> Java 7 with '-Pyarn -Phadoop-2.6', except that I always get
>>>> SparkSubmitSuite errors like ...
>>>>
>>>> - success sanity check *** FAILED ***
>>>>   java.lang.RuntimeException: [download failed:
>>>> org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed:
>>>> commons-net#commons-net;3.1!commons-net.jar]
>>>>   at
>>>> org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)
>>>>   at
>>>> org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
>>>>   ...
>>>>
>>>> I also can't get hive tests to pass. Is anyone else seeing anything
>>>> like this? if not I'll assume this is something specific to the env --
>>>> or that I don't have the build invocation just right. It's puzzling
>>>> since it's so consistent, but I presume others' tests pass and Jenkins
>>>> does.
>>>>
>>>>
>>>> On Wed, Jun 3, 2015 at 5:53 AM, Patrick Wendell <pw...@gmail.com>
>>>> wrote:
>>>> > Please vote on releasing the following candidate as Apache Spark
>>>> version 1.4.0!
>>>> >
>>>> > The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>>>> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>>>> > 22596c534a38cfdda91aef18aa9037ab101e4251
>>>> >
>>>> > The release files, including signatures, digests, etc. can be found
>>>> at:
>>>> >
>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>>> >
>>>> > Release artifacts are signed with the following key:
>>>> > https://people.apache.org/keys/committer/pwendell.asc
>>>> >
>>>> > The staging repository for this release can be found at:
>>>> > [published as version: 1.4.0]
>>>> >
>>>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>>>> > [published as version: 1.4.0-rc4]
>>>> >
>>>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>>>> >
>>>> > The documentation corresponding to this release can be found at:
>>>> >
>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>>> >
>>>> > Please vote on releasing this package as Apache Spark 1.4.0!
>>>> >
>>>> > The vote is open until Saturday, June 06, at 05:00 UTC and passes
>>>> > if a majority of at least 3 +1 PMC votes are cast.
>>>> >
>>>> > [ ] +1 Release this package as Apache Spark 1.4.0
>>>> > [ ] -1 Do not release this package because ...
>>>> >
>>>> > To learn more about Apache Spark, please see
>>>> > http://spark.apache.org/
>>>> >
>>>> > == What has changed since RC3 ==
>>>> > In addition to many smaller fixes, three blocker issues were fixed:
>>>> > 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>>>> > metadataHive get constructed too early
>>>> > 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>>>> > 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be
>>>> singleton
>>>> >
>>>> > == How can I help test this release? ==
>>>> > If you are a Spark user, you can help us test this release by
>>>> > taking a Spark 1.3 workload and running on this release candidate,
>>>> > then reporting any regressions.
>>>> >
>>>> > == What justifies a -1 vote for this release? ==
>>>> > This vote is happening towards the end of the 1.4 QA period,
>>>> > so -1 votes should only occur for significant regressions from 1.3.1.
>>>> > Bugs already present in 1.3.X, minor regressions, or bugs related
>>>> > to new features will not block this release.
>>>> >
>>>> > ---------------------------------------------------------------------
>>>> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>> > For additional commands, e-mail: dev-help@spark.apache.org
>>>> >
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>>
>>>>
>>>
>>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Ram Sriharsha <sr...@gmail.com>.
+1, tested with Hadoop 2.6/YARN on CentOS 6.5 after building with -Pyarn
-Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver, and ran a
few SQL tests and the ML examples.

On Fri, Jun 5, 2015 at 10:55 AM, Hari Shreedharan <hshreedharan@cloudera.com
> wrote:

> +1. Build looks good, ran a couple apps on YARN
>
>
> Thanks,
> Hari
>
> On Fri, Jun 5, 2015 at 10:52 AM, Yin Huai <yh...@databricks.com> wrote:
>
>> Sean,
>>
>> Can you add "-Phive -Phive-thriftserver" and try those Hive tests?
>>
>> Thanks,
>>
>> Yin
>>
>> On Fri, Jun 5, 2015 at 5:19 AM, Sean Owen <so...@cloudera.com> wrote:
>>
>>> Everything checks out again, and the tests pass for me on Ubuntu +
>>> Java 7 with '-Pyarn -Phadoop-2.6', except that I always get
>>> SparkSubmitSuite errors like ...
>>>
>>> - success sanity check *** FAILED ***
>>>   java.lang.RuntimeException: [download failed:
>>> org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed:
>>> commons-net#commons-net;3.1!commons-net.jar]
>>>   at
>>> org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)
>>>   at
>>> org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
>>>   ...
>>>
>>> I also can't get hive tests to pass. Is anyone else seeing anything
>>> like this? if not I'll assume this is something specific to the env --
>>> or that I don't have the build invocation just right. It's puzzling
>>> since it's so consistent, but I presume others' tests pass and Jenkins
>>> does.
>>>
>>>
>>> On Wed, Jun 3, 2015 at 5:53 AM, Patrick Wendell <pw...@gmail.com>
>>> wrote:
>>> > Please vote on releasing the following candidate as Apache Spark
>>> version 1.4.0!
>>> >
>>> > The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>>> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>>> > 22596c534a38cfdda91aef18aa9037ab101e4251
>>> >
>>> > The release files, including signatures, digests, etc. can be found at:
>>> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>> >
>>> > Release artifacts are signed with the following key:
>>> > https://people.apache.org/keys/committer/pwendell.asc
>>> >
>>> > The staging repository for this release can be found at:
>>> > [published as version: 1.4.0]
>>> >
>>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>>> > [published as version: 1.4.0-rc4]
>>> >
>>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>>> >
>>> > The documentation corresponding to this release can be found at:
>>> >
>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>> >
>>> > Please vote on releasing this package as Apache Spark 1.4.0!
>>> >
>>> > The vote is open until Saturday, June 06, at 05:00 UTC and passes
>>> > if a majority of at least 3 +1 PMC votes are cast.
>>> >
>>> > [ ] +1 Release this package as Apache Spark 1.4.0
>>> > [ ] -1 Do not release this package because ...
>>> >
>>> > To learn more about Apache Spark, please see
>>> > http://spark.apache.org/
>>> >
>>> > == What has changed since RC3 ==
>>> > In addition to many smaller fixes, three blocker issues were fixed:
>>> > 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>>> > metadataHive get constructed too early
>>> > 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>>> > 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be
>>> singleton
>>> >
>>> > == How can I help test this release? ==
>>> > If you are a Spark user, you can help us test this release by
>>> > taking a Spark 1.3 workload and running on this release candidate,
>>> > then reporting any regressions.
>>> >
>>> > == What justifies a -1 vote for this release? ==
>>> > This vote is happening towards the end of the 1.4 QA period,
>>> > so -1 votes should only occur for significant regressions from 1.3.1.
>>> > Bugs already present in 1.3.X, minor regressions, or bugs related
>>> > to new features will not block this release.
>>> >
>>> > ---------------------------------------------------------------------
>>> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> > For additional commands, e-mail: dev-help@spark.apache.org
>>> >
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>
>>>
>>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Hari Shreedharan <hs...@cloudera.com>.
+1. Build looks good, ran a couple apps on YARN


Thanks,
Hari

On Fri, Jun 5, 2015 at 10:52 AM, Yin Huai <yh...@databricks.com> wrote:

> Sean,
>
> Can you add "-Phive -Phive-thriftserver" and try those Hive tests?
>
> Thanks,
>
> Yin
>
> On Fri, Jun 5, 2015 at 5:19 AM, Sean Owen <so...@cloudera.com> wrote:
>
>> Everything checks out again, and the tests pass for me on Ubuntu +
>> Java 7 with '-Pyarn -Phadoop-2.6', except that I always get
>> SparkSubmitSuite errors like ...
>>
>> - success sanity check *** FAILED ***
>>   java.lang.RuntimeException: [download failed:
>> org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed:
>> commons-net#commons-net;3.1!commons-net.jar]
>>   at
>> org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)
>>   at
>> org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
>>   ...
>>
>> I also can't get hive tests to pass. Is anyone else seeing anything
>> like this? if not I'll assume this is something specific to the env --
>> or that I don't have the build invocation just right. It's puzzling
>> since it's so consistent, but I presume others' tests pass and Jenkins
>> does.
>>
>>
>> On Wed, Jun 3, 2015 at 5:53 AM, Patrick Wendell <pw...@gmail.com>
>> wrote:
>> > Please vote on releasing the following candidate as Apache Spark
>> version 1.4.0!
>> >
>> > The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>> > 22596c534a38cfdda91aef18aa9037ab101e4251
>> >
>> > The release files, including signatures, digests, etc. can be found at:
>> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>> >
>> > Release artifacts are signed with the following key:
>> > https://people.apache.org/keys/committer/pwendell.asc
>> >
>> > The staging repository for this release can be found at:
>> > [published as version: 1.4.0]
>> > https://repository.apache.org/content/repositories/orgapachespark-1111/
>> > [published as version: 1.4.0-rc4]
>> > https://repository.apache.org/content/repositories/orgapachespark-1112/
>> >
>> > The documentation corresponding to this release can be found at:
>> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>> >
>> > Please vote on releasing this package as Apache Spark 1.4.0!
>> >
>> > The vote is open until Saturday, June 06, at 05:00 UTC and passes
>> > if a majority of at least 3 +1 PMC votes are cast.
>> >
>> > [ ] +1 Release this package as Apache Spark 1.4.0
>> > [ ] -1 Do not release this package because ...
>> >
>> > To learn more about Apache Spark, please see
>> > http://spark.apache.org/
>> >
>> > == What has changed since RC3 ==
>> > In addition to many smaller fixes, three blocker issues were fixed:
>> > 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>> > metadataHive get constructed too early
>> > 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>> > 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>> >
>> > == How can I help test this release? ==
>> > If you are a Spark user, you can help us test this release by
>> > taking a Spark 1.3 workload and running on this release candidate,
>> > then reporting any regressions.
>> >
>> > == What justifies a -1 vote for this release? ==
>> > This vote is happening towards the end of the 1.4 QA period,
>> > so -1 votes should only occur for significant regressions from 1.3.1.
>> > Bugs already present in 1.3.X, minor regressions, or bugs related
>> > to new features will not block this release.
>> >
>> > ---------------------------------------------------------------------
>> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> > For additional commands, e-mail: dev-help@spark.apache.org
>> >
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Yin Huai <yh...@databricks.com>.
Sean,

Can you add "-Phive -Phive-thriftserver" and try those Hive tests?

Thanks,

Yin

On Fri, Jun 5, 2015 at 5:19 AM, Sean Owen <so...@cloudera.com> wrote:

> Everything checks out again, and the tests pass for me on Ubuntu +
> Java 7 with '-Pyarn -Phadoop-2.6', except that I always get
> SparkSubmitSuite errors like ...
>
> - success sanity check *** FAILED ***
>   java.lang.RuntimeException: [download failed:
> org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed:
> commons-net#commons-net;3.1!commons-net.jar]
>   at
> org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)
>   at
> org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
>   ...
>
> I also can't get hive tests to pass. Is anyone else seeing anything
> like this? if not I'll assume this is something specific to the env --
> or that I don't have the build invocation just right. It's puzzling
> since it's so consistent, but I presume others' tests pass and Jenkins
> does.
>
>
> On Wed, Jun 3, 2015 at 5:53 AM, Patrick Wendell <pw...@gmail.com>
> wrote:
> > Please vote on releasing the following candidate as Apache Spark version
> 1.4.0!
> >
> > The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> > 22596c534a38cfdda91aef18aa9037ab101e4251
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > [published as version: 1.4.0]
> > https://repository.apache.org/content/repositories/orgapachespark-1111/
> > [published as version: 1.4.0-rc4]
> > https://repository.apache.org/content/repositories/orgapachespark-1112/
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
> >
> > Please vote on releasing this package as Apache Spark 1.4.0!
> >
> > The vote is open until Saturday, June 06, at 05:00 UTC and passes
> > if a majority of at least 3 +1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 1.4.0
> > [ ] -1 Do not release this package because ...
> >
> > To learn more about Apache Spark, please see
> > http://spark.apache.org/
> >
> > == What has changed since RC3 ==
> > In addition to many smaller fixes, three blocker issues were fixed:
> > 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> > metadataHive get constructed too early
> > 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> > 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
> >
> > == How can I help test this release? ==
> > If you are a Spark user, you can help us test this release by
> > taking a Spark 1.3 workload and running on this release candidate,
> > then reporting any regressions.
> >
> > == What justifies a -1 vote for this release? ==
> > This vote is happening towards the end of the 1.4 QA period,
> > so -1 votes should only occur for significant regressions from 1.3.1.
> > Bugs already present in 1.3.X, minor regressions, or bugs related
> > to new features will not block this release.
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> > For additional commands, e-mail: dev-help@spark.apache.org
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Fri, Jun 5, 2015 at 5:19 AM, Sean Owen <so...@cloudera.com> wrote:

> - success sanity check *** FAILED ***
>   java.lang.RuntimeException: [download failed:
> org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed:
> commons-net#commons-net;3.1!commons-net.jar]
>   at
> org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)
>   at
> org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
>   ...
>

Aside from the separate thread I started, I see errors like these pretty
often when Ivy is trying to download too many dependencies at the same time
(even when just starting sbt, for instance). It seems like it doesn't do
throttling very well; retrying generally fixes these.

-- 
Marcelo

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Sean Owen <so...@cloudera.com>.
Everything checks out again, and the tests pass for me on Ubuntu +
Java 7 with '-Pyarn -Phadoop-2.6', except that I always get
SparkSubmitSuite errors like ...

- success sanity check *** FAILED ***
  java.lang.RuntimeException: [download failed:
org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed:
commons-net#commons-net;3.1!commons-net.jar]
  at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)
  at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
  ...

I also can't get the Hive tests to pass. Is anyone else seeing anything
like this? If not, I'll assume this is something specific to the env,
or that I don't have the build invocation just right. It's puzzling
since it's so consistent, but I presume others' tests pass and Jenkins
does.


On Wed, Jun 3, 2015 at 5:53 AM, Patrick Wendell <pw...@gmail.com> wrote:
> Please vote on releasing the following candidate as Apache Spark version 1.4.0!
>
> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> 22596c534a38cfdda91aef18aa9037ab101e4251
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> [published as version: 1.4.0]
> https://repository.apache.org/content/repositories/orgapachespark-1111/
> [published as version: 1.4.0-rc4]
> https://repository.apache.org/content/repositories/orgapachespark-1112/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>
> Please vote on releasing this package as Apache Spark 1.4.0!
>
> The vote is open until Saturday, June 06, at 05:00 UTC and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.4.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> == What has changed since RC3 ==
> In addition to many smaller fixes, three blocker issues were fixed:
> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> metadataHive get constructed too early
> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>
> == How can I help test this release? ==
> If you are a Spark user, you can help us test this release by
> taking a Spark 1.3 workload and running on this release candidate,
> then reporting any regressions.
>
> == What justifies a -1 vote for this release? ==
> This vote is happening towards the end of the 1.4 QA period,
> so -1 votes should only occur for significant regressions from 1.3.1.
> Bugs already present in 1.3.X, minor regressions, or bugs related
> to new features will not block this release.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Mark Hamstra <ma...@clearstorydata.com>.
+1

On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com> wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 1.4.0!
>
> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> 22596c534a38cfdda91aef18aa9037ab101e4251
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> [published as version: 1.4.0]
> https://repository.apache.org/content/repositories/orgapachespark-1111/
> [published as version: 1.4.0-rc4]
> https://repository.apache.org/content/repositories/orgapachespark-1112/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>
> Please vote on releasing this package as Apache Spark 1.4.0!
>
> The vote is open until Saturday, June 06, at 05:00 UTC and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.4.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> == What has changed since RC3 ==
> In addition to many smaller fixes, three blocker issues were fixed:
> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> metadataHive get constructed too early
> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>
> == How can I help test this release? ==
> If you are a Spark user, you can help us test this release by
> taking a Spark 1.3 workload and running on this release candidate,
> then reporting any regressions.
>
> == What justifies a -1 vote for this release? ==
> This vote is happening towards the end of the 1.4 QA period,
> so -1 votes should only occur for significant regressions from 1.3.1.
> Bugs already present in 1.3.X, minor regressions, or bugs related
> to new features will not block this release.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Marcelo Vanzin <va...@cloudera.com>.
+1 (non-binding)

Ran some of our internal test suite (yarn + standalone) against the
hadoop-2.6 and without-hadoop binaries.

On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com> wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 1.4.0!
>
> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> 22596c534a38cfdda91aef18aa9037ab101e4251
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> [published as version: 1.4.0]
> https://repository.apache.org/content/repositories/orgapachespark-1111/
> [published as version: 1.4.0-rc4]
> https://repository.apache.org/content/repositories/orgapachespark-1112/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>
> Please vote on releasing this package as Apache Spark 1.4.0!
>
> The vote is open until Saturday, June 06, at 05:00 UTC and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.4.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> == What has changed since RC3 ==
> In addition to many smaller fixes, three blocker issues were fixed:
> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> metadataHive get constructed too early
> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>
> == How can I help test this release? ==
> If you are a Spark user, you can help us test this release by
> taking a Spark 1.3 workload and running on this release candidate,
> then reporting any regressions.
>
> == What justifies a -1 vote for this release? ==
> This vote is happening towards the end of the 1.4 QA period,
> so -1 votes should only occur for significant regressions from 1.3.1.
> Bugs already present in 1.3.X, minor regressions, or bugs related
> to new features will not block this release.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>


-- 
Marcelo

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Ajay Singal <as...@gmail.com>.
+1

On Sun, Jun 7, 2015 at 6:02 PM, Tathagata Das <ta...@gmail.com>
wrote:

> +1
>
> On Sun, Jun 7, 2015 at 3:01 PM, Joseph Bradley <jo...@databricks.com>
> wrote:
>
>> +1
>>
>> On Sat, Jun 6, 2015 at 7:55 PM, Guoqiang Li <wi...@qq.com> wrote:
>>
>>> +1 (non-binding)
>>>
>>>
>>> ------------------ Original ------------------
>>> *From: * "Reynold Xin";<rx...@databricks.com>;
>>> *Date: * Fri, Jun 5, 2015 03:18 PM
>>> *To: * "Krishna Sankar"<ks...@gmail.com>;
>>> *Cc: * "Patrick Wendell"<pw...@gmail.com>; "dev@spark.apache.org"<
>>> dev@spark.apache.org>;
>>> *Subject: * Re: [VOTE] Release Apache Spark 1.4.0 (RC4)
>>>
>>> Enjoy your new shiny mbp.
>>>
>>> On Fri, Jun 5, 2015 at 12:10 AM, Krishna Sankar <ks...@gmail.com>
>>> wrote:
>>>
>>>> +1 (non-binding, of course)
>>>>
>>>> 1. Compiled OSX 10.10 (Yosemite) OK Total time: 25:42 min (My brand new
>>>> shiny MacBookPro12,1 : 16GB. Inaugurated the machine with compile & test
>>>> 1.4.0-RC4 !)
>>>>      mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
>>>> -Dhadoop.version=2.6.0 -DskipTests
>>>> 2. Tested pyspark, MLlib - running as well as comparing results with 1.3.1
>>>> 2.1. statistics (min,max,mean,Pearson,Spearman) OK
>>>> 2.2. Linear/Ridge/Lasso Regression OK
>>>> 2.3. Decision Tree, Naive Bayes OK
>>>> 2.4. KMeans OK
>>>>        Center And Scale OK
>>>> 2.5. RDD operations OK
>>>>       State of the Union Texts - MapReduce, Filter,sortByKey (word
>>>> count)
>>>> 2.6. Recommendation (Movielens medium dataset ~1 M ratings) OK
>>>>        Model evaluation/optimization (rank, numIter, lambda) with
>>>> itertools OK
>>>> 3. Scala - MLlib
>>>> 3.1. statistics (min,max,mean,Pearson,Spearman) OK
>>>> 3.2. LinearRegressionWithSGD OK
>>>> 3.3. Decision Tree OK
>>>> 3.4. KMeans OK
>>>> 3.5. Recommendation (Movielens medium dataset ~1 M ratings) OK
>>>> 3.6. saveAsParquetFile OK
>>>> 3.7. Read and verify the 4.3 save(above) - sqlContext.parquetFile,
>>>> registerTempTable, sql OK
>>>> 3.8. result = sqlContext.sql("SELECT
>>>> OrderDetails.OrderID,ShipCountry,UnitPrice,Qty,Discount FROM Orders INNER
>>>> JOIN OrderDetails ON Orders.OrderID = OrderDetails.OrderID") OK
>>>> 4.0. Spark SQL from Python OK
>>>> 4.1. result = sqlContext.sql("SELECT * from people WHERE State = 'WA'")
>>>> OK
>>>>
>>>> Cheers
>>>> <k/>
>>>>
>>>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
>>>> wrote:
>>>>
>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>> version 1.4.0!
>>>>>
>>>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>>>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>>>>> 22596c534a38cfdda91aef18aa9037ab101e4251
>>>>>
>>>>> The release files, including signatures, digests, etc. can be found at:
>>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>>>>
>>>>> Release artifacts are signed with the following key:
>>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>>
>>>>> The staging repository for this release can be found at:
>>>>> [published as version: 1.4.0]
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>>>>> [published as version: 1.4.0-rc4]
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>>>>>
>>>>> The documentation corresponding to this release can be found at:
>>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>>>>
>>>>> Please vote on releasing this package as Apache Spark 1.4.0!
>>>>>
>>>>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
>>>>> if a majority of at least 3 +1 PMC votes are cast.
>>>>>
>>>>> [ ] +1 Release this package as Apache Spark 1.4.0
>>>>> [ ] -1 Do not release this package because ...
>>>>>
>>>>> To learn more about Apache Spark, please see
>>>>> http://spark.apache.org/
>>>>>
>>>>> == What has changed since RC3 ==
>>>>> In addition to many smaller fixes, three blocker issues were fixed:
>>>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>>>>> metadataHive get constructed too early
>>>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>>>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be
>>>>> singleton
>>>>>
>>>>> == How can I help test this release? ==
>>>>> If you are a Spark user, you can help us test this release by
>>>>> taking a Spark 1.3 workload and running on this release candidate,
>>>>> then reporting any regressions.
>>>>>
>>>>> == What justifies a -1 vote for this release? ==
>>>>> This vote is happening towards the end of the 1.4 QA period,
>>>>> so -1 votes should only occur for significant regressions from 1.3.1.
>>>>> Bugs already present in 1.3.X, minor regressions, or bugs related
>>>>> to new features will not block this release.
>>>>>
>>>>> ---------------------------------------------------------------------
>>>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>>>
>>>>>
>>>>
>>>
>>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Tathagata Das <ta...@gmail.com>.
+1

On Sun, Jun 7, 2015 at 3:01 PM, Joseph Bradley <jo...@databricks.com>
wrote:

> +1
>
> On Sat, Jun 6, 2015 at 7:55 PM, Guoqiang Li <wi...@qq.com> wrote:
>
>> +1 (non-binding)
>>
>>
>> ------------------ Original ------------------
>> *From: * "Reynold Xin";<rx...@databricks.com>;
>> *Date: * Fri, Jun 5, 2015 03:18 PM
>> *To: * "Krishna Sankar"<ks...@gmail.com>;
>> *Cc: * "Patrick Wendell"<pw...@gmail.com>; "dev@spark.apache.org"<
>> dev@spark.apache.org>;
>> *Subject: * Re: [VOTE] Release Apache Spark 1.4.0 (RC4)
>>
>> Enjoy your new shiny mbp.
>>
>> On Fri, Jun 5, 2015 at 12:10 AM, Krishna Sankar <ks...@gmail.com>
>> wrote:
>>
>>> +1 (non-binding, of course)
>>>
>>> 1. Compiled OSX 10.10 (Yosemite) OK Total time: 25:42 min (My brand new
>>> shiny MacBookPro12,1 : 16GB. Inaugurated the machine with compile & test
>>> 1.4.0-RC4 !)
>>>      mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
>>> -Dhadoop.version=2.6.0 -DskipTests
>>> 2. Tested pyspark, MLlib - running as well as comparing results with 1.3.1
>>> 2.1. statistics (min,max,mean,Pearson,Spearman) OK
>>> 2.2. Linear/Ridge/Lasso Regression OK
>>> 2.3. Decision Tree, Naive Bayes OK
>>> 2.4. KMeans OK
>>>        Center And Scale OK
>>> 2.5. RDD operations OK
>>>       State of the Union Texts - MapReduce, Filter,sortByKey (word count)
>>> 2.6. Recommendation (Movielens medium dataset ~1 M ratings) OK
>>>        Model evaluation/optimization (rank, numIter, lambda) with
>>> itertools OK
>>> 3. Scala - MLlib
>>> 3.1. statistics (min,max,mean,Pearson,Spearman) OK
>>> 3.2. LinearRegressionWithSGD OK
>>> 3.3. Decision Tree OK
>>> 3.4. KMeans OK
>>> 3.5. Recommendation (Movielens medium dataset ~1 M ratings) OK
>>> 3.6. saveAsParquetFile OK
>>> 3.7. Read and verify the 4.3 save(above) - sqlContext.parquetFile,
>>> registerTempTable, sql OK
>>> 3.8. result = sqlContext.sql("SELECT
>>> OrderDetails.OrderID,ShipCountry,UnitPrice,Qty,Discount FROM Orders INNER
>>> JOIN OrderDetails ON Orders.OrderID = OrderDetails.OrderID") OK
>>> 4.0. Spark SQL from Python OK
>>> 4.1. result = sqlContext.sql("SELECT * from people WHERE State = 'WA'")
>>> OK
>>>
>>> Cheers
>>> <k/>
>>>
>>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
>>> wrote:
>>>
>>>> Please vote on releasing the following candidate as Apache Spark
>>>> version 1.4.0!
>>>>
>>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>>>> 22596c534a38cfdda91aef18aa9037ab101e4251
>>>>
>>>> The release files, including signatures, digests, etc. can be found at:
>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>>>
>>>> Release artifacts are signed with the following key:
>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>
>>>> The staging repository for this release can be found at:
>>>> [published as version: 1.4.0]
>>>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>>>> [published as version: 1.4.0-rc4]
>>>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>>>>
>>>> The documentation corresponding to this release can be found at:
>>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>>>
>>>> Please vote on releasing this package as Apache Spark 1.4.0!
>>>>
>>>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
>>>> if a majority of at least 3 +1 PMC votes are cast.
>>>>
>>>> [ ] +1 Release this package as Apache Spark 1.4.0
>>>> [ ] -1 Do not release this package because ...
>>>>
>>>> To learn more about Apache Spark, please see
>>>> http://spark.apache.org/
>>>>
>>>> == What has changed since RC3 ==
>>>> In addition to many smaller fixes, three blocker issues were fixed:
>>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>>>> metadataHive get constructed too early
>>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>>>>
>>>> == How can I help test this release? ==
>>>> If you are a Spark user, you can help us test this release by
>>>> taking a Spark 1.3 workload and running on this release candidate,
>>>> then reporting any regressions.
>>>>
>>>> == What justifies a -1 vote for this release? ==
>>>> This vote is happening towards the end of the 1.4 QA period,
>>>> so -1 votes should only occur for significant regressions from 1.3.1.
>>>> Bugs already present in 1.3.X, minor regressions, or bugs related
>>>> to new features will not block this release.
>>>>
>>>> ---------------------------------------------------------------------
>>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>>
>>>>
>>>
>>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Joseph Bradley <jo...@databricks.com>.
+1

On Sat, Jun 6, 2015 at 7:55 PM, Guoqiang Li <wi...@qq.com> wrote:

> +1 (non-binding)
>
>
> ------------------ Original ------------------
> *From: * "Reynold Xin";<rx...@databricks.com>;
> *Date: * Fri, Jun 5, 2015 03:18 PM
> *To: * "Krishna Sankar"<ks...@gmail.com>;
> *Cc: * "Patrick Wendell"<pw...@gmail.com>; "dev@spark.apache.org"<
> dev@spark.apache.org>;
> *Subject: * Re: [VOTE] Release Apache Spark 1.4.0 (RC4)
>
> Enjoy your new shiny mbp.
>
> On Fri, Jun 5, 2015 at 12:10 AM, Krishna Sankar <ks...@gmail.com>
> wrote:
>
>> +1 (non-binding, of course)
>>
>> 1. Compiled OSX 10.10 (Yosemite) OK Total time: 25:42 min (My brand new
>> shiny MacBookPro12,1 : 16GB. Inaugurated the machine with compile & test
>> 1.4.0-RC4 !)
>>      mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
>> -Dhadoop.version=2.6.0 -DskipTests
>> 2. Tested pyspark, MLlib - running as well as comparing results with 1.3.1
>> 2.1. statistics (min,max,mean,Pearson,Spearman) OK
>> 2.2. Linear/Ridge/Lasso Regression OK
>> 2.3. Decision Tree, Naive Bayes OK
>> 2.4. KMeans OK
>>        Center And Scale OK
>> 2.5. RDD operations OK
>>       State of the Union Texts - MapReduce, Filter,sortByKey (word count)
>> 2.6. Recommendation (Movielens medium dataset ~1 M ratings) OK
>>        Model evaluation/optimization (rank, numIter, lambda) with
>> itertools OK
>> 3. Scala - MLlib
>> 3.1. statistics (min,max,mean,Pearson,Spearman) OK
>> 3.2. LinearRegressionWithSGD OK
>> 3.3. Decision Tree OK
>> 3.4. KMeans OK
>> 3.5. Recommendation (Movielens medium dataset ~1 M ratings) OK
>> 3.6. saveAsParquetFile OK
>> 3.7. Read and verify the 4.3 save(above) - sqlContext.parquetFile,
>> registerTempTable, sql OK
>> 3.8. result = sqlContext.sql("SELECT
>> OrderDetails.OrderID,ShipCountry,UnitPrice,Qty,Discount FROM Orders INNER
>> JOIN OrderDetails ON Orders.OrderID = OrderDetails.OrderID") OK
>> 4.0. Spark SQL from Python OK
>> 4.1. result = sqlContext.sql("SELECT * from people WHERE State = 'WA'") OK
>>
>> Cheers
>> <k/>
>>
>> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
>> wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 1.4.0!
>>>
>>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>>> 22596c534a38cfdda91aef18aa9037ab101e4251
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>>
>>> Release artifacts are signed with the following key:
>>> https://people.apache.org/keys/committer/pwendell.asc
>>>
>>> The staging repository for this release can be found at:
>>> [published as version: 1.4.0]
>>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>>> [published as version: 1.4.0-rc4]
>>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>>>
>>> The documentation corresponding to this release can be found at:
>>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>>
>>> Please vote on releasing this package as Apache Spark 1.4.0!
>>>
>>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
>>> if a majority of at least 3 +1 PMC votes are cast.
>>>
>>> [ ] +1 Release this package as Apache Spark 1.4.0
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see
>>> http://spark.apache.org/
>>>
>>> == What has changed since RC3 ==
>>> In addition to many smaller fixes, three blocker issues were fixed:
>>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>>> metadataHive get constructed too early
>>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>>>
>>> == How can I help test this release? ==
>>> If you are a Spark user, you can help us test this release by
>>> taking a Spark 1.3 workload and running on this release candidate,
>>> then reporting any regressions.
>>>
>>> == What justifies a -1 vote for this release? ==
>>> This vote is happening towards the end of the 1.4 QA period,
>>> so -1 votes should only occur for significant regressions from 1.3.1.
>>> Bugs already present in 1.3.X, minor regressions, or bugs related
>>> to new features will not block this release.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: dev-help@spark.apache.org
>>>
>>>
>>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Guoqiang Li <wi...@qq.com>.
+1 (non-binding)




------------------ Original ------------------
From:  "Reynold Xin";<rx...@databricks.com>;
Date:  Fri, Jun 5, 2015 03:18 PM
To:  "Krishna Sankar"<ks...@gmail.com>; 
Cc:  "Patrick Wendell"<pw...@gmail.com>; "dev@spark.apache.org"<de...@spark.apache.org>; 
Subject:  Re: [VOTE] Release Apache Spark 1.4.0 (RC4)



Enjoy your new shiny mbp.

On Fri, Jun 5, 2015 at 12:10 AM, Krishna Sankar <ks...@gmail.com> wrote:
+1 (non-binding, of course)


1. Compiled OSX 10.10 (Yosemite) OK Total time: 25:42 min (My brand new shiny MacBookPro12,1 : 16GB. Inaugurated the machine with compile & test 1.4.0-RC4 !)
     mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4 -Dhadoop.version=2.6.0 -DskipTests
2. Tested pyspark, MLlib - running as well as comparing results with 1.3.1
2.1. statistics (min,max,mean,Pearson,Spearman) OK
2.2. Linear/Ridge/Lasso Regression OK
2.3. Decision Tree, Naive Bayes OK
2.4. KMeans OK
       Center And Scale OK
2.5. RDD operations OK
      State of the Union Texts - MapReduce, Filter,sortByKey (word count)
2.6. Recommendation (Movielens medium dataset ~1 M ratings) OK
       Model evaluation/optimization (rank, numIter, lambda) with itertools OK
3. Scala - MLlib
3.1. statistics (min,max,mean,Pearson,Spearman) OK
3.2. LinearRegressionWithSGD OK
3.3. Decision Tree OK
3.4. KMeans OK
3.5. Recommendation (Movielens medium dataset ~1 M ratings) OK
3.6. saveAsParquetFile OK
3.7. Read and verify the 4.3 save(above) - sqlContext.parquetFile, registerTempTable, sql OK
3.8. result = sqlContext.sql("SELECT OrderDetails.OrderID,ShipCountry,UnitPrice,Qty,Discount FROM Orders INNER JOIN OrderDetails ON Orders.OrderID = OrderDetails.OrderID") OK
4.0. Spark SQL from Python OK
4.1. result = sqlContext.sql("SELECT * from people WHERE State = 'WA'") OK


Cheers
<k/>
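
For reference, a minimal sketch of the "Spark SQL from Python" check above (the
people table name and the State = 'WA' filter come from the checklist; the
sample rows and the name column are assumptions):

from pyspark import SparkContext
from pyspark.sql import SQLContext, Row

sc = SparkContext(appName="rc4-sql-from-python-check")  # hypothetical app name
sqlContext = SQLContext(sc)

# Register a tiny in-memory table named "people" with a State column.
people = sqlContext.createDataFrame([
    Row(name="Alice", State="WA"),
    Row(name="Bob", State="CA"),
])
people.registerTempTable("people")

# The query from the checklist above.
result = sqlContext.sql("SELECT * from people WHERE State = 'WA'")
print(result.collect())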


On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com> wrote:
Please vote on releasing the following candidate as Apache Spark version 1.4.0!
 
 The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
 https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
 22596c534a38cfdda91aef18aa9037ab101e4251
 
 The release files, including signatures, digests, etc. can be found at:
 http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
 
 Release artifacts are signed with the following key:
 https://people.apache.org/keys/committer/pwendell.asc
 
 The staging repository for this release can be found at:
 [published as version: 1.4.0]
 https://repository.apache.org/content/repositories/orgapachespark-1111/
 [published as version: 1.4.0-rc4]
 https://repository.apache.org/content/repositories/orgapachespark-1112/
 
 The documentation corresponding to this release can be found at:
 http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
 
 Please vote on releasing this package as Apache Spark 1.4.0!
 
 The vote is open until Saturday, June 06, at 05:00 UTC and passes
 if a majority of at least 3 +1 PMC votes are cast.
 
 [ ] +1 Release this package as Apache Spark 1.4.0
 [ ] -1 Do not release this package because ...
 
 To learn more about Apache Spark, please see
 http://spark.apache.org/
 
 == What has changed since RC3 ==
 In addition to many smaller fixes, three blocker issues were fixed:
 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
 metadataHive get constructed too early
 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
 
 == How can I help test this release? ==
 If you are a Spark user, you can help us test this release by
 taking a Spark 1.3 workload and running on this release candidate,
 then reporting any regressions.
 
 == What justifies a -1 vote for this release? ==
 This vote is happening towards the end of the 1.4 QA period,
 so -1 votes should only occur for significant regressions from 1.3.1.
 Bugs already present in 1.3.X, minor regressions, or bugs related
 to new features will not block this release.
 
 ---------------------------------------------------------------------
 To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
 For additional commands, e-mail: dev-help@spark.apache.org

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Reynold Xin <rx...@databricks.com>.
Enjoy your new shiny mbp.

On Fri, Jun 5, 2015 at 12:10 AM, Krishna Sankar <ks...@gmail.com> wrote:

> +1 (non-binding, of course)
>
> 1. Compiled OSX 10.10 (Yosemite) OK Total time: 25:42 min (My brand new
> shiny MacBookPro12,1 : 16GB. Inaugurated the machine with compile & test
> 1.4.0-RC4 !)
>      mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
> -Dhadoop.version=2.6.0 -DskipTests
> 2. Tested pyspark, MLlib - ran the tests as well as compared results with 1.3.1
> 2.1. statistics (min,max,mean,Pearson,Spearman) OK
> 2.2. Linear/Ridge/Lasso Regression OK
> 2.3. Decision Tree, Naive Bayes OK
> 2.4. KMeans OK
>        Center And Scale OK
> 2.5. RDD operations OK
>       State of the Union Texts - MapReduce, Filter,sortByKey (word count)
> 2.6. Recommendation (Movielens medium dataset ~1 M ratings) OK
>        Model evaluation/optimization (rank, numIter, lambda) with
> itertools OK
> 3. Scala - MLlib
> 3.1. statistics (min,max,mean,Pearson,Spearman) OK
> 3.2. LinearRegressionWithSGD OK
> 3.3. Decision Tree OK
> 3.4. KMeans OK
> 3.5. Recommendation (Movielens medium dataset ~1 M ratings) OK
> 3.6. saveAsParquetFile OK
> 3.7. Read and verify the 4.3 save(above) - sqlContext.parquetFile,
> registerTempTable, sql OK
> 3.8. result = sqlContext.sql("SELECT
> OrderDetails.OrderID,ShipCountry,UnitPrice,Qty,Discount FROM Orders INNER
> JOIN OrderDetails ON Orders.OrderID = OrderDetails.OrderID") OK
> 4.0. Spark SQL from Python OK
> 4.1. result = sqlContext.sql("SELECT * from people WHERE State = 'WA'") OK
>
> Cheers
> <k/>
>
> On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com>
> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 1.4.0!
>>
>> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
>> 22596c534a38cfdda91aef18aa9037ab101e4251
>>
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> [published as version: 1.4.0]
>> https://repository.apache.org/content/repositories/orgapachespark-1111/
>> [published as version: 1.4.0-rc4]
>> https://repository.apache.org/content/repositories/orgapachespark-1112/
>>
>> The documentation corresponding to this release can be found at:
>> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>
>> Please vote on releasing this package as Apache Spark 1.4.0!
>>
>> The vote is open until Saturday, June 06, at 05:00 UTC and passes
>> if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 1.4.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see
>> http://spark.apache.org/
>>
>> == What has changed since RC3 ==
>> In addition to many smaller fixes, three blocker issues were fixed:
>> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>> metadataHive get constructed too early
>> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>>
>> == How can I help test this release? ==
>> If you are a Spark user, you can help us test this release by
>> taking a Spark 1.3 workload and running on this release candidate,
>> then reporting any regressions.
>>
>> == What justifies a -1 vote for this release? ==
>> This vote is happening towards the end of the 1.4 QA period,
>> so -1 votes should only occur for significant regressions from 1.3.1.
>> Bugs already present in 1.3.X, minor regressions, or bugs related
>> to new features will not block this release.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
>> For additional commands, e-mail: dev-help@spark.apache.org
>>
>>
>

Re: [VOTE] Release Apache Spark 1.4.0 (RC4)

Posted by Krishna Sankar <ks...@gmail.com>.
+1 (non-binding, of course)

1. Compiled OSX 10.10 (Yosemite) OK Total time: 25:42 min (My brand new
shiny MacBookPro12,1 : 16GB. Inaugurated the machine with compile & test
1.4.0-RC4 !)
     mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4 -Dhadoop.version=2.6.0 -DskipTests
2. Tested pyspark, MLlib - ran the tests as well as compared results with 1.3.1
   (rough sketches of a few of these checks follow the list below)
2.1. statistics (min,max,mean,Pearson,Spearman) OK
2.2. Linear/Ridge/Lasso Regression OK
2.3. Decision Tree, Naive Bayes OK
2.4. KMeans OK
       Center And Scale OK
2.5. RDD operations OK
      State of the Union Texts - MapReduce, Filter, sortByKey (word count)
2.6. Recommendation (Movielens medium dataset ~1 M ratings) OK
       Model evaluation/optimization (rank, numIter, lambda) with itertools
OK
3. Scala - MLlib
3.1. statistics (min,max,mean,Pearson,Spearman) OK
3.2. LinearRegressionWithSGD OK
3.3. Decision Tree OK
3.4. KMeans OK
3.5. Recommendation (Movielens medium dataset ~1 M ratings) OK
3.6. saveAsParquetFile OK
3.7. Read and verify the 3.6 save (above) - sqlContext.parquetFile,
registerTempTable, sql OK (sketch after the list)
3.8. result = sqlContext.sql("SELECT
OrderDetails.OrderID,ShipCountry,UnitPrice,Qty,Discount FROM Orders INNER
JOIN OrderDetails ON Orders.OrderID = OrderDetails.OrderID") OK
4.0. Spark SQL from Python OK
4.1. result = sqlContext.sql("SELECT * from people WHERE State = 'WA'") OK
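
A minimal PySpark sketch of the kind of check behind 2.1 (the Scala check in
3.1 exercises the same algorithms); the input numbers are made up and this is
not the exact test script:

    from pyspark import SparkContext
    from pyspark.mllib.linalg import Vectors
    from pyspark.mllib.stat import Statistics

    sc = SparkContext(appName="stats-check")
    rows = sc.parallelize([Vectors.dense([1.0, 10.0]),
                           Vectors.dense([2.0, 20.0]),
                           Vectors.dense([3.0, 31.0])])
    summary = Statistics.colStats(rows)          # per-column min/max/mean
    print(summary.min(), summary.max(), summary.mean())
    x = sc.parallelize([1.0, 2.0, 3.0])
    y = sc.parallelize([10.0, 20.0, 31.0])
    print(Statistics.corr(x, y, method="pearson"))
    print(Statistics.corr(x, y, method="spearman"))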
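
For 2.4, a sketch of the center-and-scale step feeding KMeans, via
StandardScaler (feature values invented; reuses the SparkContext sc from the
sketch above):

    from pyspark.mllib.clustering import KMeans
    from pyspark.mllib.feature import StandardScaler
    from pyspark.mllib.linalg import Vectors

    data = sc.parallelize([Vectors.dense([1.0, 200.0]),
                           Vectors.dense([1.2, 210.0]),
                           Vectors.dense([9.8, 995.0]),
                           Vectors.dense([10.1, 1010.0])])
    scaler = StandardScaler(withMean=True, withStd=True).fit(data)
    scaled = scaler.transform(data)              # center and scale each column
    model = KMeans.train(scaled, k=2, maxIterations=10, runs=5)
    print(model.clusterCenters)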
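
The RDD operations in 2.5 amount to a word count over the State of the Union
texts; the general shape, with a placeholder input path:

    lines = sc.textFile("data/sotu/*.txt")       # placeholder path to the texts
    counts = (lines.flatMap(lambda l: l.split())
                   .filter(lambda w: w.isalpha())
                   .map(lambda w: (w.lower(), 1))
                   .reduceByKey(lambda a, b: a + b)
                   .map(lambda kv: (kv[1], kv[0]))
                   .sortByKey(ascending=False))
    print(counts.take(10))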
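
The rank/numIter/lambda sweep in 2.6 uses itertools over ALS; roughly this
shape (the ratings path, split ratio and grid values are placeholders, not the
settings actually run):

    import itertools
    from pyspark.mllib.recommendation import ALS, Rating

    raw = sc.textFile("data/ml-1m/ratings.dat")  # placeholder MovieLens path
    ratings = raw.map(lambda l: l.split("::")) \
                 .map(lambda p: Rating(int(p[0]), int(p[1]), float(p[2])))
    training, test = ratings.randomSplit([0.8, 0.2], seed=42)

    def rmse(model, data):
        pairs = data.map(lambda r: (r.user, r.product))
        preds = model.predictAll(pairs).map(lambda r: ((r.user, r.product), r.rating))
        truth = data.map(lambda r: ((r.user, r.product), r.rating))
        return (truth.join(preds)
                     .map(lambda kv: (kv[1][0] - kv[1][1]) ** 2)
                     .mean()) ** 0.5

    best = min(((rank, numIter, lmbda,
                 rmse(ALS.train(training, rank, numIter, lmbda), test))
                for rank, numIter, lmbda
                in itertools.product([8, 12], [10, 20], [0.1, 1.0])),
               key=lambda t: t[3])
    print("best (rank, numIter, lambda, RMSE): %s" % (best,))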
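
The Parquet round trip and join in 3.6-3.8 (and the Python SQL check in 4.x)
boil down to the following - shown in PySpark for brevity, the Scala DataFrame
calls carry the same names; table contents and the /tmp path are invented:

    from pyspark.sql import SQLContext, Row

    sqlContext = SQLContext(sc)
    orders = sqlContext.createDataFrame([
        Row(OrderID=1, ShipCountry="USA"),
        Row(OrderID=2, ShipCountry="Germany")])
    details = sqlContext.createDataFrame([
        Row(OrderID=1, UnitPrice=14.0, Qty=12, Discount=0.0),
        Row(OrderID=2, UnitPrice=9.8, Qty=10, Discount=0.1)])

    orders.saveAsParquetFile("/tmp/orders.parquet")        # 3.6
    back = sqlContext.parquetFile("/tmp/orders.parquet")   # 3.7
    back.registerTempTable("Orders")
    details.registerTempTable("OrderDetails")

    result = sqlContext.sql(
        "SELECT OrderDetails.OrderID, ShipCountry, UnitPrice, Qty, Discount "
        "FROM Orders INNER JOIN OrderDetails "
        "ON Orders.OrderID = OrderDetails.OrderID")        # 3.8 / 4.0
    result.show()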

Cheers
<k/>

On Tue, Jun 2, 2015 at 8:53 PM, Patrick Wendell <pw...@gmail.com> wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 1.4.0!
>
> The tag to be voted on is v1.4.0-rc3 (commit 22596c5):
> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=
> 22596c534a38cfdda91aef18aa9037ab101e4251
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> [published as version: 1.4.0]
> https://repository.apache.org/content/repositories/orgapachespark-1111/
> [published as version: 1.4.0-rc4]
> https://repository.apache.org/content/repositories/orgapachespark-1112/
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>
> Please vote on releasing this package as Apache Spark 1.4.0!
>
> The vote is open until Saturday, June 06, at 05:00 UTC and passes
> if a majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.4.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see
> http://spark.apache.org/
>
> == What has changed since RC3 ==
> In addition to many smaller fixes, three blocker issues were fixed:
> 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
> metadataHive get constructed too early
> 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
> 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>
> == How can I help test this release? ==
> If you are a Spark user, you can help us test this release by
> taking a Spark 1.3 workload and running on this release candidate,
> then reporting any regressions.
>
> == What justifies a -1 vote for this release? ==
> This vote is happening towards the end of the 1.4 QA period,
> so -1 votes should only occur for significant regressions from 1.3.1.
> Bugs already present in 1.3.X, minor regressions, or bugs related
> to new features will not block this release.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
> For additional commands, e-mail: dev-help@spark.apache.org
>
>