Posted to dev@spark.apache.org by Felix Cheung <fe...@apache.org> on 2017/12/01 08:10:18 UTC

[RESULT][VOTE] Spark 2.2.1 (RC2)

This vote passes. Thanks everyone for testing this release.


+1:

Sean Owen (binding)

Herman van Hövell tot Westerflier (binding)

Wenchen Fan (binding)

Shivaram Venkataraman (binding)

Felix Cheung

Henry Robinson

Hyukjin Kwon

Dongjoon Hyun

Kazuaki Ishizaki

Holden Karau

Weichen Xu


0: None

-1: None




On Wed, Nov 29, 2017 at 3:21 PM Weichen Xu <we...@databricks.com>
wrote:

> +1
>
> On Thu, Nov 30, 2017 at 6:27 AM, Shivaram Venkataraman <
> shivaram@eecs.berkeley.edu> wrote:
>
>> +1
>>
>> SHA, MD5 and signatures look fine. Built and ran Maven tests on my
>> Macbook.
>>
>> Thanks
>> Shivaram
>>
>> On Wed, Nov 29, 2017 at 10:43 AM, Holden Karau <ho...@pigscanfly.ca>
>> wrote:
>>
>>> +1 (non-binding)
>>>
>>> PySpark install into a virtualenv works, PKG-INFO looks correctly
>>> populated (mostly checking for the pypandoc conversion there).
>>>
>>> Thanks for your hard work Felix (and all of the testers :)) :)
>>>
>>> On Wed, Nov 29, 2017 at 9:33 AM, Wenchen Fan <cl...@gmail.com>
>>> wrote:
>>>
>>>> +1
>>>>
>>>> On Thu, Nov 30, 2017 at 1:28 AM, Kazuaki Ishizaki <IS...@jp.ibm.com>
>>>> wrote:
>>>>
>>>>> +1 (non-binding)
>>>>>
>>>>> I tested it on Ubuntu 16.04 and OpenJDK8 on ppc64le. All of the tests
>>>>> for core/sql-core/sql-catalyst/mllib/mllib-local have passed.
>>>>>
>>>>> $ java -version
>>>>> openjdk version "1.8.0_131"
>>>>> OpenJDK Runtime Environment (build
>>>>> 1.8.0_131-8u131-b11-2ubuntu1.16.04.3-b11)
>>>>> OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)
>>>>>
>>>>> % build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7
>>>>> -T 24 clean package install
>>>>> % build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl
>>>>> core -pl 'sql/core' -pl 'sql/catalyst' -pl mllib -pl mllib-local
>>>>> ...
>>>>> Run completed in 13 minutes, 54 seconds.
>>>>> Total number of tests run: 1118
>>>>> Suites: completed 170, aborted 0
>>>>> Tests: succeeded 1118, failed 0, canceled 0, ignored 6, pending 0
>>>>> All tests passed.
>>>>> [INFO]
>>>>> ------------------------------------------------------------------------
>>>>> [INFO] Reactor Summary:
>>>>> [INFO]
>>>>> [INFO] Spark Project Core ................................. SUCCESS
>>>>> [17:13 min]
>>>>> [INFO] Spark Project ML Local Library ..................... SUCCESS [
>>>>>  6.065 s]
>>>>> [INFO] Spark Project Catalyst ............................. SUCCESS
>>>>> [11:51 min]
>>>>> [INFO] Spark Project SQL .................................. SUCCESS
>>>>> [17:55 min]
>>>>> [INFO] Spark Project ML Library ........................... SUCCESS
>>>>> [17:05 min]
>>>>> [INFO]
>>>>> ------------------------------------------------------------------------
>>>>> [INFO] BUILD SUCCESS
>>>>> [INFO]
>>>>> ------------------------------------------------------------------------
>>>>> [INFO] Total time: 01:04 h
>>>>> [INFO] Finished at: 2017-11-30T01:48:15+09:00
>>>>> [INFO] Final Memory: 128M/329M
>>>>> [INFO]
>>>>> ------------------------------------------------------------------------
>>>>> [WARNING] The requested profile "hive" could not be activated because
>>>>> it does not exist.
>>>>>
>>>>> Kazuaki Ishizaki
>>>>>
>>>>>
>>>>>
>>>>> From:        Dongjoon Hyun <do...@gmail.com>
>>>>> To:        Hyukjin Kwon <gu...@gmail.com>
>>>>> Cc:        Spark dev list <de...@spark.apache.org>, Felix Cheung <
>>>>> felixcheung@apache.org>, Sean Owen <so...@cloudera.com>
>>>>> Date:        2017/11/29 12:56
>>>>> Subject:        Re: [VOTE] Spark 2.2.1 (RC2)
>>>>> ------------------------------
>>>>>
>>>>>
>>>>>
>>>>> +1 (non-binding)
>>>>>
>>>>> RC2 is tested on CentOS, too.
>>>>>
>>>>> Bests,
>>>>> Dongjoon.
>>>>>
>>>>> On Tue, Nov 28, 2017 at 4:35 PM, Hyukjin Kwon <gurwls223@gmail.com> wrote:
>>>>> +1
>>>>>
>>>>> 2017-11-29 8:18 GMT+09:00 Henry Robinson <henry@apache.org>:
>>>>> (My vote is non-binding, of course).
>>>>>
>>>>> On 28 November 2017 at 14:53, Henry Robinson <henry@apache.org> wrote:
>>>>> +1, tests all pass for me on Ubuntu 16.04.
>>>>>
>>>>> On 28 November 2017 at 10:36, Herman van Hövell tot Westerflier <hvanhovell@databricks.com> wrote:
>>>>> +1
>>>>>
>>>>> On Tue, Nov 28, 2017 at 7:35 PM, Felix Cheung <felixcheung@apache.org> wrote:
>>>>> +1
>>>>>
>>>>> Thanks Sean. Please vote!
>>>>>
>>>>> Tested various scenarios with R package. Ubuntu, Debian, Windows
>>>>> r-devel and release and on r-hub. Verified CRAN checks are clean (only 1
>>>>> NOTE!) and no leaked files (.cache removed, /tmp clean)
>>>>>
>>>>>
>>>>> On Sun, Nov 26, 2017 at 11:55 AM Sean Owen <sowen@cloudera.com> wrote:
>>>>> Yes it downloads recent releases. The test worked for me on a second
>>>>> try, so I suspect a bad mirror. If this comes up frequently we can just add
>>>>> retry logic, as the closer.lua script will return different mirrors each
>>>>> time.
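[The retry idea mentioned above could be sketched roughly as follows. This is a hedged illustration, not the suite's actual code; the helper names `retry_n` and `download_spark` are made up for the example. Re-resolving the mirror inside the retried command means each attempt gets a potentially different mirror from closer.lua.]

```shell
# retry_n N CMD... : run CMD up to N times, stopping at the first success.
retry_n() {
  n="$1"; shift
  i=1
  while [ "$i" -le "$n" ]; do
    if "$@"; then
      return 0
    fi
    echo "attempt $i failed, retrying" >&2
    i=$((i + 1))
  done
  return 1
}

# Illustrative download step: closer.lua can return a different preferred
# mirror on each call, so fetching it fresh here gives every retry a new
# mirror to try.
download_spark() {
  version="$1"
  mirror=$(wget -q -O - "https://www.apache.org/dyn/closer.lua?preferred=true")
  wget -q "$mirror/spark/spark-$version/spark-$version-bin-hadoop2.7.tgz"
}

# Example (not run here): retry_n 3 download_spark 2.0.2
```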
>>>>>
>>>>> The tests all pass for me on the latest Debian, so +1 for this release.
>>>>>
>>>>> (I committed the change to set -Xss4m for tests consistently, but this
>>>>> shouldn't block a release.)
>>>>>
>>>>>
>>>>> On Sat, Nov 25, 2017 at 12:47 PM Felix Cheung <felixcheung@apache.org> wrote:
>>>>> Ah, sorry, digging through the history it looks like this was changed
>>>>> relatively recently and should only download previous releases.
>>>>>
>>>>> Perhaps we are intermittently hitting a mirror that doesn’t have the
>>>>> files?
>>>>>
>>>>>
>>>>> https://github.com/apache/spark/commit/daa838b8886496e64700b55d1301d348f1d5c9ae
>>>>>
>>>>>
>>>>>
>>>>> On Sat, Nov 25, 2017 at 10:36 AM Felix Cheung <felixcheung@apache.org> wrote:
>>>>> Thanks Sean.
>>>>>
>>>>> For the second one, it looks like the
>>>>> HiveExternalCatalogVersionsSuite is trying to download the release tgz
>>>>> from the official Apache mirror, which won’t work unless the release is
>>>>> actually released:
>>>>>
>>>>> val preferredMirror =
>>>>>   Seq("wget", "https://www.apache.org/dyn/closer.lua?preferred=true",
>>>>>     "-q", "-O", "-").!!.trim
>>>>> val url =
>>>>>   s"$preferredMirror/spark/spark-$version/spark-$version-bin-hadoop2.7.tgz"
>>>>>
>>>>> It’s probably getting an error page instead.
>>>>>
>>>>>
>>>>> On Sat, Nov 25, 2017 at 10:28 AM Sean Owen <sowen@cloudera.com> wrote:
>>>>> I hit the same StackOverflowError as in the previous RC test, but,
>>>>> pretty sure this is just because the increased thread stack size JVM flag
>>>>> isn't applied consistently. This seems to resolve it:
>>>>>
>>>>> https://github.com/apache/spark/pull/19820
>>>>>
>>>>> This wouldn't block release IMHO.
>>>>>
>>>>>
>>>>> I am currently investigating this failure though -- seems like the
>>>>> mechanism that downloads Spark tarballs needs fixing, or updating, in the
>>>>> 2.2 branch?
>>>>>
>>>>> HiveExternalCatalogVersionsSuite:
>>>>>
>>>>> gzip: stdin: not in gzip format
>>>>>
>>>>> tar: Child returned status 1
>>>>>
>>>>> tar: Error is not recoverable: exiting now
>>>>>
>>>>> *** RUN ABORTED ***
>>>>>
>>>>>   java.io.IOException: Cannot run program "./bin/spark-submit" (in
>>>>> directory "/tmp/test-spark/spark-2.0.2"): error=2, No such file or directory
>>>>>
>>>>> On Sat, Nov 25, 2017 at 12:34 AM Felix Cheung <felixcheung@apache.org> wrote:
>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>> version 2.2.1. The vote is open until Friday December 1, 2017 at
>>>>> 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes
>>>>> are cast.
>>>>>
>>>>>
>>>>> [ ] +1 Release this package as Apache Spark 2.2.1
>>>>>
>>>>> [ ] -1 Do not release this package because ...
>>>>>
>>>>>
>>>>> To learn more about Apache Spark, please see
>>>>> https://spark.apache.org/
>>>>>
>>>>>
>>>>> The tag to be voted on is v2.2.1-rc2
>>>>> https://github.com/apache/spark/tree/v2.2.1-rc2
>>>>>   (e30e2698a2193f0bbdcd4edb884710819ab6397c)
>>>>>
>>>>> List of JIRA tickets resolved in this release can be found here
>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12340470
>>>>>
>>>>>
>>>>> The release files, including signatures, digests, etc. can be found at:
>>>>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-bin/
>>>>>
>>>>> Release artifacts are signed with the following key:
>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>
>>>>> The staging repository for this release can be found at:
>>>>>
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1257/
>>>>>
>>>>> The documentation corresponding to this release can be found at:
>>>>>
>>>>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-docs/_site/index.html
>>>>>
>>>>>
>>>>> *FAQ*
>>>>>
>>>>> *How can I help test this release?*
>>>>>
>>>>> If you are a Spark user, you can help us test this release by taking
>>>>> an existing Spark workload, running it on this release candidate, and
>>>>> reporting any regressions.
>>>>>
>>>>> If you're working in PySpark, you can set up a virtual env, install
>>>>> the current RC, and see if anything important breaks; in Java/Scala, you
>>>>> can add the staging repository to your project's resolvers and test with the
>>>>> RC (make sure to clean up the artifact cache before/after so you don't end
>>>>> up building with an out-of-date RC going forward).
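[For the Java/Scala route described above, wiring the staging repository into an sbt build might look like the following. This is a hedged sketch: the resolver name is arbitrary, and the staging URL is the orgapachespark-1257 repository listed earlier in this message.]

```scala
// Sketch only: point sbt at the RC staging repository and depend on the
// RC artifacts. Clear the local artifact cache (e.g. ~/.ivy2/cache)
// afterwards so later builds don't pick up the stale RC.
resolvers += "apache-spark-rc-staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1257/"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"
```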
>>>>>
>>>>> *What should happen to JIRA tickets still targeting 2.2.1?*
>>>>>
>>>>> Committers should look at those and triage. Extremely important bug
>>>>> fixes, documentation, and API tweaks that impact compatibility should be
>>>>> worked on immediately. Everything else please retarget to 2.2.2.
>>>>>
>>>>> *But my bug isn't fixed!??!*
>>>>>
>>>>> In order to make timely releases, we will typically not hold the
>>>>> release unless the bug in question is a regression from 2.2.0. That being
>>>>> said, if there is something which is a regression from 2.2.0 that has not
>>>>> been correctly targeted, please ping a committer to help target the issue
>>>>> (you can see the open issues listed as impacting Spark 2.2.1 / 2.2.2 here:
>>>>> https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.2.1%20OR%20affectedVersion%20%3D%202.2.2)
>>>>> ).
>>>>>
>>>>> *What are the unresolved issues targeted for 2.2.1?*
>>>>> https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20%22Target%20Version%2Fs%22%20%3D%202.2.1
>>>>>
>>>>> At the time of this writing, there is one intermittent failure,
>>>>> SPARK-20201 (https://issues.apache.org/jira/browse/SPARK-20201), which
>>>>> we have been tracking since 2.2.0.
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> Twitter: https://twitter.com/holdenkarau
>>>
>>
>>
>

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by Holden Karau <ho...@pigscanfly.ca>.
I think final publishing being PMC-only (with maybe the person who set it
up being an exception) is generally the case for each of the languages
(e.g. Maven requires a PMC member to do the final push, the dist download
requires a final svn mv by a PMC member, etc.).

On Thu, Dec 14, 2017 at 1:38 PM, Felix Cheung <fe...@apache.org>
wrote:

> ;)
> The credential for the user that publishes to PyPI is PMC-only.
>
> +Holden
>
> Had discussed this in the other thread I sent to private@ last week.
>
>
> On Thu, Dec 14, 2017 at 4:34 AM Sean Owen <so...@cloudera.com> wrote:
>
>> On the various access questions here -- what do you need to have that
>> access? We definitely need to give you all necessary access if you're the
>> release manager!
>>
>>
>> On Thu, Dec 14, 2017 at 6:32 AM Felix Cheung <fe...@apache.org>
>> wrote:
>>
>>> And I don’t have access to publish python.
>>>
>>> On Wed, Dec 13, 2017 at 9:55 AM Shivaram Venkataraman <
>>> shivaram@eecs.berkeley.edu> wrote:
>>>
>>>> The R artifacts have some issue that Felix and I are debugging. Let's
>>>> not block the announcement for that.
>>>>
>>>>
>>>>


-- 
Twitter: https://twitter.com/holdenkarau

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by Felix Cheung <fe...@apache.org>.
;)
The credential for the user that publishes to PyPI is PMC-only.

+Holden

Had discussed this in the other thread I sent to private@ last week.


On Thu, Dec 14, 2017 at 4:34 AM Sean Owen <so...@cloudera.com> wrote:

> On the various access questions here -- what do you need to have that
> access? We definitely need to give you all necessary access if you're the
> release manager!
>
>
> On Thu, Dec 14, 2017 at 6:32 AM Felix Cheung <fe...@apache.org>
> wrote:
>
>> And I don’t have access to publish python.
>>
>> On Wed, Dec 13, 2017 at 9:55 AM Shivaram Venkataraman <
>> shivaram@eecs.berkeley.edu> wrote:
>>
>>> The R artifacts have some issue that Felix and I are debugging. Let's not
>>> block the announcement for that.
>>>
>>>
>>>

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by Sean Owen <so...@cloudera.com>.
On the various access questions here -- what do you need to have that
access? We definitely need to give you all necessary access if you're the
release manager!

On Thu, Dec 14, 2017 at 6:32 AM Felix Cheung <fe...@apache.org> wrote:

> And I don’t have access to publish python.
>
> On Wed, Dec 13, 2017 at 9:55 AM Shivaram Venkataraman <
> shivaram@eecs.berkeley.edu> wrote:
>
>> The R artifacts have some issue that Felix and I are debugging. Let's not
>> block the announcement for that.
>>
>>
>>

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by Felix Cheung <fe...@apache.org>.
And I don’t have access to publish python.

On Wed, Dec 13, 2017 at 9:55 AM Shivaram Venkataraman <
shivaram@eecs.berkeley.edu> wrote:

> The R artifacts have some issue that Felix and I are debugging. Let's not
> block the announcement for that.
>
> Thanks
>
> Shivaram
>
> On Wed, Dec 13, 2017 at 5:59 AM, Sean Owen <so...@cloudera.com> wrote:
>
>> Looks like Maven artifacts are up, site's up -- what about the Python and
>> R artifacts?
>> I can also move the spark.apache.org/docs/latest link to point to 2.2.1 if
>> it's pretty ready.
>> We should announce the release officially too then.
>>
>> On Wed, Dec 6, 2017 at 5:00 PM Felix Cheung <fe...@apache.org>
>> wrote:
>>
>>> I saw the svn move on Monday so I’m working on the website updates.
>>>
>>> I will look into Maven today. I will ask for help if I can’t do it.
>>>
>>>
>>> On Wed, Dec 6, 2017 at 10:49 AM Sean Owen <so...@cloudera.com> wrote:
>>>
>>>> Pardon, did this release finish? I don't see it in Maven. I know there
>>>> was some question about getting a hand in finishing the release process,
>>>> including copying artifacts in svn. Was there anything else you're waiting
>>>> on someone to do?
>>>>
>>>>
>>>> On Fri, Dec 1, 2017 at 2:10 AM Felix Cheung <fe...@apache.org>
>>>> wrote:
>>>>
>>>>> This vote passes. Thanks everyone for testing this release.
>>>>>
>>>>>
>>>>>
>>>>
>

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by Shivaram Venkataraman <sh...@eecs.berkeley.edu>.
The R artifacts have some issue that Felix and I are debugging. Let's not
block the announcement for that.

Thanks
Shivaram

On Wed, Dec 13, 2017 at 5:59 AM, Sean Owen <so...@cloudera.com> wrote:

> Looks like Maven artifacts are up, site's up -- what about the Python and
> R artifacts?
> I can also move the spark.apache.org/docs/latest link to point to 2.2.1 if
> it's pretty ready.
> We should announce the release officially too then.
>
> On Wed, Dec 6, 2017 at 5:00 PM Felix Cheung <fe...@apache.org>
> wrote:
>
>> I saw the svn move on Monday so I’m working on the website updates.
>>
>> I will look into Maven today. I will ask for help if I can’t do it.
>>
>>
>> On Wed, Dec 6, 2017 at 10:49 AM Sean Owen <so...@cloudera.com> wrote:
>>
>>> Pardon, did this release finish? I don't see it in Maven. I know there
>>> was some question about getting a hand in finishing the release process,
>>> including copying artifacts in svn. Was there anything else you're waiting
>>> on someone to do?
>>>
>>>
>>> On Fri, Dec 1, 2017 at 2:10 AM Felix Cheung <fe...@apache.org>
>>> wrote:
>>>
>>>> This vote passes. Thanks everyone for testing this release.
>>>>
>>>>
>>>>
>>>

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by Sean Owen <so...@cloudera.com>.
Looks like Maven artifacts are up, site's up -- what about the Python and R
artifacts?
I can also move the spark.apache.org/docs/latest link to point to 2.2.1 if it's
pretty ready.
We should announce the release officially too then.

On Wed, Dec 6, 2017 at 5:00 PM Felix Cheung <fe...@apache.org> wrote:

> I saw the svn move on Monday so I’m working on the website updates.
>
> I will look into Maven today. I will ask for help if I can’t do it.
>
>
> On Wed, Dec 6, 2017 at 10:49 AM Sean Owen <so...@cloudera.com> wrote:
>
>> Pardon, did this release finish? I don't see it in Maven. I know there
>> was some question about getting a hand in finishing the release process,
>> including copying artifacts in svn. Was there anything else you're waiting
>> on someone to do?
>>
>>
>> On Fri, Dec 1, 2017 at 2:10 AM Felix Cheung <fe...@apache.org>
>> wrote:
>>
>>> This vote passes. Thanks everyone for testing this release.
>>>
>>>
>>>
>>

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by Felix Cheung <fe...@apache.org>.
I saw the svn move on Monday so I’m working on the website updates.

I will look into Maven today. I will ask for help if I can’t do it.


On Wed, Dec 6, 2017 at 10:49 AM Sean Owen <so...@cloudera.com> wrote:

> Pardon, did this release finish? I don't see it in Maven. I know there was
> some question about getting a hand in finishing the release process,
> including copying artifacts in svn. Was there anything else you're waiting
> on someone to do?
>
>
> On Fri, Dec 1, 2017 at 2:10 AM Felix Cheung <fe...@apache.org>
> wrote:
>
>> This vote passes. Thanks everyone for testing this release.
>>
>>
>>
>

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by Sean Owen <so...@cloudera.com>.
Pardon, did this release finish? I don't see it in Maven. I know there was
some question about getting a hand in finishing the release process,
including copying artifacts in svn. Was there anything else you're waiting
on someone to do?

On Fri, Dec 1, 2017 at 2:10 AM Felix Cheung <fe...@apache.org> wrote:

> This vote passes. Thanks everyone for testing this release.
>
>
>

Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by zzc <44...@qq.com>.
Hi dev, the latest Spark doc is still at version 2.2.0. When will the 2.2.1
docs be published?



--
Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: [RESULT][VOTE] Spark 2.2.1 (RC2)

Posted by Reynold Xin <rx...@databricks.com>.
Congrats.


On Fri, Dec 1, 2017 at 12:10 AM, Felix Cheung <fe...@apache.org>
wrote:

> This vote passes. Thanks everyone for testing this release.
>
>
> +1:
>
> Sean Owen (binding)
>
> Herman van Hövell tot Westerflier (binding)
>
> Wenchen Fan (binding)
>
> Shivaram Venkataraman (binding)
>
> Felix Cheung
>
> Henry Robinson
>
> Hyukjin Kwon
>
> Dongjoon Hyun
>
> Kazuaki Ishizaki
>
> Holden Karau
>
> Weichen Xu
>
>
> 0: None
>
> -1: None
>
>
>
>
> On Wed, Nov 29, 2017 at 3:21 PM Weichen Xu <we...@databricks.com>
> wrote:
>
>> +1
>>
>> On Thu, Nov 30, 2017 at 6:27 AM, Shivaram Venkataraman <
>> shivaram@eecs.berkeley.edu> wrote:
>>
>>> +1
>>>
>>> SHA, MD5 and signatures look fine. Built and ran Maven tests on my
>>> Macbook.
>>>
>>> Thanks
>>> Shivaram
>>>
>>> On Wed, Nov 29, 2017 at 10:43 AM, Holden Karau <ho...@pigscanfly.ca>
>>> wrote:
>>>
>>>> +1 (non-binding)
>>>>
>>>> PySpark install into a virtualenv works, PKG-INFO looks correctly
>>>> populated (mostly checking for the pypandoc conversion there).
>>>>
>>>> Thanks for your hard work Felix (and all of the testers :)) :)
>>>>
>>>> On Wed, Nov 29, 2017 at 9:33 AM, Wenchen Fan <cl...@gmail.com>
>>>> wrote:
>>>>
>>>>> +1
>>>>>
>>>>> On Thu, Nov 30, 2017 at 1:28 AM, Kazuaki Ishizaki <ISHIZAKI@jp.ibm.com
>>>>> > wrote:
>>>>>
>>>>>> +1 (non-binding)
>>>>>>
>>>>>> I tested it on Ubuntu 16.04 and OpenJDK8 on ppc64le. All of the tests
>>>>>> for core/sql-core/sql-catalyst/mllib/mllib-local have passed.
>>>>>>
>>>>>> $ java -version
>>>>>> openjdk version "1.8.0_131"
>>>>>> OpenJDK Runtime Environment (build 1.8.0_131-8u131-b11-2ubuntu1.
>>>>>> 16.04.3-b11)
>>>>>> OpenJDK 64-Bit Server VM (build 25.131-b11, mixed mode)
>>>>>>
>>>>>> % build/mvn -DskipTests -Phive -Phive-thriftserver -Pyarn
>>>>>> -Phadoop-2.7 -T 24 clean package install
>>>>>> % build/mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.7 test -pl
>>>>>> core -pl 'sql/core' -pl 'sql/catalyst' -pl mllib -pl mllib-local
>>>>>> ...
>>>>>> Run completed in 13 minutes, 54 seconds.
>>>>>> Total number of tests run: 1118
>>>>>> Suites: completed 170, aborted 0
>>>>>> Tests: succeeded 1118, failed 0, canceled 0, ignored 6, pending 0
>>>>>> All tests passed.
>>>>>> [INFO] ------------------------------------------------------------
>>>>>> ------------
>>>>>> [INFO] Reactor Summary:
>>>>>> [INFO]
>>>>>> [INFO] Spark Project Core ................................. SUCCESS
>>>>>> [17:13 min]
>>>>>> [INFO] Spark Project ML Local Library ..................... SUCCESS [
>>>>>>  6.065 s]
>>>>>> [INFO] Spark Project Catalyst ............................. SUCCESS
>>>>>> [11:51 min]
>>>>>> [INFO] Spark Project SQL .................................. SUCCESS
>>>>>> [17:55 min]
>>>>>> [INFO] Spark Project ML Library ........................... SUCCESS
>>>>>> [17:05 min]
>>>>>> [INFO] ------------------------------------------------------------
>>>>>> ------------
>>>>>> [INFO] BUILD SUCCESS
>>>>>> [INFO] ------------------------------------------------------------
>>>>>> ------------
>>>>>> [INFO] Total time: 01:04 h
>>>>>> [INFO] Finished at: 2017-11-30T01:48:15+09:00
>>>>>> [INFO] Final Memory: 128M/329M
>>>>>> [INFO] ------------------------------------------------------------
>>>>>> ------------
>>>>>> [WARNING] The requested profile "hive" could not be activated because
>>>>>> it does not exist.
>>>>>>
>>>>>> Kazuaki Ishizaki
>>>>>>
>>>>>>
>>>>>>
>>>>>> From:        Dongjoon Hyun <do...@gmail.com>
>>>>>> To:        Hyukjin Kwon <gu...@gmail.com>
>>>>>> Cc:        Spark dev list <de...@spark.apache.org>, Felix Cheung <
>>>>>> felixcheung@apache.org>, Sean Owen <so...@cloudera.com>
>>>>>> Date:        2017/11/29 12:56
>>>>>> Subject:        Re: [VOTE] Spark 2.2.1 (RC2)
>>>>>> ------------------------------
>>>>>>
>>>>>>
>>>>>>
>>>>>> +1 (non-binding)
>>>>>>
>>>>>> RC2 is tested on CentOS, too.
>>>>>>
>>>>>> Bests,
>>>>>> Dongjoon.
>>>>>>
>>>>>> On Tue, Nov 28, 2017 at 4:35 PM, Hyukjin Kwon <*gurwls223@gmail.com*
>>>>>> <gu...@gmail.com>> wrote:
>>>>>> +1
>>>>>>
>>>>>> 2017-11-29 8:18 GMT+09:00 Henry Robinson <*henry@apache.org*
>>>>>> <he...@apache.org>>:
>>>>>> (My vote is non-binding, of course).
>>>>>>
>>>>>> On 28 November 2017 at 14:53, Henry Robinson <*henry@apache.org*
>>>>>> <he...@apache.org>> wrote:
>>>>>> +1, tests all pass for me on Ubuntu 16.04.
>>>>>>
>>>>>> On 28 November 2017 at 10:36, Herman van Hövell tot Westerflier <
>>>>>> *hvanhovell@databricks.com* <hv...@databricks.com>> wrote:
>>>>>> +1
>>>>>>
>>>>>> On Tue, Nov 28, 2017 at 7:35 PM, Felix Cheung <
>>>>>> *felixcheung@apache.org* <fe...@apache.org>> wrote:
>>>>>> +1
>>>>>>
>>>>>> Thanks Sean. Please vote!
>>>>>>
>>>>>> Tested various scenarios with R package. Ubuntu, Debian, Windows
>>>>>> r-devel and release and on r-hub. Verified CRAN checks are clean (only 1
>>>>>> NOTE!) and no leaked files (.cache removed, /tmp clean)
>>>>>>
>>>>>>
>>>>>> On Sun, Nov 26, 2017 at 11:55 AM Sean Owen <*sowen@cloudera.com*
>>>>>> <so...@cloudera.com>> wrote:
>>>>>> Yes it downloads recent releases. The test worked for me on a second
>>>>>> try, so I suspect a bad mirror. If this comes up frequently we can just add
>>>>>> retry logic, as the closer.lua script will return different mirrors each
>>>>>> time.
>>>>>>
>>>>>> The tests all pass for me on the latest Debian, so +1 for this
>>>>>> release.
>>>>>>
>>>>>> (I committed the change to set -Xss4m for tests consistently, but
>>>>>> this shouldn't block a release.)
>>>>>>
>>>>>>
>>>>>> On Sat, Nov 25, 2017 at 12:47 PM Felix Cheung <
>>>>>> *felixcheung@apache.org* <fe...@apache.org>> wrote:
>>>>>> Ah sorry digging through the history it looks like this is changed
>>>>>> relatively recently and should only download previous releases.
>>>>>>
>>>>>> Perhaps we are intermittently hitting a mirror that doesn’t have the
>>>>>> files?
>>>>>>
>>>>>>
>>>>>> *https://github.com/apache/spark/commit/daa838b8886496e64700b55d1301d348f1d5c9ae*
>>>>>> <https://urldefense.proofpoint.com/v2/url?u=https-3A__github.com_apache_spark_commit_daa838b8886496e64700b55d1301d348f1d5c9ae&d=DwMFaQ&c=jf_iaSHvJObTbx-siA1ZOg&r=b70dG_9wpCdZSkBJahHYQ4IwKMdp2hQM29f-ZCGj9Pg&m=WVNCcWfajx4adpKcHE0NxxV2_-fDqHOsX2gxULbm7Hs&s=hs95TxtmzYWnoHYZjf51e_CNPW0Lxe1DnqZms2h_ChQ&e=>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sat, Nov 25, 2017 at 10:36 AM Felix Cheung <
>>>>>> *felixcheung@apache.org* <fe...@apache.org>> wrote:
>>>>>> Thanks Sean.
>>>>>>
>>>>>> For the second one, it looks like the  HiveExternalCatalogVersionsSuite is
>>>>>> trying to download the release tgz from the official Apache mirror, which
>>>>>> won’t work unless the release is actually, released?
>>>>>> valpreferredMirror=
>>>>>>
>>>>>> Seq("wget", "*https://www.apache.org/dyn/closer.lua?preferred=true*
>>>>>> <https://urldefense.proofpoint.com/v2/url?u=https-3A__www.apache.org_dyn_closer.lua-3Fpreferred-3Dtrue&d=DwMFaQ&c=jf_iaSHvJObTbx-siA1ZOg&r=b70dG_9wpCdZSkBJahHYQ4IwKMdp2hQM29f-ZCGj9Pg&m=WVNCcWfajx4adpKcHE0NxxV2_-fDqHOsX2gxULbm7Hs&s=-ySYsEWnZhSg0bpbCeefR_JDKa0cO1tHCW5CJe_AiP0&e=>
>>>>>> ", "-q", "-O", "-").!!.trim
>>>>>> valurl=s"$preferredMirror/spark/spark-$version/spark-$
>>>>>> version-bin-hadoop2.7.tgz"
>>>>>>
>>>>>>
>>>>>>
>>>>>> It’s proabbly getting an error page instead.
>>>>>>
>>>>>>
>>>>>> On Sat, Nov 25, 2017 at 10:28 AM Sean Owen <*sowen@cloudera.com*
>>>>>> <so...@cloudera.com>> wrote:
>>>>>> I hit the same StackOverflowError as in the previous RC test, but,
>>>>>> pretty sure this is just because the increased thread stack size JVM flag
>>>>>> isn't applied consistently. This seems to resolve it:
>>>>>>
>>>>>> https://github.com/apache/spark/pull/19820
>>>>>>
>>>>>> This wouldn't block release IMHO.
>>>>>>
>>>>>>
>>>>>> I am currently investigating this failure though -- seems like the
>>>>>> mechanism that downloads Spark tarballs needs fixing, or updating, in the
>>>>>> 2.2 branch?
>>>>>>
>>>>>> HiveExternalCatalogVersionsSuite:
>>>>>>
>>>>>> gzip: stdin: not in gzip format
>>>>>>
>>>>>> tar: Child returned status 1
>>>>>>
>>>>>> tar: Error is not recoverable: exiting now
>>>>>>
>>>>>> *** RUN ABORTED ***
>>>>>>
>>>>>>   java.io.IOException: Cannot run program "./bin/spark-submit" (in
>>>>>> directory "/tmp/test-spark/spark-2.0.2"): error=2, No such file or
>>>>>> directory
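>>>>>> The "not in gzip format" error is consistent with the mirror returning
>>>>>> an HTML error page instead of the tarball. A hedged sketch (not part of
>>>>>> the suite) of checking the gzip magic bytes before handing the file to tar:

```python
def looks_like_gzip(first_bytes: bytes) -> bool:
    # Every gzip stream starts with the two magic bytes 0x1f 0x8b (RFC 1952).
    return first_bytes[:2] == b"\x1f\x8b"

# An HTML error page fails the check; a real .tgz passes it.
print(looks_like_gzip(b"<html><head><title>404"))  # False
print(looks_like_gzip(b"\x1f\x8b\x08\x00"))        # True
```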
>>>>>>
>>>>>> On Sat, Nov 25, 2017 at 12:34 AM Felix Cheung <felixcheung@apache.org> wrote:
>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>> version 2.2.1. The vote is open until Friday December 1, 2017 at
>>>>>> 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes
>>>>>> are cast.
>>>>>>
>>>>>>
>>>>>> [ ] +1 Release this package as Apache Spark 2.2.1
>>>>>>
>>>>>> [ ] -1 Do not release this package because ...
>>>>>>
>>>>>>
>>>>>> To learn more about Apache Spark, please see
>>>>>> https://spark.apache.org/
>>>>>>
>>>>>>
>>>>>> The tag to be voted on is v2.2.1-rc2
>>>>>> https://github.com/apache/spark/tree/v2.2.1-rc2
>>>>>>   (e30e2698a2193f0bbdcd4edb884710819ab6397c)
>>>>>>
>>>>>> List of JIRA tickets resolved in this release can be found here
>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12340470
>>>>>>
>>>>>>
>>>>>> The release files, including signatures, digests, etc. can be found
>>>>>> at:
>>>>>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-bin/
>>>>>>
>>>>>> Release artifacts are signed with the following key:
>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>
>>>>>> The staging repository for this release can be found at:
>>>>>>
>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1257/
>>>>>>
>>>>>> The documentation corresponding to this release can be found at:
>>>>>>
>>>>>> https://dist.apache.org/repos/dist/dev/spark/spark-2.2.1-rc2-docs/_site/index.html
>>>>>>
>>>>>>
>>>>>> *FAQ*
>>>>>>
>>>>>> *How can I help test this release?*
>>>>>>
>>>>>> If you are a Spark user, you can help us test this release by taking
>>>>>> an existing Spark workload, running it on this release candidate, and
>>>>>> reporting any regressions.
>>>>>>
>>>>>> If you're working in PySpark, you can set up a virtual env, install
>>>>>> the current RC, and see if anything important breaks. In Java/Scala, you
>>>>>> can add the staging repository to your project's resolvers and test with
>>>>>> the RC (make sure to clean up the artifact cache before and after so you
>>>>>> don't end up building with an out-of-date RC going forward).
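>>>>>> For the Maven case, pointing a project at the staging repository listed
>>>>>> above might look like this pom.xml fragment (a sketch; the repository id
>>>>>> is an arbitrary label):

```xml
<!-- Sketch: resolve the RC artifacts from the staging repo; <id> is arbitrary -->
<repositories>
  <repository>
    <id>spark-2.2.1-rc2-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1257/</url>
  </repository>
</repositories>
```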
>>>>>>
>>>>>> *What should happen to JIRA tickets still targeting 2.2.1?*
>>>>>>
>>>>>> Committers should look at those and triage. Extremely important bug
>>>>>> fixes, documentation, and API tweaks that impact compatibility should be
>>>>>> worked on immediately. Everything else please retarget to 2.2.2.
>>>>>>
>>>>>> *But my bug isn't fixed!??!*
>>>>>>
>>>>>> In order to make timely releases, we will typically not hold the
>>>>>> release unless the bug in question is a regression from 2.2.0. That being
>>>>>> said, if there is something which is a regression from 2.2.0 that has not
>>>>>> been correctly targeted, please ping a committer to help target the issue
>>>>>> (you can see the open issues listed as impacting Spark 2.2.1 / 2.2.2 here:
>>>>>> https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.2.1%20OR%20affectedVersion%20%3D%202.2.2)
>>>>>> ).
>>>>>>
>>>>>> *What are the unresolved issues targeted for 2.2.1?*
>>>>>> https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20%22Target%20Version%2Fs%22%20%3D%202.2.1
>>>>>>
>>>>>> At the time of writing, there is one intermittent failure,
>>>>>> SPARK-20201 (https://issues.apache.org/jira/browse/SPARK-20201), which
>>>>>> we have been tracking since 2.2.0.
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Twitter: https://twitter.com/holdenkarau
>>>>
>>>
>>>
>>