Posted to commits@spark.apache.org by ho...@apache.org on 2017/10/18 05:53:00 UTC
spark-website git commit: Update the release process notes to cover PyPI for the next RM.
Repository: spark-website
Updated Branches:
refs/heads/asf-site 6634f88ab -> 5e04ca053
Update the release process notes to cover PyPi for the next RM.
Update release process notes
Update change to release process html page
Switch around notes
Update release process docs to include after what
Try and improve wording a bit
Include the URLS in the link text to sort of match the style of the rest of the page
Eh looks weird
Add a note about how you can use twine as well
Generate release process file.
Update to use twine
Update the release process documentation to twine compile
s/apache/Apache/ in release-process per @srowen's comment
Update release process html page too
Update release process remove unnecessary "on the"
Project: http://git-wip-us.apache.org/repos/asf/spark-website/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark-website/commit/5e04ca05
Tree: http://git-wip-us.apache.org/repos/asf/spark-website/tree/5e04ca05
Diff: http://git-wip-us.apache.org/repos/asf/spark-website/diff/5e04ca05
Branch: refs/heads/asf-site
Commit: 5e04ca05364862c07175f206834ef8360e342632
Parents: 6634f88
Author: Holden Karau <ho...@us.ibm.com>
Authored: Sat May 6 16:37:23 2017 -0700
Committer: Holden Karau <ho...@us.ibm.com>
Committed: Tue Oct 17 22:52:50 2017 -0700
----------------------------------------------------------------------
release-process.md | 21 +++++++++++++++++++--
site/release-process.html | 17 +++++++++++++++--
2 files changed, 34 insertions(+), 4 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark-website/blob/5e04ca05/release-process.md
----------------------------------------------------------------------
diff --git a/release-process.md b/release-process.md
index f86ebaa..cd6010a 100644
--- a/release-process.md
+++ b/release-process.md
@@ -113,7 +113,7 @@ mkdir spark-1.1.1-rc2
$ sftp -r andrewor14@people.apache.org:~/public_html/spark-1.1.1-rc2/* spark-1.1.1-rc2
# NOTE: Remove any binaries you don’t want to publish
-# E.g. never push MapR and *without-hive artifacts to apache
+# E.g. never push MapR and *without-hive artifacts to Apache
$ rm spark-1.1.1-rc2/*mapr*
$ rm spark-1.1.1-rc2/*without-hive*
$ svn add spark-1.1.1-rc2
@@ -129,6 +129,23 @@ Verify that the resources are present in <a href="https://www.apache.org/dist/sp
It may take a while for them to be visible. This will be mirrored throughout the Apache network.
There are a few remaining steps.
+<h4>Upload to PyPI</h4>
+
+Uploading to PyPI is done after the release has been uploaded to Apache. To get started, go to the <a href="https://pypi.python.org">PyPI website</a> and log in with the spark-upload account (see the PMC mailing list for account permissions).
+
+
+Once you have logged in, register the new release on the <a href="https://pypi.python.org/pypi?%3Aaction=submit_form">submitting package information</a> page by uploading the PKG-INFO file from inside the pyspark packaged artifact.
+
+
+Once the release has been registered you can upload the artifacts
+to the <b>legacy</b> PyPI interface, using <a href="https://pypi.python.org/pypi/twine">twine</a>.
+If you don't have twine set up, you will need to create a .pypirc file with the repository pointing to `https://upload.pypi.org/legacy/` and the username and password of the spark-upload account.
+
+In the release directory, run `twine upload -r legacy pyspark-version.tar.gz pyspark-version.tar.gz.asc`.
+If for some reason the twine upload is incorrect (e.g. an HTTP failure or other issue), you can rename the artifact to `pyspark-version.post0.tar.gz`, delete the old artifact from PyPI, and re-upload.
+
+
+
<h4>Remove Old Releases from Mirror Network</h4>
Spark always keeps two releases in the mirror network: the most recent release on the current and
@@ -190,7 +207,7 @@ $ git checkout v1.1.1
$ cd docs
$ PRODUCTION=1 jekyll build
-# Copy the new documentation to apache
+# Copy the new documentation to Apache
$ git clone https://github.com/apache/spark-website
...
$ cp -R _site spark-website/site/docs/1.1.1
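For reference, the .pypirc mentioned in the hunk above follows the standard distutils layout. A minimal sketch (written to the current directory here rather than `~/.pypirc`, and with a placeholder where the spark-upload password from the PMC list would go):

```shell
# Sketch of the .pypirc described above. The real file belongs at
# ~/.pypirc; this sketch writes to the current directory, and the
# password line is a placeholder for the spark-upload credentials.
cat > .pypirc <<'EOF'
[distutils]
index-servers =
    legacy

[legacy]
repository: https://upload.pypi.org/legacy/
username: spark-upload
password: REPLACE_WITH_SPARK_UPLOAD_PASSWORD
EOF
```

With this in place, `twine upload -r legacy ...` resolves the `[legacy]` section and pushes to the legacy upload endpoint.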
http://git-wip-us.apache.org/repos/asf/spark-website/blob/5e04ca05/site/release-process.html
----------------------------------------------------------------------
diff --git a/site/release-process.html b/site/release-process.html
index 6261650..18e871f 100644
--- a/site/release-process.html
+++ b/site/release-process.html
@@ -316,7 +316,7 @@ mkdir spark-1.1.1-rc2
$ sftp -r andrewor14@people.apache.org:~/public_html/spark-1.1.1-rc2/* spark-1.1.1-rc2
# NOTE: Remove any binaries you don’t want to publish
-# E.g. never push MapR and *without-hive artifacts to apache
+# E.g. never push MapR and *without-hive artifacts to Apache
$ rm spark-1.1.1-rc2/*mapr*
$ rm spark-1.1.1-rc2/*without-hive*
$ svn add spark-1.1.1-rc2
@@ -332,6 +332,19 @@ $ svn mv https://dist.apache.org/repos/dist/dev/spark/spark-1.1.1-rc2 https://di
It may take a while for them to be visible. This will be mirrored throughout the Apache network.
There are a few remaining steps.</p>
+<h4>Upload to PyPI</h4>
+
+<p>Uploading to PyPI is done after the release has been uploaded to Apache. To get started, go to the <a href="https://pypi.python.org">PyPI website</a> and log in with the spark-upload account (see the PMC mailing list for account permissions).</p>
+
+<p>Once you have logged in, register the new release on the <a href="https://pypi.python.org/pypi?%3Aaction=submit_form">submitting package information</a> page by uploading the PKG-INFO file from inside the pyspark packaged artifact.</p>
+
+<p>Once the release has been registered you can upload the artifacts
+to the <b>legacy</b> PyPI interface, using <a href="https://pypi.python.org/pypi/twine">twine</a>.
+If you don’t have twine set up, you will need to create a .pypirc file with the repository pointing to <code>https://upload.pypi.org/legacy/</code> and the username and password of the spark-upload account.</p>
+
+<p>In the release directory, run <code>twine upload -r legacy pyspark-version.tar.gz pyspark-version.tar.gz.asc</code>.
+If for some reason the twine upload is incorrect (e.g. an HTTP failure or other issue), you can rename the artifact to <code>pyspark-version.post0.tar.gz</code>, delete the old artifact from PyPI, and re-upload.</p>
+
<h4>Remove Old Releases from Mirror Network</h4>
<p>Spark always keeps two releases in the mirror network: the most recent release on the current and
@@ -392,7 +405,7 @@ $ git checkout v1.1.1
$ cd docs
$ PRODUCTION=1 jekyll build
-# Copy the new documentation to apache
+# Copy the new documentation to Apache
$ git clone https://github.com/apache/spark-website
...
$ cp -R _site spark-website/site/docs/1.1.1
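The .post0 recovery step described in the diff can be sketched as a short shell sequence. `1.1.1` stands in for the real release version, and the stand-in artifacts are created only so the sketch runs end to end; renaming is safe because a detached GPG signature covers file contents, not the file name:

```shell
# Hedged sketch of recovering from a bad twine upload, per the note in
# the diff above: rename to a .post0 artifact and re-upload.
VERSION=1.1.1   # placeholder for the real release version

# stand-in artifacts so the sketch is runnable end to end
touch "pyspark-${VERSION}.tar.gz" "pyspark-${VERSION}.tar.gz.asc"

# rename both the tarball and its detached signature
mv "pyspark-${VERSION}.tar.gz" "pyspark-${VERSION}.post0.tar.gz"
mv "pyspark-${VERSION}.tar.gz.asc" "pyspark-${VERSION}.post0.tar.gz.asc"

# delete the old artifact through the PyPI web interface, then:
# twine upload -r legacy "pyspark-${VERSION}.post0.tar.gz" "pyspark-${VERSION}.post0.tar.gz.asc"
```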