Posted to issues@spark.apache.org by "holdenk (JIRA)" <ji...@apache.org> on 2016/11/03 03:59:58 UTC

[jira] [Commented] (SPARK-18128) Add support for publishing to PyPI

    [ https://issues.apache.org/jira/browse/SPARK-18128?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15631463#comment-15631463 ] 

holdenk commented on SPARK-18128:
---------------------------------

Extracted from the discussion around SPARK-1267:

People who are officially allowed to make releases will need to register on PyPI and PyPI test, create .pypirc files with their credentials, and be added to the "pyspark" or "apache-pyspark" project (depending on which name is chosen); the release script will also need to be updated slightly. Code-wise, the changes required for SPARK-18128 are relatively minor: whatever renaming of the package may be required, adding a shell variable to control which PyPI server is published to, and switching sdist to sdist upload during the publish step. A sketch of both pieces follows.
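
As a rough illustration of what that could look like (the file contents, repository URLs, and the PYPI_REPO variable name below are assumptions for the sketch, not the actual release-script change): a release manager's ~/.pypirc would list both index servers, and the release script would pick one via a shell variable when uploading.

    [distutils]
    index-servers =
        pypi
        pypitest

    [pypi]
    repository = https://upload.pypi.org/legacy/
    username = <pypi-username>
    password = <pypi-password>

    [pypitest]
    repository = https://test.pypi.org/legacy/
    username = <pypi-username>
    password = <pypi-password>

The publish step would then change along these lines (run from the directory containing PySpark's setup.py):

    # PYPI_REPO is a hypothetical variable name; defaulting to the test
    # server keeps accidental runs off the real index.
    PYPI_REPO=${PYPI_REPO:-pypitest}
    # "sdist upload" builds the source distribution and uploads it to the
    # .pypirc section named by -r, instead of just building it locally.
    python setup.py sdist upload -r "$PYPI_REPO"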

> Add support for publishing to PyPI
> ----------------------------------
>
>                 Key: SPARK-18128
>                 URL: https://issues.apache.org/jira/browse/SPARK-18128
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: holdenk
>
> After SPARK-1267 is done, we should add support for publishing to PyPI, similar to how we publish to Maven Central.
> Note: one of the open questions is what to do about the package name, since someone has already registered the name PySpark on PyPI - we could use ApachePySpark, or we could try to find who registered PySpark and get them to transfer it to us (since they haven't published anything, that may be fine?).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org