Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:21:31 UTC

[jira] [Updated] (SPARK-14905) create conda environments w/locked package versions

     [ https://issues.apache.org/jira/browse/SPARK-14905?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-14905:
---------------------------------
    Labels: bulk-closed  (was: )

> create conda environments w/locked package versions
> ---------------------------------------------------
>
>                 Key: SPARK-14905
>                 URL: https://issues.apache.org/jira/browse/SPARK-14905
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>            Reporter: shane knapp
>            Priority: Major
>              Labels: bulk-closed
>
> right now, the package dependency story for the jenkins build system is...  well...  non-existent.
> packages are installed, and only rarely (if ever) updated.  when a new anaconda or system python library is installed or updated for a specific user/build requirement, this can silently update and/or install other packages that may or may not be backwards compatible.
> we've survived for a number of years so far without dealing with this technical debt, but i don't see how it will remain manageable, especially as spark and other projects hosted on jenkins grow.
> example:  currently, a non-spark amplab project (e-mission) needs scipy updated from 0.15.1 to 0.17.0 for their tests to pass.  this simple upgrade adds three new python libraries (libgfortran, mkl, wheel) and updates eleven others (conda, conda-env, numpy, openssl, pip, python, pyyaml, requests, setuptools, sqlite, yaml).
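The locked-environment approach the ticket asks for can be sketched with a pinned conda environment file. This is a minimal sketch, not anything from the ticket itself: the file name, environment name, and every pin except scipy 0.17.0 (the version mentioned above) are illustrative assumptions.

```yaml
# environment.lock.yml -- hypothetical locked environment for a jenkins worker.
# only the scipy pin comes from this ticket; the rest is illustrative.
name: jenkins-emission        # hypothetical environment name
channels:
  - defaults
dependencies:
  - python=2.7                # illustrative pin
  - scipy=0.17.0              # version the e-mission tests require
  - numpy                     # left unpinned here; a real lockfile would pin it too
```

An environment like this can be rebuilt reproducibly per-project with `conda env create -f environment.lock.yml`, and the exact state of an existing environment can be captured for locking with `conda env export`, which avoids the surprise transitive upgrades described above.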



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org