Posted to issues@spark.apache.org by "shane knapp (JIRA)" <ji...@apache.org> on 2016/04/26 01:48:12 UTC

[jira] [Created] (SPARK-14905) create conda environments w/locked package versions

shane knapp created SPARK-14905:
-----------------------------------

             Summary: create conda environments w/locked package versions
                 Key: SPARK-14905
                 URL: https://issues.apache.org/jira/browse/SPARK-14905
             Project: Spark
          Issue Type: Improvement
          Components: Build
            Reporter: shane knapp


right now, the package dependency story for the jenkins build system is...  well...  non-existent.

packages are installed, and only rarely (if ever) updated.  when a new anaconda or system python library is installed or updated for a specific user/build requirement, this can pull in and/or upgrade other packages that may or may not be backwards compatible.

we've survived for a number of years so far without dealing with this technical debt, but i don't see how it will remain manageable, especially as spark and the other projects hosted on jenkins grow.

example:  currently, a non-spark amplab project (e-mission) needs scipy updated from 0.15.1 to 0.17.0 for their tests to pass.  this simple upgrade adds three new python libraries (libgfortran, mkl, wheel) and updates eleven others (conda, conda-env, numpy, openssl, pip, python, pyyaml, requests, setuptools, sqlite, yaml).
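one way to sketch the locked-environment idea: give each project its own conda environment.yml with every package pinned to an exact version, so one project's upgrade (e.g. the e-mission scipy bump above) can't silently drag other builds' dependencies along.  the file below is a hypothetical sketch -- the environment name and all version pins other than scipy 0.17.0 are illustrative, not taken from the actual jenkins config:

```yaml
# hypothetical per-project locked environment spec.
# only scipy=0.17.0 comes from the example above; every other
# name/version here is illustrative.
name: e-mission-tests
dependencies:
  - python=2.7
  - scipy=0.17.0
  - numpy
  - pyyaml
  - requests
```

a build would then do `conda env create -f environment.yml` (or `conda env update` after a reviewed change to the file) instead of mutating one shared anaconda install, and upgrades become explicit diffs to a checked-in file rather than side effects of someone else's install.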


