Posted to commits@beam.apache.org by "Ahmet Altay (JIRA)" <ji...@apache.org> on 2017/04/04 17:32:41 UTC

[jira] [Commented] (BEAM-680) Python Dataflow stages stale requirements-cache dependencies

    [ https://issues.apache.org/jira/browse/BEAM-680?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15955471#comment-15955471 ] 

Ahmet Altay commented on BEAM-680:
----------------------------------

The solution to this is to use the `requirements_cache` flag, which can be pointed at a clean directory.

Without that flag there is no clean solution:
- Cleaning up the cache would result in re-downloading the same requirements.
- pip does not provide a way to list transitive dependencies (short of what we are already doing).
- Uploading extra files to staging is generally harmless (other than bloat), whereas the opposite (e.g., a missing dependency) would cause failures at execution time.
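To illustrate the workaround, here is a minimal sketch of launching a pipeline with `requirements_cache` pointed at a fresh directory, so only the packages from requirements.txt get downloaded and staged. The script name `my_pipeline.py` and the other flags are placeholders for illustration; `--requirements_cache` and `--requirements_file` are the actual Beam Python SDK options.

```python
import subprocess
import tempfile

# Create a fresh, empty cache directory so only the dependencies listed in
# requirements.txt (and their transitive dependencies) are downloaded and
# staged for this run -- nothing left over from earlier runs.
cache_dir = tempfile.mkdtemp(prefix="beam-reqs-cache-")

cmd = [
    "python", "my_pipeline.py",          # hypothetical pipeline script
    "--runner", "DataflowRunner",
    "--requirements_file", "requirements.txt",
    "--requirements_cache", cache_dir,   # point the SDK at the clean dir
]
# subprocess.check_call(cmd)  # uncomment to actually launch the pipeline
```

The trade-off is that a clean directory forgoes caching entirely, so every run re-downloads its requirements, as noted above.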

> Python Dataflow stages stale requirements-cache dependencies
> ------------------------------------------------------------
>
>                 Key: BEAM-680
>                 URL: https://issues.apache.org/jira/browse/BEAM-680
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-py
>            Reporter: Scott Wegner
>            Priority: Minor
>
> When executing a Python pipeline with a requirements.txt file, the Dataflow runner stages all dependencies found in its requirements cache directory: those specified in requirements.txt, plus any previously cached dependencies. This results in a bloated staging directory if previous pipeline runs from the same machine included different dependencies.
> Repro:
> # Initialize a virtualenv and pip install apache_beam
> # Create an empty requirements.txt file
> # Create a simple pipeline using DataflowPipelineRunner and a requirements.txt file, for example: [my_pipeline.py|https://gist.github.com/swegner/6df00df1423b48206c4ab5a7e917218a]
> # {{touch /tmp/dataflow-requirements-cache/extra-file.txt}}
> # Run the pipeline with a specified staging directory
> # Check the staged files for the job
> 'extra-file.txt' will be uploaded with the job, along with any other cached dependencies under /tmp/dataflow-requirements-cache.
> We should only be staging the dependencies necessary for a pipeline, not all previously-cached dependencies found on the machine.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)