Posted to commits@buildstream.apache.org by tv...@apache.org on 2022/04/05 10:04:15 UTC

[buildstream-plugins] branch master created (now 9620631)

This is an automated email from the ASF dual-hosted git repository.

tvb pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git


      at 9620631  Merge pull request #3 from gtristan/tristan/use-zip-from-bst-plugins-experimental

This branch includes the following new commits:

     new eea7cf9  Adding LICENSE, NOTICE, COMMITTERS.rst
     new d23642b  Adding initial __init__.py files defining the plugin library structure
     new fee253c  Initially adding cargo source
     new 8ab4718  Initially adding pip source
     new 15381b5  Initially adding bzr source
     new 3ad513f  Initially adding patch source
     new 2d439b5  Initially adding git source
     new d848a24  Initially adding docker source
     new 6e51cbc  Initially adding cmake element
     new 52c3ab0  Initially adding meson element
     new 2e26118  Initially adding make element
     new 6c4a5f6  Initially adding pip element
     new c108d3e  Initially adding setuptools element
     new f803536  Initially adding autotools element
     new eac4b69  Added project.conf to allow using these plugins with a junction
     new 325b77e  requirements/: Adding initial requirements.txt files
     new 98baba4  MANIFEST.in: Adding initial manifest
     new f460744  README.rst: Adding initial README
     new 0071b70  setup.py: Adding initial setup.py
     new 511ddc4  setup.cfg: Adding initial setup.cfg
     new 10f5d1c  Adding pyproject.toml
     new 8695302  .pylintrc: Adding initial pylint configuration
     new 8e16746  tox.ini: Adding initial tox.ini
     new 8bd6f6b  tests: Adding initial tests
     new ad335d2  tests/sources/pip.py: Adding pip source test
     new d35af6e  tests/sources/patch.py: Adding tests for patch source
     new ddeceeb  tests/sources/git.py: Adding tests for git source
     new eb93327  tests/sources/bzr.py: Adding tests for bzr source
     new 091af01  tests/elements/autotools.py: Adding autotools element test
     new 9de643f  tests/elements/cmake.py: Adding cmake element test
     new d151fcd  tests/elements/meson.py: Adding meson element tests
     new 2ee8462  tests/elements/make.py: Adding tests for make element
     new 8b030e9  tests/elements/pip.py: Adding tests for pip element
     new 810506e  tests/elements/setuptools.py: Adding tests for setuptools element
     new 1a11f45  tests/sources/pip_build.py: Adding test which builds using the pip source
     new c74456f  doc: Adding docs build.
     new 1ba6cc3  tox.ini: Adding mypy static type checking
     new 09adcfc  .github/CODEOWNERS: Adding CODEOWNERS file
     new d54c1ef  .github: Adding run-ci.sh and the ci.docker-compose.yml
     new 1c51f01  .github/workflows: Adding the ci/merge/release workflows
     new 2ca29ca  .github/workflows: Use ubuntu 18.04 instead of 20.04
     new 817e24b  tox.ini: Updating black to version 22.3.0
     new ce223fc  Merge pull request #2 from gtristan/tristan/fix-ci-ubuntu-version
     new 921d1a1  tests/elements/{cmake,meson}: Use zip source from bst-plugins-experimental
     new a17e02a  tests/elements/{cmake,meson}: Use git source from buildstream-plugins
     new 5ade411  tests/sources/git: Use locally defined git source
     new 0720a37  tests/sources/patch: Use locally defined patch source
     new 8f80351  tests/sources/bzr: Use locally defined bzr source
     new 9620631  Merge pull request #3 from gtristan/tristan/use-zip-from-bst-plugins-experimental

The 49 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[buildstream-plugins] 30/49: tests/elements/cmake.py: Adding cmake element test

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 9de643f3b6887519ec893c38aa675c56eecd0d22
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 15:54:00 2022 +0900

    tests/elements/cmake.py: Adding cmake element test
---
 tests/elements/cmake.py                            |  67 +++++++++++++++++++++
 tests/elements/cmake/elements/base.bst             |   6 ++
 .../elements/cmake/elements/base/alpine-image.bst  |   6 ++
 .../cmake/elements/base/base-configure.bst         |  28 +++++++++
 .../elements/cmake/elements/base/install-dpkg.bst  |  15 +++++
 tests/elements/cmake/elements/base/ninja.bst       |  15 +++++
 .../elements/cmake/elements/cmakeconfroothello.bst |  15 +++++
 tests/elements/cmake/elements/cmakehello.bst       |  10 +++
 tests/elements/cmake/files/cmakehello.tar.gz       | Bin 0 -> 10240 bytes
 tests/elements/cmake/project.conf                  |  15 +++++
 10 files changed, 177 insertions(+)

diff --git a/tests/elements/cmake.py b/tests/elements/cmake.py
new file mode 100644
index 0000000..a409ac1
--- /dev/null
+++ b/tests/elements/cmake.py
@@ -0,0 +1,67 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import pytest
+
+from buildstream._testing.runcli import cli_integration as cli  # pylint: disable=unused-import
+from buildstream._testing.integration import integration_cache  # pylint: disable=unused-import
+from buildstream._testing.integration import assert_contains
+from buildstream._testing._utils.site import HAVE_SANDBOX
+
+pytestmark = pytest.mark.integration
+
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "cmake")
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_cmake_build(cli, datafiles):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_name = "cmakehello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    assert result.exit_code == 0
+
+    assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_cmake_confroot_build(cli, datafiles):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_name = "cmakeconfroothello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    assert result.exit_code == 0
+
+    assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_cmake_run(cli, datafiles):
+    project = str(datafiles)
+    element_name = "cmakehello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["shell", element_name, "/usr/bin/hello"])
+    assert result.exit_code == 0
+
+    assert (
+        result.output
+        == """Hello World!
+This is hello.
+"""
+    )
diff --git a/tests/elements/cmake/elements/base.bst b/tests/elements/cmake/elements/base.bst
new file mode 100644
index 0000000..5907dd5
--- /dev/null
+++ b/tests/elements/cmake/elements/base.bst
@@ -0,0 +1,6 @@
+kind: stack
+depends:
+- base/install-dpkg.bst
+- base/base-configure.bst
+- base/alpine-image.bst
+- base/ninja.bst
diff --git a/tests/elements/cmake/elements/base/alpine-image.bst b/tests/elements/cmake/elements/base/alpine-image.bst
new file mode 100644
index 0000000..f8e00ba
--- /dev/null
+++ b/tests/elements/cmake/elements/base/alpine-image.bst
@@ -0,0 +1,6 @@
+kind: import
+description: Import an alpine image as the platform
+sources:
+- kind: tar
+  url: alpine:integration-tests-base.v1.x86_64.tar.xz
+  ref: 3eb559250ba82b64a68d86d0636a6b127aa5f6d25d3601a79f79214dc9703639
diff --git a/tests/elements/cmake/elements/base/base-configure.bst b/tests/elements/cmake/elements/base/base-configure.bst
new file mode 100644
index 0000000..5323004
--- /dev/null
+++ b/tests/elements/cmake/elements/base/base-configure.bst
@@ -0,0 +1,28 @@
+kind: script
+depends:
+- filename: base/install-dpkg.bst
+  type: build
+
+variables:
+  install-root: /
+
+config:
+
+  commands:
+  - |
+    # Avoid some chowns which fail at dpkg configure time
+    #
+    mv /bin/chown /bin/chown.real
+    ln -s true /bin/chown
+
+  - |
+    # This is expected to fail, but it will configure everything we need,
+    # at least for the purpose of building; other dpkg scripts which
+    # require real root privileges will always fail here.
+    DEBIAN_FRONTEND=noninteractive dpkg --configure -a --abort-after=100000 || exit 0
+
+  - |
+    # Restore chown
+    #
+    rm -f /bin/chown
+    mv /bin/chown.real /bin/chown
diff --git a/tests/elements/cmake/elements/base/install-dpkg.bst b/tests/elements/cmake/elements/base/install-dpkg.bst
new file mode 100644
index 0000000..858044a
--- /dev/null
+++ b/tests/elements/cmake/elements/base/install-dpkg.bst
@@ -0,0 +1,15 @@
+kind: manual
+depends:
+- filename: base/alpine-image.bst
+  type: build
+sources:
+- kind: git
+  url: https://gitlab.com/BuildStream/buildstream-sysroots.git
+  track: dpkg-build
+  ref: ecf14954e4298ce5495f701464339162fad73f30
+config:
+  install-commands:
+  - tar xf dpkg-build-sysroot.tar.xz -C %{install-root} --no-same-owner
+  strip-commands:
+    # For some reason, the strip commands were hanging...
+    - echo "none"
diff --git a/tests/elements/cmake/elements/base/ninja.bst b/tests/elements/cmake/elements/base/ninja.bst
new file mode 100644
index 0000000..74afb16
--- /dev/null
+++ b/tests/elements/cmake/elements/base/ninja.bst
@@ -0,0 +1,15 @@
+kind: manual
+
+depends:
+- filename: base/alpine-image.bst
+
+config:
+  install-commands:
+  - |
+    install -D -m 0755 ninja %{install-root}%{bindir}/ninja
+
+sources:
+- kind: zip
+  url: https://github.com/ninja-build/ninja/releases/download/v1.9.0/ninja-linux.zip
+  ref: 1b1235f2b0b4df55ac6d80bbe681ea3639c9d2c505c7ff2159a3daf63d196305
+  base-dir: ''
diff --git a/tests/elements/cmake/elements/cmakeconfroothello.bst b/tests/elements/cmake/elements/cmakeconfroothello.bst
new file mode 100644
index 0000000..cd33dee
--- /dev/null
+++ b/tests/elements/cmake/elements/cmakeconfroothello.bst
@@ -0,0 +1,15 @@
+kind: cmake
+description: Cmake test
+
+depends:
+  - base.bst
+
+sources:
+  - kind: tar
+    directory: Source
+    url: project_dir:/files/cmakehello.tar.gz
+    ref: 508266f40dbc5875293bd24c4e50a9eb6b88cbacab742033f7b92f8c087b64e5
+
+variables:
+  conf-root: "%{build-root}/Source"
+  command-subdir: build
diff --git a/tests/elements/cmake/elements/cmakehello.bst b/tests/elements/cmake/elements/cmakehello.bst
new file mode 100644
index 0000000..c5fe496
--- /dev/null
+++ b/tests/elements/cmake/elements/cmakehello.bst
@@ -0,0 +1,10 @@
+kind: cmake
+description: Cmake test
+
+depends:
+  - base.bst
+
+sources:
+  - kind: tar
+    url: project_dir:/files/cmakehello.tar.gz
+    ref: 508266f40dbc5875293bd24c4e50a9eb6b88cbacab742033f7b92f8c087b64e5
diff --git a/tests/elements/cmake/files/cmakehello.tar.gz b/tests/elements/cmake/files/cmakehello.tar.gz
new file mode 100644
index 0000000..54d9505
Binary files /dev/null and b/tests/elements/cmake/files/cmakehello.tar.gz differ
diff --git a/tests/elements/cmake/project.conf b/tests/elements/cmake/project.conf
new file mode 100644
index 0000000..bdcf99b
--- /dev/null
+++ b/tests/elements/cmake/project.conf
@@ -0,0 +1,15 @@
+# test project config
+name: test
+min-version: 2.0
+
+element-path: elements
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  elements:
+  - cmake
+
+aliases:
+  alpine: https://bst-integration-test-images.ams3.cdn.digitaloceanspaces.com/
+  project_dir: file://{project_dir}
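
Each element test in this commit follows the same pattern: build the
element, check the resulting artifact out to disk, then assert on its
contents. Distilled to a minimal sketch, where "example.bst" and the
asserted path are placeholders rather than files from this project, the
pattern looks like this:

# Minimal sketch of the test pattern used above; "example.bst" and the
# asserted path are placeholders, not part of this repository.
import os
import pytest

from buildstream._testing.runcli import cli_integration as cli  # pylint: disable=unused-import
from buildstream._testing.integration import assert_contains

pytestmark = pytest.mark.integration

DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "example")


@pytest.mark.datafiles(DATA_DIR)
def test_element_build(cli, datafiles):
    project = str(datafiles)
    checkout = os.path.join(cli.directory, "checkout")

    # Build the element, then check the built artifact out to disk
    result = cli.run(project=project, args=["build", "example.bst"])
    assert result.exit_code == 0

    result = cli.run(project=project, args=["artifact", "checkout", "example.bst", "--directory", checkout])
    assert result.exit_code == 0

    # The checkout should contain the files the element installed
    assert_contains(checkout, ["/usr/bin/hello"])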


[buildstream-plugins] 02/49: Adding initial __init__.py files defining the plugin library structure

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit d23642ba20b47c3e713f87f0daf908b4bcaace45
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 16:57:44 2022 +0900

    Adding initial __init__.py files defining the plugin library structure
---
 src/buildstream_plugins/__init__.py          | 0
 src/buildstream_plugins/elements/__init__.py | 0
 src/buildstream_plugins/sources/__init__.py  | 0
 3 files changed, 0 insertions(+), 0 deletions(-)

diff --git a/src/buildstream_plugins/__init__.py b/src/buildstream_plugins/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/src/buildstream_plugins/elements/__init__.py b/src/buildstream_plugins/elements/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/src/buildstream_plugins/sources/__init__.py b/src/buildstream_plugins/sources/__init__.py
new file mode 100644
index 0000000..e69de29
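
These empty packages define the importable layout (buildstream_plugins.elements
and buildstream_plugins.sources) that the pip plugin origin in the test
project.conf shown earlier refers to. As a purely hypothetical sketch of a
module dropped into this structure (illustrative names only; a real source
also implements preflight(), get_unique_key(), fetch(), stage() and the
ref handling methods), a file such as
src/buildstream_plugins/sources/mysource.py would contain:

# Hypothetical skeleton, not a plugin from this repository.
from buildstream import Source


class MySource(Source):

    def configure(self, node):
        # Read plugin-specific configuration from the element YAML
        self.url = node.get_str("url")


# BuildStream loads the plugin class through this module-level hook
def setup():
    return MySource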


[buildstream-plugins] 01/49: Adding LICENSE, NOTICE, COMMITTERS.rst

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit eea7cf9605555db9aab94c31f871aafe3303eb55
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 13:43:46 2022 +0900

    Adding LICENSE, NOTICE, COMMITTERS.rst
---
 COMMITTERS.rst |  24 +++++++
 LICENSE        | 202 +++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 NOTICE         |   5 ++
 3 files changed, 231 insertions(+)

diff --git a/COMMITTERS.rst b/COMMITTERS.rst
new file mode 100644
index 0000000..b1c3536
--- /dev/null
+++ b/COMMITTERS.rst
@@ -0,0 +1,24 @@
+.. _committers:
+
+Committers
+==========
+
+Full commit access
+-------------------
+List of people with full commit access, i.e. blanket commit access to
+the BuildStream plugins codebase. Note that this is not a full list of all
+contributors.
+
++-----------------------------------+-----------------------------------+
+| Full Name                         | GitHub User                       |
++===================================+===================================+
+| Tristan Van Berkom                | gtristan                          |
++-----------------------------------+-----------------------------------+
+| Jürg Billeter                     | juergbi                           |
++-----------------------------------+-----------------------------------+
+| Chandan Singh                     | cs-shadow                         |
++-----------------------------------+-----------------------------------+
+| Benjamin Schubert                 | BenjaminSchubert                  |
++-----------------------------------+-----------------------------------+
+| Abderrahim Kitouni                | abderrahim                        |
++-----------------------------------+-----------------------------------+
diff --git a/LICENSE b/LICENSE
new file mode 100644
index 0000000..d645695
--- /dev/null
+++ b/LICENSE
@@ -0,0 +1,202 @@
+
+                                 Apache License
+                           Version 2.0, January 2004
+                        http://www.apache.org/licenses/
+
+   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
+
+   1. Definitions.
+
+      "License" shall mean the terms and conditions for use, reproduction,
+      and distribution as defined by Sections 1 through 9 of this document.
+
+      "Licensor" shall mean the copyright owner or entity authorized by
+      the copyright owner that is granting the License.
+
+      "Legal Entity" shall mean the union of the acting entity and all
+      other entities that control, are controlled by, or are under common
+      control with that entity. For the purposes of this definition,
+      "control" means (i) the power, direct or indirect, to cause the
+      direction or management of such entity, whether by contract or
+      otherwise, or (ii) ownership of fifty percent (50%) or more of the
+      outstanding shares, or (iii) beneficial ownership of such entity.
+
+      "You" (or "Your") shall mean an individual or Legal Entity
+      exercising permissions granted by this License.
+
+      "Source" form shall mean the preferred form for making modifications,
+      including but not limited to software source code, documentation
+      source, and configuration files.
+
+      "Object" form shall mean any form resulting from mechanical
+      transformation or translation of a Source form, including but
+      not limited to compiled object code, generated documentation,
+      and conversions to other media types.
+
+      "Work" shall mean the work of authorship, whether in Source or
+      Object form, made available under the License, as indicated by a
+      copyright notice that is included in or attached to the work
+      (an example is provided in the Appendix below).
+
+      "Derivative Works" shall mean any work, whether in Source or Object
+      form, that is based on (or derived from) the Work and for which the
+      editorial revisions, annotations, elaborations, or other modifications
+      represent, as a whole, an original work of authorship. For the purposes
+      of this License, Derivative Works shall not include works that remain
+      separable from, or merely link (or bind by name) to the interfaces of,
+      the Work and Derivative Works thereof.
+
+      "Contribution" shall mean any work of authorship, including
+      the original version of the Work and any modifications or additions
+      to that Work or Derivative Works thereof, that is intentionally
+      submitted to Licensor for inclusion in the Work by the copyright owner
+      or by an individual or Legal Entity authorized to submit on behalf of
+      the copyright owner. For the purposes of this definition, "submitted"
+      means any form of electronic, verbal, or written communication sent
+      to the Licensor or its representatives, including but not limited to
+      communication on electronic mailing lists, source code control systems,
+      and issue tracking systems that are managed by, or on behalf of, the
+      Licensor for the purpose of discussing and improving the Work, but
+      excluding communication that is conspicuously marked or otherwise
+      designated in writing by the copyright owner as "Not a Contribution."
+
+      "Contributor" shall mean Licensor and any individual or Legal Entity
+      on behalf of whom a Contribution has been received by Licensor and
+      subsequently incorporated within the Work.
+
+   2. Grant of Copyright License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      copyright license to reproduce, prepare Derivative Works of,
+      publicly display, publicly perform, sublicense, and distribute the
+      Work and such Derivative Works in Source or Object form.
+
+   3. Grant of Patent License. Subject to the terms and conditions of
+      this License, each Contributor hereby grants to You a perpetual,
+      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+      (except as stated in this section) patent license to make, have made,
+      use, offer to sell, sell, import, and otherwise transfer the Work,
+      where such license applies only to those patent claims licensable
+      by such Contributor that are necessarily infringed by their
+      Contribution(s) alone or by combination of their Contribution(s)
+      with the Work to which such Contribution(s) was submitted. If You
+      institute patent litigation against any entity (including a
+      cross-claim or counterclaim in a lawsuit) alleging that the Work
+      or a Contribution incorporated within the Work constitutes direct
+      or contributory patent infringement, then any patent licenses
+      granted to You under this License for that Work shall terminate
+      as of the date such litigation is filed.
+
+   4. Redistribution. You may reproduce and distribute copies of the
+      Work or Derivative Works thereof in any medium, with or without
+      modifications, and in Source or Object form, provided that You
+      meet the following conditions:
+
+      (a) You must give any other recipients of the Work or
+          Derivative Works a copy of this License; and
+
+      (b) You must cause any modified files to carry prominent notices
+          stating that You changed the files; and
+
+      (c) You must retain, in the Source form of any Derivative Works
+          that You distribute, all copyright, patent, trademark, and
+          attribution notices from the Source form of the Work,
+          excluding those notices that do not pertain to any part of
+          the Derivative Works; and
+
+      (d) If the Work includes a "NOTICE" text file as part of its
+          distribution, then any Derivative Works that You distribute must
+          include a readable copy of the attribution notices contained
+          within such NOTICE file, excluding those notices that do not
+          pertain to any part of the Derivative Works, in at least one
+          of the following places: within a NOTICE text file distributed
+          as part of the Derivative Works; within the Source form or
+          documentation, if provided along with the Derivative Works; or,
+          within a display generated by the Derivative Works, if and
+          wherever such third-party notices normally appear. The contents
+          of the NOTICE file are for informational purposes only and
+          do not modify the License. You may add Your own attribution
+          notices within Derivative Works that You distribute, alongside
+          or as an addendum to the NOTICE text from the Work, provided
+          that such additional attribution notices cannot be construed
+          as modifying the License.
+
+      You may add Your own copyright statement to Your modifications and
+      may provide additional or different license terms and conditions
+      for use, reproduction, or distribution of Your modifications, or
+      for any such Derivative Works as a whole, provided Your use,
+      reproduction, and distribution of the Work otherwise complies with
+      the conditions stated in this License.
+
+   5. Submission of Contributions. Unless You explicitly state otherwise,
+      any Contribution intentionally submitted for inclusion in the Work
+      by You to the Licensor shall be under the terms and conditions of
+      this License, without any additional terms or conditions.
+      Notwithstanding the above, nothing herein shall supersede or modify
+      the terms of any separate license agreement you may have executed
+      with Licensor regarding such Contributions.
+
+   6. Trademarks. This License does not grant permission to use the trade
+      names, trademarks, service marks, or product names of the Licensor,
+      except as required for reasonable and customary use in describing the
+      origin of the Work and reproducing the content of the NOTICE file.
+
+   7. Disclaimer of Warranty. Unless required by applicable law or
+      agreed to in writing, Licensor provides the Work (and each
+      Contributor provides its Contributions) on an "AS IS" BASIS,
+      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+      implied, including, without limitation, any warranties or conditions
+      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+      PARTICULAR PURPOSE. You are solely responsible for determining the
+      appropriateness of using or redistributing the Work and assume any
+      risks associated with Your exercise of permissions under this License.
+
+   8. Limitation of Liability. In no event and under no legal theory,
+      whether in tort (including negligence), contract, or otherwise,
+      unless required by applicable law (such as deliberate and grossly
+      negligent acts) or agreed to in writing, shall any Contributor be
+      liable to You for damages, including any direct, indirect, special,
+      incidental, or consequential damages of any character arising as a
+      result of this License or out of the use or inability to use the
+      Work (including but not limited to damages for loss of goodwill,
+      work stoppage, computer failure or malfunction, or any and all
+      other commercial damages or losses), even if such Contributor
+      has been advised of the possibility of such damages.
+
+   9. Accepting Warranty or Additional Liability. While redistributing
+      the Work or Derivative Works thereof, You may choose to offer,
+      and charge a fee for, acceptance of support, warranty, indemnity,
+      or other liability obligations and/or rights consistent with this
+      License. However, in accepting such obligations, You may act only
+      on Your own behalf and on Your sole responsibility, not on behalf
+      of any other Contributor, and only if You agree to indemnify,
+      defend, and hold each Contributor harmless for any liability
+      incurred by, or claims asserted against, such Contributor by reason
+      of your accepting any such warranty or additional liability.
+
+   END OF TERMS AND CONDITIONS
+
+   APPENDIX: How to apply the Apache License to your work.
+
+      To apply the Apache License to your work, attach the following
+      boilerplate notice, with the fields enclosed by brackets "[]"
+      replaced with your own identifying information. (Don't include
+      the brackets!)  The text should be enclosed in the appropriate
+      comment syntax for the file format. We also recommend that a
+      file or class name and description of purpose be included on the
+      same "printed page" as the copyright notice for easier
+      identification within third-party archives.
+
+   Copyright [yyyy] [name of copyright owner]
+
+   Licensed under the Apache License, Version 2.0 (the "License");
+   you may not use this file except in compliance with the License.
+   You may obtain a copy of the License at
+
+       http://www.apache.org/licenses/LICENSE-2.0
+
+   Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
diff --git a/NOTICE b/NOTICE
new file mode 100644
index 0000000..8266ca0
--- /dev/null
+++ b/NOTICE
@@ -0,0 +1,5 @@
+Apache BuildStream
+Copyright 2021 The Apache Software Foundation
+
+This product includes software developed at
+The Apache Software Foundation (http://www.apache.org/).


[buildstream-plugins] 28/49: tests/sources/bzr.py: Adding tests for bzr source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit eb933270680383c52d4b40c5fcb3fb0a8a1d0b1d
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 15:00:07 2022 +0900

    tests/sources/bzr.py: Adding tests for bzr source
---
 tests/sources/bzr.py           | 69 ++++++++++++++++++++++++++++++++++++++++++
 tests/sources/bzr/basic/test   |  1 +
 tests/sources/bzr/project.conf |  3 ++
 3 files changed, 73 insertions(+)

diff --git a/tests/sources/bzr.py b/tests/sources/bzr.py
new file mode 100644
index 0000000..4688423
--- /dev/null
+++ b/tests/sources/bzr.py
@@ -0,0 +1,69 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import subprocess
+
+import pytest
+
+from buildstream import _yaml
+from buildstream._testing import cli  # pylint: disable=unused-import
+from buildstream._testing import create_repo
+from buildstream._testing import generate_element
+
+from tests.testutils.site import HAVE_BZR
+
+pytestmark = pytest.mark.skipif(HAVE_BZR is False, reason="bzr is not available")
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "bzr")
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR))
+def test_fetch_checkout(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    repo = create_repo("bzr", str(tmpdir))
+    ref = repo.create(os.path.join(project, "basic"))
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config(ref=ref)]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch, build, checkout
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    assert result.exit_code == 0
+    result = cli.run(project=project, args=["build", "target.bst"])
+    assert result.exit_code == 0
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    assert result.exit_code == 0
+
+    # Assert we checked out the file as it was committed
+    with open(os.path.join(checkoutdir, "test"), encoding="utf-8") as f:
+        text = f.read()
+
+    assert text == "test\n"
+
+
+@pytest.mark.datafiles(DATA_DIR)
+def test_open_bzr_customize(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    repo = create_repo("bzr", str(tmpdir))
+    ref = repo.create(os.path.join(project, "basic"))
+
+    element = {"kind": "import", "sources": [repo.source_config(ref=ref)]}
+    generate_element(project, "target.bst", element)
+
+    workspace = os.path.join(datafiles, "bzr-workspace")
+    result = cli.run(cwd=project, project=project, args=["workspace", "open", "--directory", workspace, "target.bst"])
+    result.assert_success()
+
+    # Check that the .bzr dir exists
+    assert os.path.isdir(os.path.join(workspace, ".bzr"))
+
+    # Check that the correct origin branch is set
+    element_config = _yaml.load(os.path.join(project, "target.bst"), shortname=None)
+    source_config = element_config.get_sequence("sources").mapping_at(0)
+    output = subprocess.check_output(["bzr", "info"], cwd=workspace)
+    stripped_url = source_config.get_str("url").lstrip("file:///")
+    expected_output_str = "checkout of branch: /{}/{}".format(stripped_url, source_config.get_str("track"))
+    assert expected_output_str in str(output)
diff --git a/tests/sources/bzr/basic/test b/tests/sources/bzr/basic/test
new file mode 100644
index 0000000..9daeafb
--- /dev/null
+++ b/tests/sources/bzr/basic/test
@@ -0,0 +1 @@
+test
diff --git a/tests/sources/bzr/project.conf b/tests/sources/bzr/project.conf
new file mode 100644
index 0000000..08a9d60
--- /dev/null
+++ b/tests/sources/bzr/project.conf
@@ -0,0 +1,3 @@
+# Basic Project
+name: foo
+min-version: 2.0
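
The HAVE_BZR flag imported from tests.testutils.site above is not part of
this commit, so its implementation does not appear in this digest. A typical
capability probe of this kind (an assumption, not the repository's actual
code) only checks for the executable:

# Hypothetical sketch of a probe like tests/testutils/site.py; the real
# module is not included in this commit.
import shutil

HAVE_BZR = shutil.which("bzr") is not None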


[buildstream-plugins] 27/49: tests/sources/git.py: Adding tests for git source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit ddeceebc6128c8b6cc5a06ab15a703724f7fc096
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 14:44:23 2022 +0900

    tests/sources/git.py: Adding tests for git source
---
 tests/sources/git.py                               | 1143 ++++++++++++++++++++
 tests/sources/git/project-override/project.conf    |   12 +
 .../git/project-override/repofiles/file.txt        |    1 +
 .../git/project-override/subrepofiles/ponyfile.txt |    1 +
 .../template/inconsistent-submodule/.gitmodules    |    3 +
 .../git/template/othersubrepofiles/unicornfile.txt |    1 +
 tests/sources/git/template/project.conf            |    3 +
 tests/sources/git/template/repofiles/file.txt      |    1 +
 .../sources/git/template/subrepofiles/ponyfile.txt |    1 +
 9 files changed, 1166 insertions(+)

diff --git a/tests/sources/git.py b/tests/sources/git.py
new file mode 100644
index 0000000..beab0e6
--- /dev/null
+++ b/tests/sources/git.py
@@ -0,0 +1,1143 @@
+#
+#  Copyright (C) 2018 Codethink Limited
+#  Copyright (C) 2018 Bloomberg Finance LP
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors: Tristan Van Berkom <tr...@codethink.co.uk>
+#           Jonathan Maw <jo...@codethink.co.uk>
+#           William Salmon <wi...@codethink.co.uk>
+#
+
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import subprocess
+import shutil
+
+import pytest
+
+from buildstream import Node
+from buildstream.exceptions import ErrorDomain
+from buildstream.plugin import CoreWarnings
+from buildstream._testing import cli  # pylint: disable=unused-import
+from buildstream._testing import generate_project, generate_element, load_yaml
+from buildstream._testing import create_repo
+
+from tests.testutils.site import HAVE_GIT, HAVE_OLD_GIT
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "git",)
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_fetch_bad_ref(cli, tmpdir, datafiles):
+    project = str(datafiles)
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Write out our test target with a bad ref
+    element = {"kind": "import", "sources": [repo.source_config(ref="5")]}
+    generate_element(project, "target.bst", element)
+
+    # Assert that fetch raises an error here
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_main_error(ErrorDomain.STREAM, None)
+    result.assert_task_error(ErrorDomain.SOURCE, None)
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.skipif(HAVE_OLD_GIT, reason="old git cannot clone a shallow repo to stage the source")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_fetch_shallow(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    workspacedir = os.path.join(str(tmpdir), "workspace")
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+    first_commit = repo.latest_commit()
+    repo.add_commit()
+    repo.add_tag("tag")
+
+    ref = "tag-0-g" + repo.latest_commit()
+
+    element = {"kind": "import", "sources": [repo.source_config(ref=ref)]}
+    generate_element(project, "target.bst", element)
+
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["workspace", "open", "--directory", workspacedir, "target.bst"])
+    result.assert_success()
+
+    assert subprocess.call(["git", "show", "tag"], cwd=workspacedir) == 0
+    assert subprocess.call(["git", "show", first_commit], cwd=workspacedir) != 0
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_submodule_fetch_checkout(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created
+    ref = repo.add_submodule("subdir", "file://" + subrepo.repo)
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config(ref=ref)]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch, build, checkout
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Assert we checked out both files at their expected location
+    assert os.path.exists(os.path.join(checkoutdir, "file.txt"))
+    assert os.path.exists(os.path.join(checkoutdir, "subdir", "ponyfile.txt"))
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_recursive_submodule_fetch_checkout(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Create a submodule from the 'othersubrepofiles' subdir
+    subsubrepo = create_repo("git", str(tmpdir), "subsubrepo")
+    subsubrepo.create(os.path.join(project, "othersubrepofiles"))
+
+    # Create another submodule from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Configure submodules
+    subrepo.add_submodule("subdir", "file://" + subsubrepo.repo)
+    ref = repo.add_submodule("subdir", "file://" + subrepo.repo)
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config(ref=ref)]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch, build, checkout
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Assert we checked out all files at their expected location
+    assert os.path.exists(os.path.join(checkoutdir, "file.txt"))
+    assert os.path.exists(os.path.join(checkoutdir, "subdir", "ponyfile.txt"))
+    assert os.path.exists(os.path.join(checkoutdir, "subdir", "subdir", "unicornfile.txt"))
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_submodule_fetch_source_enable_explicit(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created
+    ref = repo.add_submodule("subdir", "file://" + subrepo.repo)
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config_extra(ref=ref, checkout_submodules=True)]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch, build, checkout
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Assert we checked out both files at their expected location
+    assert os.path.exists(os.path.join(checkoutdir, "file.txt"))
+    assert os.path.exists(os.path.join(checkoutdir, "subdir", "ponyfile.txt"))
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_submodule_fetch_source_disable(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created
+    ref = repo.add_submodule("subdir", "file://" + subrepo.repo)
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config_extra(ref=ref, checkout_submodules=False)]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch, build, checkout
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Assert we checked out both files at their expected location
+    assert os.path.exists(os.path.join(checkoutdir, "file.txt"))
+    assert not os.path.exists(os.path.join(checkoutdir, "subdir", "ponyfile.txt"))
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_submodule_fetch_submodule_does_override(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created
+    ref = repo.add_submodule("subdir", "file://" + subrepo.repo, checkout=True)
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config_extra(ref=ref, checkout_submodules=False)]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch, build, checkout
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Assert we checked out both files at their expected location
+    assert os.path.exists(os.path.join(checkoutdir, "file.txt"))
+    assert os.path.exists(os.path.join(checkoutdir, "subdir", "ponyfile.txt"))
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_submodule_fetch_submodule_individual_checkout(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create another submodule from the 'othersubrepofiles' subdir
+    other_subrepo = create_repo("git", str(tmpdir), "othersubrepo")
+    other_subrepo.create(os.path.join(project, "othersubrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created
+    repo.add_submodule("subdir", "file://" + subrepo.repo, checkout=False)
+    ref = repo.add_submodule("othersubdir", "file://" + other_subrepo.repo)
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config_extra(ref=ref, checkout_submodules=True)]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch, build, checkout
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Assert we checked out files at their expected location
+    assert os.path.exists(os.path.join(checkoutdir, "file.txt"))
+    assert not os.path.exists(os.path.join(checkoutdir, "subdir", "ponyfile.txt"))
+    assert os.path.exists(os.path.join(checkoutdir, "othersubdir", "unicornfile.txt"))
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_submodule_fetch_submodule_individual_checkout_explicit(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create another submodule from the 'othersubrepofiles' subdir
+    other_subrepo = create_repo("git", str(tmpdir), "othersubrepo")
+    other_subrepo.create(os.path.join(project, "othersubrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created
+    repo.add_submodule("subdir", "file://" + subrepo.repo, checkout=False)
+    ref = repo.add_submodule("othersubdir", "file://" + other_subrepo.repo, checkout=True)
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config_extra(ref=ref, checkout_submodules=True)]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch, build, checkout
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Assert we checked out files at their expected location
+    assert os.path.exists(os.path.join(checkoutdir, "file.txt"))
+    assert not os.path.exists(os.path.join(checkoutdir, "subdir", "ponyfile.txt"))
+    assert os.path.exists(os.path.join(checkoutdir, "othersubdir", "unicornfile.txt"))
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "project-override"))
+def test_submodule_fetch_project_override(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created
+    ref = repo.add_submodule("subdir", "file://" + subrepo.repo)
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config(ref=ref)]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch, build, checkout
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Assert we checked out both files at their expected location
+    assert os.path.exists(os.path.join(checkoutdir, "file.txt"))
+    assert not os.path.exists(os.path.join(checkoutdir, "subdir", "ponyfile.txt"))
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_submodule_track_ignore_inconsistent(cli, tmpdir, datafiles):
+    project = str(datafiles)
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    ref = repo.create(os.path.join(project, "repofiles"))
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config(ref=ref)]}
+    generate_element(project, "target.bst", element)
+
+    # Now add a .gitmodules file with an inconsistent submodule,
+    # we are calling this inconsistent because the file was created
+    # but `git submodule add` was never called, so there is no reference
+    # associated to the submodule.
+    #
+    repo.add_file(os.path.join(project, "inconsistent-submodule", ".gitmodules"))
+
+    # Fetch should work, we're not yet at the offending ref
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+
+    # Track to update to the offending commit
+    result = cli.run(project=project, args=["source", "track", "target.bst"])
+    result.assert_success()
+
+    # Fetch after track will encounter an inconsistent submodule without any ref
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+
+    # Assert that we are just fine without it, and that a warning is emitted to the user.
+    assert "Ignoring inconsistent submodule" in result.stderr
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_submodule_track_no_ref_or_track(cli, tmpdir, datafiles):
+    project = str(datafiles)
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Write out our test target
+    gitsource = repo.source_config(ref=None)
+    gitsource.pop("track")
+    element = {"kind": "import", "sources": [gitsource]}
+    generate_element(project, "target.bst", element)
+
+    # Track will encounter an inconsistent submodule without any ref
+    result = cli.run(project=project, args=["show", "target.bst"])
+    result.assert_main_error(ErrorDomain.SOURCE, "missing-track-and-ref")
+    result.assert_task_error(None, None)
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+@pytest.mark.parametrize("fail", ["warn", "error"])
+def test_ref_not_in_track(cli, tmpdir, datafiles, fail):
+    project = str(datafiles)
+
+    # Make the warning an error if we're testing errors
+    if fail == "error":
+        generate_project(project, config={"fatal-warnings": [CoreWarnings.REF_NOT_IN_TRACK]})
+
+    # Create the repo from 'repofiles', create a branch without latest commit
+    repo = create_repo("git", str(tmpdir))
+    ref = repo.create(os.path.join(project, "repofiles"))
+
+    gitsource = repo.source_config(ref=ref)
+
+    # Overwrite the track value to the added branch
+    gitsource["track"] = "foo"
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [gitsource]}
+    generate_element(project, "target.bst", element)
+
+    result = cli.run(project=project, args=["build", "target.bst"])
+
+    # Assert a warning or an error depending on what we're checking
+    if fail == "error":
+        result.assert_main_error(ErrorDomain.STREAM, None)
+        result.assert_task_error(ErrorDomain.PLUGIN, CoreWarnings.REF_NOT_IN_TRACK)
+    else:
+        result.assert_success()
+        assert "ref-not-in-track" in result.stderr
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+@pytest.mark.parametrize("fail", ["warn", "error"])
+def test_unlisted_submodule(cli, tmpdir, datafiles, fail):
+    project = str(datafiles)
+
+    # Make the warning an error if we're testing errors
+    if fail == "error":
+        generate_project(project, config={"fatal-warnings": ["git:unlisted-submodule"]})
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created
+    ref = repo.add_submodule("subdir", "file://" + subrepo.repo)
+
+    # Create the source, and delete the explicit configuration
+    # of the submodules.
+    #
+    # We expect this to cause an unlisted submodule warning
+    # after the source has been fetched.
+    #
+    gitsource = repo.source_config(ref=ref)
+    del gitsource["submodules"]
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [gitsource]}
+    generate_element(project, "target.bst", element)
+
+    # The warning or error is reported during fetch. There should be no
+    # error with `bst show`.
+    result = cli.run(project=project, args=["show", "target.bst"])
+    result.assert_success()
+    assert "git:unlisted-submodule" not in result.stderr
+
+    # We will notice this directly in fetch, as it will try to fetch
+    # the submodules it discovers as a result of fetching the primary repo.
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+
+    # Assert a warning or an error depending on what we're checking
+    if fail == "error":
+        result.assert_main_error(ErrorDomain.STREAM, None)
+        result.assert_task_error(ErrorDomain.PLUGIN, "git:unlisted-submodule")
+    else:
+        result.assert_success()
+        assert "git:unlisted-submodule" in result.stderr
+
+    # Verify that `bst show` will still not error out after fetching.
+    result = cli.run(project=project, args=["show", "target.bst"])
+    result.assert_success()
+    assert "git:unlisted-submodule" not in result.stderr
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+@pytest.mark.parametrize("fail", ["warn", "error"])
+def test_track_unlisted_submodule(cli, tmpdir, datafiles, fail):
+    project = str(datafiles)
+
+    # Make the warning an error if we're testing errors
+    if fail == "error":
+        generate_project(project, config={"fatal-warnings": ["git:unlisted-submodule"]})
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    ref = repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created, but use
+    # the original ref, let the submodules appear after tracking
+    repo.add_submodule("subdir", "file://" + subrepo.repo)
+
+    # Create the source, and delete the explicit configuration
+    # of the submodules.
+    gitsource = repo.source_config(ref=ref)
+    del gitsource["submodules"]
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [gitsource]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch the repo; we will not see the warning because we
+    # are still pointing to a ref which predates the submodules
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    assert "git:unlisted-submodule" not in result.stderr
+
+    # We won't get a warning/error when tracking either; the source
+    # has not been cached yet, so the opportunity to check
+    # for the warning has not yet arisen.
+    result = cli.run(project=project, args=["source", "track", "target.bst"])
+    result.assert_success()
+    assert "git:unlisted-submodule" not in result.stderr
+
+    # Fetching the repo at the new ref will finally reveal the warning
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    if fail == "error":
+        result.assert_main_error(ErrorDomain.STREAM, None)
+        result.assert_task_error(ErrorDomain.PLUGIN, "git:unlisted-submodule")
+    else:
+        result.assert_success()
+        assert "git:unlisted-submodule" in result.stderr
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+@pytest.mark.parametrize("fail", ["warn", "error"])
+def test_invalid_submodule(cli, tmpdir, datafiles, fail):
+    project = str(datafiles)
+
+    # Make the warning an error if we're testing errors
+    if fail == "error":
+        generate_project(project, config={"fatal-warnings": ["git:invalid-submodule"]})
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    ref = repo.create(os.path.join(project, "repofiles"))
+
+    # Create the source without any submodules, and add
+    # an invalid submodule configuration to it.
+    #
+    # We expect this to cause an invalid submodule warning
+    # after the source has been fetched and we know what
+    # the real submodules actually are.
+    #
+    gitsource = repo.source_config(ref=ref)
+    gitsource["submodules"] = {"subdir": {"url": "https://pony.org/repo.git"}}
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [gitsource]}
+    generate_element(project, "target.bst", element)
+
+    # The warning or error is reported during fetch. There should be no
+    # error with `bst show`.
+    result = cli.run(project=project, args=["show", "target.bst"])
+    result.assert_success()
+    assert "git:invalid-submodule" not in result.stderr
+
+    # We will notice this directly in fetch, as it will try to fetch
+    # the submodules it discovers as a result of fetching the primary repo.
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+
+    # Assert a warning or an error depending on what we're checking
+    if fail == "error":
+        result.assert_main_error(ErrorDomain.STREAM, None)
+        result.assert_task_error(ErrorDomain.PLUGIN, "git:invalid-submodule")
+    else:
+        result.assert_success()
+        assert "git:invalid-submodule" in result.stderr
+
+    # Verify that `bst show` will still not error out after fetching.
+    result = cli.run(project=project, args=["show", "target.bst"])
+    result.assert_success()
+    assert "git:invalid-submodule" not in result.stderr
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.skipif(HAVE_OLD_GIT, reason="old git rm does not update .gitmodules")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+@pytest.mark.parametrize("fail", ["warn", "error"])
+def test_track_invalid_submodule(cli, tmpdir, datafiles, fail):
+    project = str(datafiles)
+
+    # Make the warning an error if we're testing errors
+    if fail == "error":
+        generate_project(project, config={"fatal-warnings": ["git:invalid-submodule"]})
+
+    # Create the submodule first from the 'subrepofiles' subdir
+    subrepo = create_repo("git", str(tmpdir), "subrepo")
+    subrepo.create(os.path.join(project, "subrepofiles"))
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+
+    # Add a submodule pointing to the one we created
+    ref = repo.add_submodule("subdir", "file://" + subrepo.repo)
+
+    # Add a commit beyond the ref which *removes* the submodule we've added
+    repo.remove_path("subdir")
+
+    # Create the source; this will keep the submodules, so initially
+    # the configuration is valid for the ref we're using
+    gitsource = repo.source_config(ref=ref)
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [gitsource]}
+    generate_element(project, "target.bst", element)
+
+    # Fetch the repo; we will not see the warning because the
+    # submodule configuration is still valid for the ref we're using
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+    assert "git:invalid-submodule" not in result.stderr
+
+    # After tracking, we're pointing to a ref which would trigger an invalid
+    # submodule warning. However, cache validation is only performed as part
+    # of fetch.
+    result = cli.run(project=project, args=["source", "track", "target.bst"])
+    result.assert_success()
+
+    # Fetch to trigger cache validation
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    if fail == "error":
+        result.assert_main_error(ErrorDomain.STREAM, None)
+        result.assert_task_error(ErrorDomain.PLUGIN, "git:invalid-submodule")
+    else:
+        result.assert_success()
+        assert "git:invalid-submodule" in result.stderr
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+@pytest.mark.parametrize("ref_format", ["sha1", "git-describe"])
+@pytest.mark.parametrize("tag,extra_commit", [(False, False), (True, False), (True, True)])
+def test_track_fetch(cli, tmpdir, datafiles, ref_format, tag, extra_commit):
+    project = str(datafiles)
+
+    # Create the repo from 'repofiles' subdir
+    repo = create_repo("git", str(tmpdir))
+    repo.create(os.path.join(project, "repofiles"))
+    if tag:
+        repo.add_tag("tag")
+    if extra_commit:
+        repo.add_commit()
+
+    # Write out our test target
+    element = {"kind": "import", "sources": [repo.source_config()]}
+    element["sources"][0]["ref-format"] = ref_format
+    generate_element(project, "target.bst", element)
+    element_path = os.path.join(project, "target.bst")
+
+    # Track it
+    result = cli.run(project=project, args=["source", "track", "target.bst"])
+    result.assert_success()
+
+    element = load_yaml(element_path)
+    new_ref = element.get_sequence("sources").mapping_at(0).get_str("ref")
+
+    if ref_format == "git-describe" and tag:
+        # Check for and strip the git-describe prefix
+        # ("<tag>-<commits-since-tag>-g"), recovering the plain commit sha
+        prefix = "tag-{}-g".format(0 if not extra_commit else 1)
+        assert new_ref.startswith(prefix)
+        new_ref = new_ref[len(prefix) :]
+
+    # 40 chars for SHA-1
+    assert len(new_ref) == 40
+
+    # Fetch it
+    result = cli.run(project=project, args=["source", "fetch", "target.bst"])
+    result.assert_success()
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.skipif(HAVE_OLD_GIT, reason="old git describe lacks --first-parent")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+@pytest.mark.parametrize("ref_storage", [("inline"), ("project.refs")])
+@pytest.mark.parametrize("tag_type", [("annotated"), ("lightweight")])
+def test_git_describe(cli, tmpdir, datafiles, ref_storage, tag_type):
+    project = str(datafiles)
+
+    project_config = load_yaml(os.path.join(project, "project.conf"))
+    project_config["ref-storage"] = ref_storage
+    generate_project(project, config=project_config)
+
+    repofiles = os.path.join(str(tmpdir), "repofiles")
+    os.makedirs(repofiles, exist_ok=True)
+    file0 = os.path.join(repofiles, "file0")
+    with open(file0, "w", encoding="utf-8") as f:
+        f.write("test\n")
+
+    repo = create_repo("git", str(tmpdir))
+
+    def tag(name):
+        if tag_type == "annotated":
+            repo.add_annotated_tag(name, name)
+        else:
+            repo.add_tag(name)
+
+    repo.create(repofiles)
+    tag("uselesstag")
+
+    file1 = os.path.join(str(tmpdir), "file1")
+    with open(file1, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    repo.add_file(file1)
+    tag("tag1")
+
+    file2 = os.path.join(str(tmpdir), "file2")
+    with open(file2, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    repo.branch("branch2")
+    repo.add_file(file2)
+    tag("tag2")
+
+    repo.checkout("master")
+    file3 = os.path.join(str(tmpdir), "file3")
+    with open(file3, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    repo.add_file(file3)
+
+    repo.merge("branch2")
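+    # The history now looks like this (asserted via git-describe below):
+    #
+    #   file0 (uselesstag) -- file1 (tag1) -- file3 -- merge   <- master
+    #                              \                  /
+    #                               file2 (tag2) ----          <- branch2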
+
+    config = repo.source_config()
+    config["track"] = repo.latest_commit()
+    config["track-tags"] = True
+
+    # Write out our test target
+    element = {
+        "kind": "import",
+        "sources": [config],
+    }
+    generate_element(project, "target.bst", element)
+    element_path = os.path.join(project, "target.bst")
+
+    if ref_storage == "inline":
+        result = cli.run(project=project, args=["source", "track", "target.bst"])
+        result.assert_success()
+    else:
+        result = cli.run(project=project, args=["source", "track", "target.bst", "--deps", "all"])
+        result.assert_success()
+
+    if ref_storage == "inline":
+        element = load_yaml(element_path)
+        tags = element.get_sequence("sources").mapping_at(0).get_sequence("tags")
+        assert len(tags) == 2
+        for tag in tags:
+            assert "tag" in tag
+            assert "commit" in tag
+            assert "annotated" in tag
+            assert tag.get_bool("annotated") == (tag_type == "annotated")
+
+        assert {(tag.get_str("tag"), tag.get_str("commit")) for tag in tags} == {
+            ("tag1", repo.rev_parse("tag1^{commit}")),
+            ("tag2", repo.rev_parse("tag2^{commit}")),
+        }
+
+    checkout = os.path.join(str(tmpdir), "checkout")
+
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkout])
+    result.assert_success()
+
+    if tag_type == "annotated":
+        options = []
+    else:
+        options = ["--tags"]
+    describe = subprocess.check_output(["git", "describe", *options], cwd=checkout, universal_newlines=True)
+    assert describe.startswith("tag2-2-")
+
+    describe_fp = subprocess.check_output(
+        ["git", "describe", "--first-parent", *options], cwd=checkout, universal_newlines=True
+    )
+    assert describe_fp.startswith("tag1-2-")
+
+    tags = subprocess.check_output(["git", "tag"], cwd=checkout, universal_newlines=True)
+    tags = set(tags.splitlines())
+    assert tags == set(["tag1", "tag2"])
+
+    with pytest.raises(subprocess.CalledProcessError):
+        subprocess.run(["git", "log", repo.rev_parse("uselesstag")], cwd=checkout, check=True)
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+@pytest.mark.parametrize("ref_storage", [("inline"), ("project.refs")])
+@pytest.mark.parametrize("tag_type", [("annotated"), ("lightweight")])
+def test_git_describe_head_is_tagged(cli, tmpdir, datafiles, ref_storage, tag_type):
+    project = str(datafiles)
+
+    project_config = load_yaml(os.path.join(project, "project.conf"))
+    project_config["ref-storage"] = ref_storage
+    generate_project(project, config=project_config)
+
+    repofiles = os.path.join(str(tmpdir), "repofiles")
+    os.makedirs(repofiles, exist_ok=True)
+    file0 = os.path.join(repofiles, "file0")
+    with open(file0, "w", encoding="utf-8") as f:
+        f.write("test\n")
+
+    repo = create_repo("git", str(tmpdir))
+
+    def tag(name):
+        if tag_type == "annotated":
+            repo.add_annotated_tag(name, name)
+        else:
+            repo.add_tag(name)
+
+    repo.create(repofiles)
+    tag("uselesstag")
+
+    file1 = os.path.join(str(tmpdir), "file1")
+    with open(file1, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    repo.add_file(file1)
+
+    file2 = os.path.join(str(tmpdir), "file2")
+    with open(file2, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    repo.branch("branch2")
+    repo.add_file(file2)
+
+    repo.checkout("master")
+    file3 = os.path.join(str(tmpdir), "file3")
+    with open(file3, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    repo.add_file(file3)
+
+    tagged_ref = repo.merge("branch2")
+    tag("tag")
+
+    config = repo.source_config()
+    config["track"] = repo.latest_commit()
+    config["track-tags"] = True
+
+    # Write out our test target
+    element = {
+        "kind": "import",
+        "sources": [config],
+    }
+    generate_element(project, "target.bst", element)
+    element_path = os.path.join(project, "target.bst")
+
+    if ref_storage == "inline":
+        result = cli.run(project=project, args=["source", "track", "target.bst"])
+        result.assert_success()
+    else:
+        result = cli.run(project=project, args=["source", "track", "target.bst", "--deps", "all"])
+        result.assert_success()
+
+    if ref_storage == "inline":
+        element = load_yaml(element_path)
+        source = element.get_sequence("sources").mapping_at(0)
+        tags = source.get_sequence("tags")
+        assert len(tags) == 1
+
+        tag = source.get_sequence("tags").mapping_at(0)
+        assert "tag" in tag
+        assert "commit" in tag
+        assert "annotated" in tag
+        assert tag.get_bool("annotated") == (tag_type == "annotated")
+
+        tag_name = tag.get_str("tag")
+        commit = tag.get_str("commit")
+        assert (tag_name, commit) == ("tag", repo.rev_parse("tag^{commit}"))
+
+    checkout = os.path.join(str(tmpdir), "checkout")
+
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkout])
+    result.assert_success()
+
+    if tag_type == "annotated":
+        options = []
+    else:
+        options = ["--tags"]
+    describe = subprocess.check_output(["git", "describe", *options], cwd=checkout, universal_newlines=True)
+    assert describe.startswith("tag")
+
+    tags = subprocess.check_output(["git", "tag"], cwd=checkout, universal_newlines=True)
+    tags = set(tags.splitlines())
+    assert tags == set(["tag"])
+
+    rev_list = subprocess.check_output(["git", "rev-list", "--all"], cwd=checkout, universal_newlines=True)
+
+    assert set(rev_list.splitlines()) == set([tagged_ref])
+
+    with pytest.raises(subprocess.CalledProcessError):
+        subprocess.run(["git", "log", repo.rev_parse("uselesstag")], cwd=checkout, check=True)
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_git_describe_relevant_history(cli, tmpdir, datafiles):
+    project = str(datafiles)
+
+    project_config = load_yaml(os.path.join(project, "project.conf"))
+    project_config["ref-storage"] = "project.refs"
+    generate_project(project, config=project_config)
+
+    repofiles = os.path.join(str(tmpdir), "repofiles")
+    os.makedirs(repofiles, exist_ok=True)
+    file0 = os.path.join(repofiles, "file0")
+    with open(file0, "w", encoding="utf-8") as f:
+        f.write("test\n")
+
+    repo = create_repo("git", str(tmpdir))
+    repo.create(repofiles)
+
+    file1 = os.path.join(str(tmpdir), "file1")
+    with open(file1, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    repo.add_file(file1)
+    repo.branch("branch")
+    repo.checkout("master")
+
+    file2 = os.path.join(str(tmpdir), "file2")
+    with open(file2, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    repo.add_file(file2)
+
+    file3 = os.path.join(str(tmpdir), "file3")
+    with open(file3, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    branch_boundary = repo.add_file(file3)
+
+    repo.checkout("branch")
+    file4 = os.path.join(str(tmpdir), "file4")
+    with open(file4, "w", encoding="utf-8") as f:
+        f.write("test\n")
+    tagged_ref = repo.add_file(file4)
+    repo.add_annotated_tag("tag1", "tag1")
+
+    head = repo.merge("master")
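+    # The history now looks like this ("branch" is checked out):
+    #
+    #   file0 -- file1 -- file2 -- file3 (branch_boundary)   <- master
+    #               \                    \
+    #                file4 (tag1) ------- merge (head)       <- branch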
+
+    config = repo.source_config()
+    config["track"] = head
+    config["track-tags"] = True
+
+    # Write out our test target
+    element = {
+        "kind": "import",
+        "sources": [config],
+    }
+    generate_element(project, "target.bst", element)
+
+    result = cli.run(project=project, args=["source", "track", "target.bst", "--deps", "all"])
+    result.assert_success()
+
+    checkout = os.path.join(str(tmpdir), "checkout")
+
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkout])
+    result.assert_success()
+
+    describe = subprocess.check_output(["git", "describe"], cwd=checkout, universal_newlines=True)
+    assert describe.startswith("tag1-2-")
+
+    rev_list = subprocess.check_output(["git", "rev-list", "--all"], cwd=checkout, universal_newlines=True)
+
+    assert set(rev_list.splitlines()) == set([head, tagged_ref, branch_boundary])
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_default_do_not_track_tags(cli, tmpdir, datafiles):
+    project = str(datafiles)
+
+    project_config = load_yaml(os.path.join(project, "project.conf"))
+    project_config["ref-storage"] = "inline"
+    generate_project(project, config=project_config)
+
+    repofiles = os.path.join(str(tmpdir), "repofiles")
+    os.makedirs(repofiles, exist_ok=True)
+    file0 = os.path.join(repofiles, "file0")
+    with open(file0, "w", encoding="utf-8") as f:
+        f.write("test\n")
+
+    repo = create_repo("git", str(tmpdir))
+
+    repo.create(repofiles)
+    repo.add_tag("tag")
+
+    config = repo.source_config()
+    config["track"] = repo.latest_commit()
+
+    # Write out our test target
+    element = {
+        "kind": "import",
+        "sources": [config],
+    }
+    generate_element(project, "target.bst", element)
+    element_path = os.path.join(project, "target.bst")
+
+    result = cli.run(project=project, args=["source", "track", "target.bst"])
+    result.assert_success()
+
+    element = load_yaml(element_path)
+    source = element.get_sequence("sources").mapping_at(0)
+    assert "tags" not in source
+
+
+@pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "template"))
+def test_overwrite_rogue_tag_multiple_remotes(cli, tmpdir, datafiles):
+    """When using multiple remotes in cache (i.e. when using aliases), we
+    need to make sure we override tags. This is not allowed to fetch
+    tags that were present from different origins
+    """
+
+    project = str(datafiles)
+
+    repofiles = os.path.join(str(tmpdir), "repofiles")
+    os.makedirs(repofiles, exist_ok=True)
+    file0 = os.path.join(repofiles, "file0")
+    with open(file0, "w", encoding="utf-8") as f:
+        f.write("test\n")
+
+    repo = create_repo("git", str(tmpdir))
+
+    top_commit = repo.create(repofiles)
+
+    repodir, reponame = os.path.split(repo.repo)
+    project_config = load_yaml(os.path.join(project, "project.conf"))
+    project_config["aliases"] = Node.from_dict({"repo": "http://example.com/"})
+    project_config["mirrors"] = [{"name": "middle-earth", "aliases": {"repo": ["file://{}/".format(repodir)]}}]
+    generate_project(project, config=project_config)
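+    # A sketch of the alias/mirror configuration this produces in
+    # project.conf (so "repo:" URLs resolve via the local mirror):
+    #
+    #   aliases:
+    #     repo: http://example.com/
+    #   mirrors:
+    #   - name: middle-earth
+    #     aliases:
+    #       repo:
+    #       - file://<repodir>/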
+
+    repo.add_annotated_tag("tag", "tag")
+
+    file1 = os.path.join(repofiles, "file1")
+    with open(file1, "w", encoding="utf-8") as f:
+        f.write("test\n")
+
+    ref = repo.add_file(file1)
+
+    config = repo.source_config(ref=ref)
+    del config["track"]
+    config["url"] = "repo:{}".format(reponame)
+
+    # Write out our test target
+    element = {
+        "kind": "import",
+        "sources": [config],
+    }
+    generate_element(project, "target.bst", element)
+
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+
+    repo.checkout(top_commit)
+
+    file2 = os.path.join(repofiles, "file2")
+    with open(file2, "w", encoding="utf-8") as f:
+        f.write("test\n")
+
+    new_ref = repo.add_file(file2)
+
+    repo.delete_tag("tag")
+    repo.add_annotated_tag("tag", "tag")
+    repo.checkout("master")
+
+    otherpath = os.path.join(str(tmpdir), "other_path")
+    shutil.copytree(repo.repo, os.path.join(otherpath, "repo"))
+    create_repo("git", otherpath)
+
+    repodir, reponame = os.path.split(repo.repo)
+
+    generate_project(project, config=project_config)
+
+    config = repo.source_config(ref=new_ref)
+    del config["track"]
+    config["url"] = "repo:{}".format(reponame)
+
+    element = {
+        "kind": "import",
+        "sources": [config],
+    }
+    generate_element(project, "target.bst", element)
+
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
diff --git a/tests/sources/git/project-override/project.conf b/tests/sources/git/project-override/project.conf
new file mode 100644
index 0000000..01c9016
--- /dev/null
+++ b/tests/sources/git/project-override/project.conf
@@ -0,0 +1,12 @@
+# Basic project
+name: foo
+min-version: 2.0
+sources:
+  git:
+    config:
+      checkout-submodules: False
+elements:
+  manual:
+    config:
+      build-commands:
+      - "foo"
diff --git a/tests/sources/git/project-override/repofiles/file.txt b/tests/sources/git/project-override/repofiles/file.txt
new file mode 100644
index 0000000..f621448
--- /dev/null
+++ b/tests/sources/git/project-override/repofiles/file.txt
@@ -0,0 +1 @@
+pony
diff --git a/tests/sources/git/project-override/subrepofiles/ponyfile.txt b/tests/sources/git/project-override/subrepofiles/ponyfile.txt
new file mode 100644
index 0000000..f73f309
--- /dev/null
+++ b/tests/sources/git/project-override/subrepofiles/ponyfile.txt
@@ -0,0 +1 @@
+file
diff --git a/tests/sources/git/template/inconsistent-submodule/.gitmodules b/tests/sources/git/template/inconsistent-submodule/.gitmodules
new file mode 100644
index 0000000..67271b8
--- /dev/null
+++ b/tests/sources/git/template/inconsistent-submodule/.gitmodules
@@ -0,0 +1,3 @@
+[submodule "farm/pony"]
+	path = farm/pony
+	url = git://pony.com
diff --git a/tests/sources/git/template/othersubrepofiles/unicornfile.txt b/tests/sources/git/template/othersubrepofiles/unicornfile.txt
new file mode 100644
index 0000000..f73f309
--- /dev/null
+++ b/tests/sources/git/template/othersubrepofiles/unicornfile.txt
@@ -0,0 +1 @@
+file
diff --git a/tests/sources/git/template/project.conf b/tests/sources/git/template/project.conf
new file mode 100644
index 0000000..dc34380
--- /dev/null
+++ b/tests/sources/git/template/project.conf
@@ -0,0 +1,3 @@
+# Basic project
+name: foo
+min-version: 2.0
diff --git a/tests/sources/git/template/repofiles/file.txt b/tests/sources/git/template/repofiles/file.txt
new file mode 100644
index 0000000..f621448
--- /dev/null
+++ b/tests/sources/git/template/repofiles/file.txt
@@ -0,0 +1 @@
+pony
diff --git a/tests/sources/git/template/subrepofiles/ponyfile.txt b/tests/sources/git/template/subrepofiles/ponyfile.txt
new file mode 100644
index 0000000..f73f309
--- /dev/null
+++ b/tests/sources/git/template/subrepofiles/ponyfile.txt
@@ -0,0 +1 @@
+file


[buildstream-plugins] 22/49: .pylintrc: Adding initial pylint configuration

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 869530235cb59f5853c8f017ea0951a9fc449ff7
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Mar 19 15:45:22 2022 +0900

    .pylintrc: Adding initial pylint configuration
    
    Modelled after the buildstream lint configuration, but allowing declaration
    of instance variables outside of __init__, since this is common in plugins
    and this repository contains only plugins.
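
    For context, BuildStream plugins conventionally assign their instance
    variables in configure() rather than in __init__(), which is exactly what
    the attribute-defined-outside-init message would flag. A minimal sketch of
    the pattern (a hypothetical plugin for illustration only, assuming the
    public buildstream Source API):

        from buildstream import Source

        class ExampleSource(Source):
            def configure(self, node):
                # Instance variables are assigned here, outside of
                # __init__(), as is conventional for BuildStream plugins
                self.url = node.get_str("url")
                self.ref = node.get_str("ref", None)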
---
 .pylintrc | 544 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 544 insertions(+)

diff --git a/.pylintrc b/.pylintrc
new file mode 100644
index 0000000..2dcce97
--- /dev/null
+++ b/.pylintrc
@@ -0,0 +1,544 @@
+[MASTER]
+
+# A comma-separated list of package or module names from where C extensions may
+# be loaded. Extensions are loading into the active Python interpreter and may
+# run arbitrary code
+extension-pkg-whitelist=
+    buildstream.node,
+    buildstream._loader.loadelement,
+    buildstream._loader.types,
+    buildstream._types,
+    buildstream._utils,
+    buildstream._variables,
+    buildstream._yaml,
+    ujson
+
+# Add files or directories to the blacklist. They should be base names, not
+# paths.
+ignore=CVS,doc
+
+# Add files or directories matching the regex patterns to the blacklist. The
+# regex matches against base names, not paths.
+ignore-patterns=.*_pb2.py,.*_pb2_grpc.py
+
+# Python code to execute, usually for sys.path manipulation such as
+# pygtk.require().
+#init-hook=
+
+# Use multiple processes to speed up Pylint.
+jobs=1
+
+# List of plugins (as comma separated values of python modules names) to load,
+# usually to register additional checkers.
+load-plugins=
+
+# Pickle collected data for later comparisons.
+persistent=yes
+
+# Specify a configuration file.
+#rcfile=
+
+# When enabled, pylint would attempt to guess common misconfiguration and emit
+# user-friendly hints instead of false-positive error messages
+suggestion-mode=yes
+
+# Allow loading of arbitrary C extensions. Extensions are imported into the
+# active Python interpreter and may run arbitrary code.
+unsafe-load-any-extension=no
+
+
+[MESSAGES CONTROL]
+
+# Only show warnings with the listed confidence levels. Leave empty to show
+# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED
+confidence=
+
+# Disable the message, report, category or checker with the given id(s). You
+# can either give multiple identifiers separated by comma (,) or put this
+# option multiple times (only on the command line, not in the configuration
+# file where it should appear only once).You can also use "--disable=all" to
+# disable everything first and then reenable specific checks. For example, if
+# you want to run only the similarities checker, you can use "--disable=all
+# --enable=similarities". If you want to run only the classes checker, but have
+# no Warning level messages displayed, use"--disable=all --enable=classes
+# --disable=W"
+
+# We have three groups of disabled messages:
+#
+# 1) Messages that are of no use to us
+#    This is either because we don't follow the convention
+#    (missing-docstring and protected-access come to mind), or because
+#    it's not very useful in CI (too-many-arguments, for example)
+#
+# 2) Messages that we would like to enable at some point
+#    We introduced linting quite late into the project, so there are
+#    some issues that just grew out of control. Resolving these would
+#    be nice, but too much work atm.
+#
+# 3) Messages related to code formatting
+#    Since we use Black to format code automatically, there's no need for
+#    pylint to also check for those things.
+#
+
+disable=#####################################
+        # Messages that are of no use to us #
+        #####################################
+        ,
+        consider-using-f-string,
+        fixme,
+        missing-docstring,
+        no-self-use,
+        no-else-return,
+        protected-access,
+        too-few-public-methods,
+        too-many-arguments,
+        too-many-boolean-expressions,
+        too-many-branches,
+        too-many-instance-attributes,
+        too-many-lines,
+        too-many-locals,
+        too-many-nested-blocks,
+        too-many-public-methods,
+        too-many-statements,
+        too-many-return-statements,
+        too-many-ancestors,
+
+        # BuildStream plugins define instance variables in configure()
+        attribute-defined-outside-init,
+
+        #######################################################
+        # Messages that we would like to enable at some point #
+        #######################################################
+        # We have many circular imports that need breaking
+        import-outside-toplevel,
+
+        duplicate-code,
+
+        # Some invalid names are alright, we should configure pylint
+        # to accept them, and curb the others
+        invalid-name,
+
+        unused-argument,
+
+        # This is good to get context on exceptions, we should enable that
+        # at some point
+        raise-missing-from,
+
+        # We can probably enable this soon; it is a bit experimental
+        # for the moment, and current releases of pylint (August 2021) raise
+        # a lot of false positives.
+        unused-private-member,
+
+        ##################################################
+        # Formatting-related messages, enforced by Black #
+        ##################################################
+
+        bad-continuation,
+        line-too-long,
+        superfluous-parens,
+
+
+# Enable the message, report, category or checker with the given id(s). You can
+# either give multiple identifier separated by comma (,) or put this option
+# multiple time (only on the command line, not in the configuration file where
+# it should appear only once). See also the "--disable" option for examples.
+enable=c-extension-no-member
+
+
+[REPORTS]
+
+# Python expression which should return a score less than 10 (10 is the
+# highest score). You have access to the variables 'error', 'warning',
+# 'refactor', 'convention' and 'statement', which respectively contain the
+# number of messages of each category and the total number of statements
+# analyzed. This is used by the global evaluation report (RP0004).
+evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
+
+# Template used to display messages. This is a python new-style format string
+# used to format the message information. See doc for all details
+#msg-template=
+
+# Set the output format. Available formats are text, parseable, colorized, json
+# and msvs (visual studio).You can also give a reporter class, eg
+# mypackage.mymodule.MyReporterClass.
+output-format=colorized
+
+# Tells whether to display a full report or only the messages
+reports=no
+
+# Activate the evaluation score.
+score=yes
+
+
+[REFACTORING]
+
+# Maximum number of nested blocks for function / method body
+max-nested-blocks=5
+
+# Complete name of functions that never returns. When checking for
+# inconsistent-return-statements if a never returning function is called then
+# it will be considered as an explicit return statement and no message will be
+# printed.
+never-returning-functions=optparse.Values,sys.exit
+
+
+[TYPECHECK]
+
+# List of decorators that produce context managers, such as
+# contextlib.contextmanager. Add to this list to register other decorators that
+# produce valid context managers.
+contextmanager-decorators=contextlib.contextmanager
+
+# List of members which are set dynamically and missed by pylint inference
+# system, and so shouldn't trigger E1101 when accessed. Python regular
+# expressions are accepted.
+generated-members=__enter__
+
+# Tells whether missing members accessed in mixin class should be ignored. A
+# mixin class is detected if its name ends with "mixin" (case insensitive).
+ignore-mixin-members=yes
+
+# This flag controls whether pylint should warn about no-member and similar
+# checks whenever an opaque object is returned when inferring. The inference
+# can return multiple potential results while evaluating a Python object, but
+# some branches might not be evaluated, which results in partial inference. In
+# that case, it might be useful to still emit no-member and other checks for
+# the rest of the inferred objects.
+ignore-on-opaque-inference=yes
+
+# List of class names for which member attributes should not be checked (useful
+# for classes with dynamically set attributes). This supports the use of
+# qualified names.
+ignored-classes=optparse.Values,thread._local,_thread._local,contextlib.closing,gi.repository.GLib.GError,pathlib.PurePath
+
+# List of module names for which member attributes should not be checked
+# (useful for modules/projects where namespaces are manipulated during runtime
+# and thus existing member attributes cannot be deduced by static analysis. It
+# supports qualified module names, as well as Unix pattern matching.
+ignored-modules=pkg_resources,gi.repository,grpc,buildstream._protos.*
+
+# Show a hint with possible names when a member name was not found. The aspect
+# of finding the hint is based on edit distance.
+missing-member-hint=yes
+
+# The minimum edit distance a name should have in order to be considered a
+# similar match for a missing member name.
+missing-member-hint-distance=1
+
+# The total number of similar names that should be taken in consideration when
+# showing a hint for a missing member.
+missing-member-max-choices=1
+
+
+[BASIC]
+
+# Naming style matching correct argument names
+argument-naming-style=snake_case
+
+# Regular expression matching correct argument names. Overrides argument-
+# naming-style
+#argument-rgx=
+
+# Naming style matching correct attribute names
+attr-naming-style=snake_case
+
+# Regular expression matching correct attribute names. Overrides attr-naming-
+# style
+#attr-rgx=
+
+# Bad variable names which should always be refused, separated by a comma
+bad-names=foo,
+          bar,
+          baz,
+          toto,
+          tutu,
+          tata
+
+# Naming style matching correct class attribute names
+class-attribute-naming-style=any
+
+# Regular expression matching correct class attribute names. Overrides class-
+# attribute-naming-style
+#class-attribute-rgx=
+
+# Naming style matching correct class names
+class-naming-style=PascalCase
+
+# Regular expression matching correct class names. Overrides class-naming-style
+#class-rgx=
+
+# Naming style matching correct constant names
+const-naming-style=UPPER_CASE
+
+# Regular expression matching correct constant names. Overrides const-naming-
+# style
+#const-rgx=
+
+# Minimum line length for functions/classes that require docstrings, shorter
+# ones are exempt.
+docstring-min-length=-1
+
+# Naming style matching correct function names
+function-naming-style=snake_case
+
+# Regular expression matching correct function names. Overrides function-
+# naming-style
+#function-rgx=
+
+# Good variable names which should always be accepted, separated by a comma
+good-names=i,j,k,ex,Run,_,e,f
+
+# Include a hint for the correct naming format with invalid-name
+include-naming-hint=no
+
+# Naming style matching correct inline iteration names
+inlinevar-naming-style=any
+
+# Regular expression matching correct inline iteration names. Overrides
+# inlinevar-naming-style
+#inlinevar-rgx=
+
+# Naming style matching correct method names
+method-naming-style=snake_case
+
+# Regular expression matching correct method names. Overrides method-naming-
+# style
+#method-rgx=
+
+# Naming style matching correct module names
+module-naming-style=snake_case
+
+# Regular expression matching correct module names. Overrides module-naming-
+# style
+#module-rgx=
+
+# Colon-delimited sets of names that determine each other's naming style when
+# the name regexes allow several styles.
+name-group=
+
+# Regular expression which should only match function or class names that do
+# not require a docstring.
+no-docstring-rgx=^_
+
+# List of decorators that produce properties, such as abc.abstractproperty. Add
+# to this list to register other decorators that produce valid properties.
+property-classes=abc.abstractproperty
+
+# Naming style matching correct variable names
+variable-naming-style=snake_case
+
+# Regular expression matching correct variable names. Overrides variable-
+# naming-style
+#variable-rgx=
+
+
+[VARIABLES]
+
+# List of additional names supposed to be defined in builtins. Remember that
+# you should avoid defining new builtins when possible.
+additional-builtins=
+
+# Tells whether unused global variables should be treated as a violation.
+allow-global-unused-variables=yes
+
+# List of strings which can identify a callback function by name. A callback
+# name must start or end with one of those strings.
+callbacks=cb_,
+          _cb
+
+# A regular expression matching the name of dummy variables (i.e. expectedly
+# not used).
+dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_
+
+# Argument names that match this expression will be ignored. Default to name
+# with leading underscore
+ignored-argument-names=_.*|^ignored_|^unused_
+
+# Tells whether we should check for unused import in __init__ files.
+init-import=no
+
+# List of qualified module names which can have objects that can redefine
+# builtins.
+redefining-builtins-modules=six.moves,past.builtins,future.builtins
+
+
+[LOGGING]
+
+# Logging modules to check that the string format arguments are in logging
+# function parameter format
+logging-modules=logging
+
+
+[SPELLING]
+
+# Limits count of emitted suggestions for spelling mistakes
+max-spelling-suggestions=4
+
+# Spelling dictionary name. Available dictionaries: none. To make it working
+# install python-enchant package.
+spelling-dict=
+
+# List of comma separated words that should not be checked.
+spelling-ignore-words=
+
+# A path to a file that contains private dictionary; one word per line.
+spelling-private-dict-file=
+
+# Tells whether to store unknown words to indicated private dictionary in
+# --spelling-private-dict-file option instead of raising a message.
+spelling-store-unknown-words=no
+
+
+[MISCELLANEOUS]
+
+# List of note tags to take in consideration, separated by a comma.
+notes=FIXME,
+      XXX,
+      TODO
+
+
+[SIMILARITIES]
+
+# Ignore comments when computing similarities.
+ignore-comments=yes
+
+# Ignore docstrings when computing similarities.
+ignore-docstrings=yes
+
+# Ignore imports when computing similarities.
+ignore-imports=no
+
+# Minimum lines number of a similarity.
+min-similarity-lines=4
+
+
+[FORMAT]
+
+# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
+expected-line-ending-format=
+
+# Regexp for a line that is allowed to be longer than the limit.
+ignore-long-lines=^\s*(# )?<?https?://\S+>?$
+
+# Number of spaces of indent required inside a hanging  or continued line.
+indent-after-paren=4
+
+# String used as indentation unit. This is usually "    " (4 spaces) or "\t" (1
+# tab).
+indent-string='    '
+
+# Maximum number of characters on a single line.
+max-line-length=119
+
+# Maximum number of lines in a module
+max-module-lines=1000
+
+# List of optional constructs for which whitespace checking is disabled. `dict-
+# separator` is used to allow tabulation in dicts, etc.: {1  : 1,\n222: 2}.
+# `trailing-comma` allows a space between comma and closing bracket: (a, ).
+# `empty-line` allows space-only lines.
+no-space-check=trailing-comma,
+               dict-separator
+
+# Allow the body of a class to be on the same line as the declaration if body
+# contains single statement.
+single-line-class-stmt=no
+
+# Allow the body of an if to be on the same line as the test if there is no
+# else.
+single-line-if-stmt=no
+
+
+[IMPORTS]
+
+# Allow wildcard imports from modules that define __all__.
+allow-wildcard-with-all=no
+
+# Analyse import fallback blocks. This can be used to support both Python 2 and
+# 3 compatible code, which means that the block might have code that exists
+# only in one or another interpreter, leading to false positives when analysed.
+analyse-fallback-blocks=no
+
+# Deprecated modules which should not be used, separated by a comma
+deprecated-modules=optparse,tkinter.tix
+
+# Create a graph of external dependencies in the given file (report RP0402 must
+# not be disabled)
+ext-import-graph=
+
+# Create a graph of every (i.e. internal and external) dependencies in the
+# given file (report RP0402 must not be disabled)
+import-graph=
+
+# Create a graph of internal dependencies in the given file (report RP0402 must
+# not be disabled)
+int-import-graph=
+
+# Force import order to recognize a module as part of the standard
+# compatibility libraries.
+known-standard-library=
+
+# Force import order to recognize a module as part of a third party library.
+known-third-party=enchant
+
+
+[DESIGN]
+
+# Maximum number of arguments for function / method
+max-args=5
+
+# Maximum number of attributes for a class (see R0902).
+max-attributes=7
+
+# Maximum number of boolean expressions in a if statement
+max-bool-expr=5
+
+# Maximum number of branch for function / method body
+max-branches=12
+
+# Maximum number of locals for function / method body
+max-locals=15
+
+# Maximum number of parents for a class (see R0901).
+max-parents=7
+
+# Maximum number of public methods for a class (see R0904).
+max-public-methods=20
+
+# Maximum number of return / yield for function / method body
+max-returns=6
+
+# Maximum number of statements in function / method body
+max-statements=50
+
+# Minimum number of public methods for a class (see R0903).
+min-public-methods=2
+
+
+[CLASSES]
+
+# List of method names used to declare (i.e. assign) instance attributes.
+defining-attr-methods=__init__,
+                      __new__,
+                      setUp
+
+# List of member names, which should be excluded from the protected access
+# warning.
+exclude-protected=_asdict,
+                  _fields,
+                  _replace,
+                  _source,
+                  _make
+
+# List of valid names for the first argument in a class method.
+valid-classmethod-first-arg=cls
+
+# List of valid names for the first argument in a metaclass class method.
+valid-metaclass-classmethod-first-arg=mcs
+
+
+[EXCEPTIONS]
+
+# Exceptions that will emit a warning when being caught. Defaults to
+# "Exception"
+overgeneral-exceptions=Exception


[buildstream-plugins] 37/49: tox.ini: Adding mypy static type checking

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 1ba6cc3d467b5d8cd6023d23054a09970320d5ce
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Thu Mar 24 15:04:48 2022 +0900

    tox.ini: Adding mypy static type checking
---
 requirements/mypy-requirements.txt |  7 +++++++
 setup.cfg                          | 10 ++++++++++
 tox.ini                            | 18 ++++++++++++++++++
 3 files changed, 35 insertions(+)

diff --git a/requirements/mypy-requirements.txt b/requirements/mypy-requirements.txt
new file mode 100644
index 0000000..20378a3
--- /dev/null
+++ b/requirements/mypy-requirements.txt
@@ -0,0 +1,7 @@
+# Additional requirements for running mypy
+
+# For cargo plugin
+types-toml
+
+# For docker plugin
+types-requests
diff --git a/setup.cfg b/setup.cfg
index 2796004..8058ebb 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -10,3 +10,13 @@ markers =
     datafiles: share datafiles in tests
 env =
     D:BST_TEST_SUITE=True
+
+[mypy]
+files = src
+warn_unused_configs = True
+warn_no_return = True
+
+# Ignore missing stubs for third-party packages.
+# In future, these ignores should be removed if/when stubs for them become available.
+[mypy-copyreg,grpc,pluginbase,psutil,pyroaring,ruamel,multiprocessing.forkserver]
+ignore_missing_imports=True
diff --git a/tox.ini b/tox.ini
index 44e0d0c..840c174 100644
--- a/tox.ini
+++ b/tox.ini
@@ -82,6 +82,24 @@ commands_pre =
 commands =
     pylint {posargs: buildstream_plugins tests setup.py}
 
+#
+# Running static type checkers
+#
+[testenv:mypy]
+skip_install = True
+commands =
+    mypy {posargs}
+deps =
+    mypy==0.910
+    types-protobuf
+    types-python-dateutil
+    types-setuptools
+    types-ujson
+    -rrequirements/plugin-requirements.txt
+    -rrequirements/test-requirements.txt
+    -rrequirements/mypy-requirements.txt
+    git+https://github.com/apache/buildstream@{env:BST_VERSION}
+
 #
 # Building documentation
 #

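For reference, the new environment runs like any other tox environment. A
typical invocation (hypothetical; note that BST_VERSION must be set, since
the deps install BuildStream from git at that ref) would be:

    BST_VERSION=<buildstream-ref> tox -e mypy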

[buildstream-plugins] 26/49: tests/sources/patch.py: Adding tests for patch source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit d35af6e0df21a2bc09fb4b6877996f19304d903e
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 14:39:44 2022 +0900

    tests/sources/patch.py: Adding tests for patch source
---
 tests/sources/patch.py                             | 197 +++++++++++++++++++++
 tests/sources/patch/basic/failure-empty-dir.bst    |   5 +
 .../patch/basic/failure-nonexistent-dir.bst        |   6 +
 tests/sources/patch/basic/file.txt                 |   1 +
 tests/sources/patch/basic/file_1.patch             |   7 +
 tests/sources/patch/basic/irregular.bst            |   7 +
 tests/sources/patch/basic/project.conf             |   3 +
 tests/sources/patch/basic/target.bst               |   7 +
 tests/sources/patch/different-strip-level/file.txt |   1 +
 .../patch/different-strip-level/file_1.patch       |   7 +
 .../patch/different-strip-level/project.conf       |   3 +
 .../sources/patch/different-strip-level/target.bst |   8 +
 .../patch/invalid-relative-path/file_1.patch       |   7 +
 .../patch/invalid-relative-path/irregular.bst      |   5 +
 .../patch/invalid-relative-path/project.conf       |   3 +
 tests/sources/patch/multiple-patches/file.txt      |   1 +
 tests/sources/patch/multiple-patches/file_1.patch  |   7 +
 tests/sources/patch/multiple-patches/file_2.patch  |   7 +
 tests/sources/patch/multiple-patches/project.conf  |   3 +
 tests/sources/patch/multiple-patches/target.bst    |   9 +
 .../sources/patch/separate-patch-dir/file_1.patch  |   7 +
 .../separate-patch-dir/files/test-dir/file.txt     |   1 +
 .../sources/patch/separate-patch-dir/project.conf  |   3 +
 tests/sources/patch/separate-patch-dir/target.bst  |   8 +
 24 files changed, 313 insertions(+)

diff --git a/tests/sources/patch.py b/tests/sources/patch.py
new file mode 100644
index 0000000..91848f1
--- /dev/null
+++ b/tests/sources/patch.py
@@ -0,0 +1,197 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import socket
+import pytest
+
+from buildstream.exceptions import ErrorDomain, LoadErrorReason
+from buildstream._testing import cli  # pylint: disable=unused-import
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "patch")
+
+
+# generate_file_types()
+#
+# Generator that creates, in turn, a regular file, directory, symbolic
+# link, fifo and socket at the specified path, yielding after each one
+# so the caller can run its checks against that file type.
+#
+# Args:
+#  path: (str) path where to create each different type of file
+#
+def generate_file_types(path):
+    def clean():
+        if os.path.exists(path):
+            if os.path.isdir(path):
+                os.rmdir(path)
+            else:
+                os.remove(path)
+
+    clean()
+
+    with open(path, "w", encoding="utf-8"):
+        pass
+    yield
+    clean()
+
+    os.makedirs(path)
+    yield
+    clean()
+
+    os.symlink("project.conf", path)
+    yield
+    clean()
+
+    os.mkfifo(path)
+    yield
+    clean()
+
+    # Change directory because the full path may be longer than the ~100
+    # characters permitted for a unix socket
+    old_dir = os.getcwd()
+    parent, child = os.path.split(path)
+    os.chdir(parent)
+
+    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
+
+    try:
+        s.bind(child)
+        os.chdir(old_dir)
+        yield
+    finally:
+        s.close()
+
+    clean()
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "basic"))
+def test_missing_patch(cli, datafiles):
+    project = str(datafiles)
+
+    # Removing the local file causes preflight to fail
+    localfile = os.path.join(project, "file_1.patch")
+    os.remove(localfile)
+
+    result = cli.run(project=project, args=["show", "target.bst"])
+    result.assert_main_error(ErrorDomain.LOAD, LoadErrorReason.MISSING_FILE)
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "basic"))
+def test_non_regular_file_patch(cli, datafiles):
+    project = str(datafiles)
+
+    patch_path = os.path.join(project, "irregular_file.patch")
+    for _file_type in generate_file_types(patch_path):
+        result = cli.run(project=project, args=["show", "irregular.bst"])
+        if os.path.isfile(patch_path) and not os.path.islink(patch_path):
+            result.assert_success()
+        else:
+            result.assert_main_error(ErrorDomain.LOAD, LoadErrorReason.PROJ_PATH_INVALID_KIND)
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "basic"))
+def test_invalid_absolute_path(cli, datafiles):
+    project = str(datafiles)
+
+    with open(os.path.join(project, "target.bst"), "r", encoding="utf-8") as f:
+        old_yaml = f.read()
+    new_yaml = old_yaml.replace("file_1.patch", os.path.join(project, "file_1.patch"))
+    assert old_yaml != new_yaml
+
+    with open(os.path.join(project, "target.bst"), "w", encoding="utf-8") as f:
+        f.write(new_yaml)
+
+    result = cli.run(project=project, args=["show", "target.bst"])
+    result.assert_main_error(ErrorDomain.LOAD, LoadErrorReason.PROJ_PATH_INVALID)
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "invalid-relative-path"))
+def test_invalid_relative_path(cli, datafiles):
+    project = str(datafiles)
+
+    result = cli.run(project=project, args=["show", "irregular.bst"])
+    result.assert_main_error(ErrorDomain.LOAD, LoadErrorReason.PROJ_PATH_INVALID)
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "basic"))
+def test_stage_and_patch(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Build, checkout
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Test that file.txt was patched and changed
+    with open(os.path.join(checkoutdir, "file.txt"), encoding="utf-8") as f:
+        assert f.read() == "This is text file with superpowers\n"
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "basic"))
+def test_stage_file_nonexistent_dir(cli, datafiles):
+    project = str(datafiles)
+
+    # Fails at build time because it tries to patch into a non-existing directory
+    result = cli.run(project=project, args=["build", "failure-nonexistent-dir.bst"])
+    result.assert_main_error(ErrorDomain.STREAM, None)
+    result.assert_task_error(ErrorDomain.SOURCE, "patch-no-files")
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "basic"))
+def test_stage_file_empty_dir(cli, datafiles):
+    project = str(datafiles)
+
+    # Fails at build time because it tries to patch with nothing else staged
+    result = cli.run(project=project, args=["build", "failure-empty-dir.bst"])
+    result.assert_main_error(ErrorDomain.STREAM, None)
+    result.assert_task_error(ErrorDomain.SOURCE, "patch-no-files")
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "separate-patch-dir"))
+def test_stage_separate_patch_dir(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Build, checkout
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Test that file.txt was patched and changed
+    with open(os.path.join(checkoutdir, "test-dir", "file.txt"), encoding="utf-8") as f:
+        assert f.read() == "This is text file in a directory with superpowers\n"
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "multiple-patches"))
+def test_stage_multiple_patches(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
+    # Build, checkout
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Test that file.txt was patched and changed
+    with open(os.path.join(checkoutdir, "file.txt"), encoding="utf-8") as f:
+        assert f.read() == "This is text file with more superpowers\n"
+
+
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "different-strip-level"))
+def test_patch_strip_level(cli, tmpdir, datafiles):
+    project = str(datafiles)
+    checkoutdir = os.path.join(str(tmpdir), "checkout")
+
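+    # The patch in this project addresses files as "foo/a/file.txt", so the
+    # target.bst sets strip-level: 2 (analogous to `patch -p2`) to strip the
+    # two leading path components before applying.
+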
+    # Build, checkout
+    result = cli.run(project=project, args=["build", "target.bst"])
+    result.assert_success()
+    result = cli.run(project=project, args=["artifact", "checkout", "target.bst", "--directory", checkoutdir])
+    result.assert_success()
+
+    # Test that file.txt was patched and changed
+    with open(os.path.join(checkoutdir, "file.txt"), encoding="utf-8") as f:
+        assert f.read() == "This is text file with superpowers\n"
diff --git a/tests/sources/patch/basic/failure-empty-dir.bst b/tests/sources/patch/basic/failure-empty-dir.bst
new file mode 100644
index 0000000..b22af27
--- /dev/null
+++ b/tests/sources/patch/basic/failure-empty-dir.bst
@@ -0,0 +1,5 @@
+kind: import
+description: This is also the pony
+sources:
+- kind: patch
+  path: file_1.patch
diff --git a/tests/sources/patch/basic/failure-nonexistent-dir.bst b/tests/sources/patch/basic/failure-nonexistent-dir.bst
new file mode 100644
index 0000000..8fd593d
--- /dev/null
+++ b/tests/sources/patch/basic/failure-nonexistent-dir.bst
@@ -0,0 +1,6 @@
+kind: import
+description: This is also the pony
+sources:
+- kind: patch
+  path: file_1.patch
+  directory: /idontexist
diff --git a/tests/sources/patch/basic/file.txt b/tests/sources/patch/basic/file.txt
new file mode 100644
index 0000000..a496efe
--- /dev/null
+++ b/tests/sources/patch/basic/file.txt
@@ -0,0 +1 @@
+This is a text file
diff --git a/tests/sources/patch/basic/file_1.patch b/tests/sources/patch/basic/file_1.patch
new file mode 100644
index 0000000..424a486
--- /dev/null
+++ b/tests/sources/patch/basic/file_1.patch
@@ -0,0 +1,7 @@
+diff --git a/file.txt b/file.txt
+index a496efe..341ef26 100644
+--- a/file.txt
++++ b/file.txt
+@@ -1 +1 @@
+-This is a text file
++This is text file with superpowers
diff --git a/tests/sources/patch/basic/irregular.bst b/tests/sources/patch/basic/irregular.bst
new file mode 100644
index 0000000..425cbcc
--- /dev/null
+++ b/tests/sources/patch/basic/irregular.bst
@@ -0,0 +1,7 @@
+kind: import
+description: This is the pony
+sources:
+- kind: local
+  path: file.txt
+- kind: patch
+  path: irregular_file.patch
diff --git a/tests/sources/patch/basic/project.conf b/tests/sources/patch/basic/project.conf
new file mode 100644
index 0000000..dc34380
--- /dev/null
+++ b/tests/sources/patch/basic/project.conf
@@ -0,0 +1,3 @@
+# Basic project
+name: foo
+min-version: 2.0
diff --git a/tests/sources/patch/basic/target.bst b/tests/sources/patch/basic/target.bst
new file mode 100644
index 0000000..913371d
--- /dev/null
+++ b/tests/sources/patch/basic/target.bst
@@ -0,0 +1,7 @@
+kind: import
+description: This is the pony
+sources:
+- kind: local
+  path: file.txt
+- kind: patch
+  path: file_1.patch
diff --git a/tests/sources/patch/different-strip-level/file.txt b/tests/sources/patch/different-strip-level/file.txt
new file mode 100644
index 0000000..a496efe
--- /dev/null
+++ b/tests/sources/patch/different-strip-level/file.txt
@@ -0,0 +1 @@
+This is a text file
diff --git a/tests/sources/patch/different-strip-level/file_1.patch b/tests/sources/patch/different-strip-level/file_1.patch
new file mode 100644
index 0000000..ff7f7fe
--- /dev/null
+++ b/tests/sources/patch/different-strip-level/file_1.patch
@@ -0,0 +1,7 @@
+diff --git foo/a/file.txt foo/b/file.txt
+index a496efe..341ef26 100644
+--- foo/a/file.txt
++++ foo/b/file.txt
+@@ -1 +1 @@
+-This is a text file
++This is text file with superpowers
diff --git a/tests/sources/patch/different-strip-level/project.conf b/tests/sources/patch/different-strip-level/project.conf
new file mode 100644
index 0000000..dc34380
--- /dev/null
+++ b/tests/sources/patch/different-strip-level/project.conf
@@ -0,0 +1,3 @@
+# Basic project
+name: foo
+min-version: 2.0
diff --git a/tests/sources/patch/different-strip-level/target.bst b/tests/sources/patch/different-strip-level/target.bst
new file mode 100644
index 0000000..c8ea19a
--- /dev/null
+++ b/tests/sources/patch/different-strip-level/target.bst
@@ -0,0 +1,8 @@
+kind: import
+description: This is the pony
+sources:
+- kind: local
+  path: file.txt
+- kind: patch
+  path: file_1.patch
+  strip-level: 2
diff --git a/tests/sources/patch/invalid-relative-path/file_1.patch b/tests/sources/patch/invalid-relative-path/file_1.patch
new file mode 100644
index 0000000..424a486
--- /dev/null
+++ b/tests/sources/patch/invalid-relative-path/file_1.patch
@@ -0,0 +1,7 @@
+diff --git a/file.txt b/file.txt
+index a496efe..341ef26 100644
+--- a/file.txt
++++ b/file.txt
+@@ -1 +1 @@
+-This is a text file
++This is text file with superpowers
diff --git a/tests/sources/patch/invalid-relative-path/irregular.bst b/tests/sources/patch/invalid-relative-path/irregular.bst
new file mode 100644
index 0000000..6b63a4e
--- /dev/null
+++ b/tests/sources/patch/invalid-relative-path/irregular.bst
@@ -0,0 +1,5 @@
+kind: import
+description: This is the pony
+sources:
+- kind: patch
+  path: ../invalid-relative-path/irregular_file.patch
diff --git a/tests/sources/patch/invalid-relative-path/project.conf b/tests/sources/patch/invalid-relative-path/project.conf
new file mode 100644
index 0000000..dc34380
--- /dev/null
+++ b/tests/sources/patch/invalid-relative-path/project.conf
@@ -0,0 +1,3 @@
+# Basic project
+name: foo
+min-version: 2.0
diff --git a/tests/sources/patch/multiple-patches/file.txt b/tests/sources/patch/multiple-patches/file.txt
new file mode 100644
index 0000000..a496efe
--- /dev/null
+++ b/tests/sources/patch/multiple-patches/file.txt
@@ -0,0 +1 @@
+This is a text file
diff --git a/tests/sources/patch/multiple-patches/file_1.patch b/tests/sources/patch/multiple-patches/file_1.patch
new file mode 100644
index 0000000..424a486
--- /dev/null
+++ b/tests/sources/patch/multiple-patches/file_1.patch
@@ -0,0 +1,7 @@
+diff --git a/file.txt b/file.txt
+index a496efe..341ef26 100644
+--- a/file.txt
++++ b/file.txt
+@@ -1 +1 @@
+-This is a text file
++This is text file with superpowers
diff --git a/tests/sources/patch/multiple-patches/file_2.patch b/tests/sources/patch/multiple-patches/file_2.patch
new file mode 100644
index 0000000..f56614b
--- /dev/null
+++ b/tests/sources/patch/multiple-patches/file_2.patch
@@ -0,0 +1,7 @@
+diff --git a/file.txt b/file.txt
+index a496efe..341ef26 100644
+--- a/file.txt
++++ b/file.txt
+@@ -1 +1 @@
+-This is text file with superpowers
++This is text file with more superpowers
diff --git a/tests/sources/patch/multiple-patches/project.conf b/tests/sources/patch/multiple-patches/project.conf
new file mode 100644
index 0000000..dc34380
--- /dev/null
+++ b/tests/sources/patch/multiple-patches/project.conf
@@ -0,0 +1,3 @@
+# Basic project
+name: foo
+min-version: 2.0
diff --git a/tests/sources/patch/multiple-patches/target.bst b/tests/sources/patch/multiple-patches/target.bst
new file mode 100644
index 0000000..4665e7d
--- /dev/null
+++ b/tests/sources/patch/multiple-patches/target.bst
@@ -0,0 +1,9 @@
+kind: import
+description: This is the pony
+sources:
+- kind: local
+  path: file.txt
+- kind: patch
+  path: file_1.patch
+- kind: patch
+  path: file_2.patch
diff --git a/tests/sources/patch/separate-patch-dir/file_1.patch b/tests/sources/patch/separate-patch-dir/file_1.patch
new file mode 100644
index 0000000..ae8bc33
--- /dev/null
+++ b/tests/sources/patch/separate-patch-dir/file_1.patch
@@ -0,0 +1,7 @@
+diff --git a/file.txt b/file.txt
+index a496efe..341ef26 100644
+--- a/file.txt
++++ b/file.txt
+@@ -1 +1 @@
+-This is a text file in a directory
++This is text file in a directory with superpowers
diff --git a/tests/sources/patch/separate-patch-dir/files/test-dir/file.txt b/tests/sources/patch/separate-patch-dir/files/test-dir/file.txt
new file mode 100644
index 0000000..425911a
--- /dev/null
+++ b/tests/sources/patch/separate-patch-dir/files/test-dir/file.txt
@@ -0,0 +1 @@
+This is a text file in a directory
diff --git a/tests/sources/patch/separate-patch-dir/project.conf b/tests/sources/patch/separate-patch-dir/project.conf
new file mode 100644
index 0000000..dc34380
--- /dev/null
+++ b/tests/sources/patch/separate-patch-dir/project.conf
@@ -0,0 +1,3 @@
+# Basic project
+name: foo
+min-version: 2.0
diff --git a/tests/sources/patch/separate-patch-dir/target.bst b/tests/sources/patch/separate-patch-dir/target.bst
new file mode 100644
index 0000000..796c131
--- /dev/null
+++ b/tests/sources/patch/separate-patch-dir/target.bst
@@ -0,0 +1,8 @@
+kind: import
+description: This is the pony
+sources:
+- kind: local
+  path: files
+- kind: patch
+  path: file_1.patch
+  directory: test-dir


[buildstream-plugins] 05/49: Initially adding bzr source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 15381b54a79bf0c9a99b339862cabb259515f0d7
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 16:38:33 2022 +0900

    Initially adding bzr source
    
    From buildstream core plugins
---
 src/buildstream_plugins/sources/bzr.py | 220 +++++++++++++++++++++++++++++++++
 1 file changed, 220 insertions(+)

diff --git a/src/buildstream_plugins/sources/bzr.py b/src/buildstream_plugins/sources/bzr.py
new file mode 100644
index 0000000..c768b35
--- /dev/null
+++ b/src/buildstream_plugins/sources/bzr.py
@@ -0,0 +1,220 @@
+#  Copyright (C) 2017 Codethink Limited
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Jonathan Maw <jo...@codethink.co.uk>
+
+"""
+bzr - stage files from a bazaar repository
+==========================================
+
+**Host dependencies:**
+
+  * bzr
+
+**Usage:**
+
+.. code:: yaml
+
+   # Specify the bzr source kind
+   kind: bzr
+
+   # Specify the bzr url. Bazaar URLs come in many forms, see
+   # `bzr help urlspec` for more information. Using an alias defined
+   # in your project configuration is encouraged.
+   url: https://launchpad.net/bzr
+
+   # Specify the tracking branch. This is mandatory, as bzr cannot identify
+   # an individual revision outside its branch. bzr URLs that omit the branch
+   # name implicitly specify the trunk branch, but bst requires this to be
+   # explicit.
+   track: trunk
+
+   # Specify the ref. This is a revision number. This is usually a decimal,
+   # but revisions on a branch are of the form
+   # <revision-branched-from>.<branch-number>.<revision-since-branching>
+   # e.g. 6622.1.6.
+   # The ref must be specified to build, and 'bst source track' will update the
+   # revision number to the one on the tip of the branch specified in 'track'.
+   ref: 6622
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.source.html#core-source-builtins>`_ for
+details on common configuration options for sources.
+"""
+
+import os
+import shutil
+import fcntl
+from contextlib import contextmanager
+
+from buildstream import Source, SourceError
+from buildstream import utils
+
+
+class BzrSource(Source):
+    # pylint: disable=attribute-defined-outside-init
+
+    BST_MIN_VERSION = "2.0"
+
+    def configure(self, node):
+        node.validate_keys(["url", "track", "ref", *Source.COMMON_CONFIG_KEYS])
+
+        self.original_url = node.get_str("url")
+        self.tracking = node.get_str("track")
+        self.ref = node.get_str("ref", None)
+        self.url = self.translate_url(self.original_url)
+
+    def preflight(self):
+        # Check if bzr is installed, get the binary at the same time.
+        self.host_bzr = utils.get_host_tool("bzr")
+
+    def get_unique_key(self):
+        return [self.original_url, self.tracking, self.ref]
+
+    def is_cached(self):
+        with self._locked():
+            return self._check_ref()
+
+    def load_ref(self, node):
+        self.ref = node.get_str("ref", None)
+
+    def get_ref(self):
+        return self.ref
+
+    def set_ref(self, ref, node):
+        node["ref"] = self.ref = ref
+
+    def track(self):  # pylint: disable=arguments-differ
+        with self.timed_activity("Tracking {}".format(self.url), silent_nested=True), self._locked():
+            self._ensure_mirror(skip_ref_check=True)
+            ret, out = self.check_output(
+                [self.host_bzr, "version-info", "--custom", "--template={revno}", self._get_branch_dir(),],
+                fail="Failed to read the revision number at '{}'".format(self._get_branch_dir()),
+            )
+            if ret != 0:
+                raise SourceError("{}: Failed to get ref for tracking {}".format(self, self.tracking))
+
+            return out
+
+    def fetch(self):  # pylint: disable=arguments-differ
+        with self.timed_activity("Fetching {}".format(self.url), silent_nested=True), self._locked():
+            self._ensure_mirror()
+
+    def stage(self, directory):
+        self.call(
+            [
+                self.host_bzr,
+                "checkout",
+                "--lightweight",
+                "--revision=revno:{}".format(self.ref),
+                self._get_branch_dir(),
+                directory,
+            ],
+            fail="Failed to checkout revision {} from branch {} to {}".format(
+                self.ref, self._get_branch_dir(), directory
+            ),
+        )
+        # Remove .bzr dir
+        shutil.rmtree(os.path.join(directory, ".bzr"))
+
+    def init_workspace(self, directory):
+        url = os.path.join(self.url, self.tracking)
+        with self.timed_activity('Setting up workspace "{}"'.format(directory), silent_nested=True):
+            # Checkout from the cache
+            self.call(
+                [
+                    self.host_bzr,
+                    "branch",
+                    "--use-existing-dir",
+                    "--revision=revno:{}".format(self.ref),
+                    self._get_branch_dir(),
+                    directory,
+                ],
+                fail="Failed to branch revision {} from branch {} to {}".format(
+                    self.ref, self._get_branch_dir(), directory
+                ),
+            )
+            # Switch the parent branch to the source's origin
+            self.call(
+                [self.host_bzr, "switch", "--directory={}".format(directory), url,],
+                fail="Failed to switch workspace's parent branch to {}".format(url),
+            )
+
+    # _locked()
+    #
+    # This context manager ensures exclusive access to the
+    # bzr repository.
+    #
+    @contextmanager
+    def _locked(self):
+        lockdir = os.path.join(self.get_mirror_directory(), "locks")
+        lockfile = os.path.join(lockdir, utils.url_directory_name(self.original_url) + ".lock")
+        os.makedirs(lockdir, exist_ok=True)
+        with open(lockfile, "wb") as lock:
+            fcntl.flock(lock, fcntl.LOCK_EX)
+            try:
+                yield
+            finally:
+                fcntl.flock(lock, fcntl.LOCK_UN)
+
+    def _check_ref(self):
+        # If the mirror doesn't exist yet, then we don't have the ref
+        if not os.path.exists(self._get_branch_dir()):
+            return False
+
+        return (
+            self.call([self.host_bzr, "revno", "--revision=revno:{}".format(self.ref), self._get_branch_dir(),]) == 0
+        )
+
+    def _get_branch_dir(self):
+        return os.path.join(self._get_mirror_dir(), self.tracking)
+
+    def _get_mirror_dir(self):
+        return os.path.join(self.get_mirror_directory(), utils.url_directory_name(self.original_url),)
+
+    def _ensure_mirror(self, skip_ref_check=False):
+        mirror_dir = self._get_mirror_dir()
+        bzr_metadata_dir = os.path.join(mirror_dir, ".bzr")
+        if not os.path.exists(bzr_metadata_dir):
+            self.call(
+                [self.host_bzr, "init-repo", "--no-trees", mirror_dir], fail="Failed to initialize bzr repository",
+            )
+
+        branch_dir = os.path.join(mirror_dir, self.tracking)
+        branch_url = self.url + "/" + self.tracking
+        if not os.path.exists(branch_dir):
+            # `bzr branch` the branch if it doesn't exist
+            # to get the upstream code
+            self.call(
+                [self.host_bzr, "branch", branch_url, branch_dir],
+                fail="Failed to branch from {} to {}".format(branch_url, branch_dir),
+            )
+
+        else:
+            # `bzr pull` the branch if it does exist
+            # to get any changes to the upstream code
+            self.call(
+                [self.host_bzr, "pull", "--directory={}".format(branch_dir), branch_url,],
+                fail="Failed to pull new changes for {}".format(branch_dir),
+            )
+
+        if not skip_ref_check and not self._check_ref():
+            raise SourceError(
+                "Failed to ensure ref '{}' was mirrored".format(self.ref), reason="ref-not-mirrored",
+            )
+
+
+def setup():
+    return BzrSource
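
The docstring above encourages referring to the url through an alias defined
in the project configuration. A minimal sketch of that pattern, assuming a
hypothetical "launchpad" alias (the ref shown is illustrative):

    # project.conf
    aliases:
      launchpad: https://launchpad.net/

    # In the element, the alias prefixes the remainder of the url
    sources:
    - kind: bzr
      url: launchpad:bzr
      track: trunk
      ref: 6622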


[buildstream-plugins] 31/49: tests/elements/meson.py: Adding meson element tests

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit d151fcdd0d676236b8796f4b51a3a4e87d546445
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 16:12:27 2022 +0900

    tests/elements/meson.py: Adding meson element tests
---
 tests/elements/meson.py                            |  62 +++++++++++++++++++++
 tests/elements/meson/elements/base.bst             |   7 +++
 .../elements/meson/elements/base/alpine-image.bst  |   6 ++
 .../meson/elements/base/base-configure.bst         |  28 ++++++++++
 .../elements/meson/elements/base/install-dpkg.bst  |  15 +++++
 tests/elements/meson/elements/base/meson.bst       |   9 +++
 tests/elements/meson/elements/base/ninja.bst       |  15 +++++
 .../elements/meson/elements/mesonconfroothello.bst |  15 +++++
 tests/elements/meson/elements/mesonhello.bst       |   9 +++
 tests/elements/meson/files/mesonhello.tar.gz       | Bin 0 -> 1195 bytes
 tests/elements/meson/project.conf                  |  16 ++++++
 11 files changed, 182 insertions(+)

diff --git a/tests/elements/meson.py b/tests/elements/meson.py
new file mode 100644
index 0000000..ac11fab
--- /dev/null
+++ b/tests/elements/meson.py
@@ -0,0 +1,62 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import pytest
+
+from buildstream._testing.runcli import cli_integration as cli  # pylint: disable=unused-import
+from buildstream._testing.integration import integration_cache  # pylint: disable=unused-import
+from buildstream._testing.integration import assert_contains
+from buildstream._testing._utils.site import HAVE_SANDBOX
+
+pytestmark = pytest.mark.integration
+
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "meson")
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_meson_build(cli, datafiles):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_name = "mesonhello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    assert result.exit_code == 0
+
+    assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_meson_confroot_build(cli, datafiles):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_name = "mesonconfroothello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    assert result.exit_code == 0
+
+    assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_meson_run(cli, datafiles):
+    project = str(datafiles)
+    element_name = "mesonhello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["shell", element_name, "/usr/bin/hello"])
+    assert result.exit_code == 0
+
+    assert result.output == """Hello, World!\n"""
diff --git a/tests/elements/meson/elements/base.bst b/tests/elements/meson/elements/base.bst
new file mode 100644
index 0000000..08f4ffd
--- /dev/null
+++ b/tests/elements/meson/elements/base.bst
@@ -0,0 +1,7 @@
+kind: stack
+depends:
+- base/install-dpkg.bst
+- base/base-configure.bst
+- base/alpine-image.bst
+- base/ninja.bst
+- base/meson.bst
diff --git a/tests/elements/meson/elements/base/alpine-image.bst b/tests/elements/meson/elements/base/alpine-image.bst
new file mode 100644
index 0000000..f8e00ba
--- /dev/null
+++ b/tests/elements/meson/elements/base/alpine-image.bst
@@ -0,0 +1,6 @@
+kind: import
+description: Import an alpine image as the platform
+sources:
+- kind: tar
+  url: alpine:integration-tests-base.v1.x86_64.tar.xz
+  ref: 3eb559250ba82b64a68d86d0636a6b127aa5f6d25d3601a79f79214dc9703639
diff --git a/tests/elements/meson/elements/base/base-configure.bst b/tests/elements/meson/elements/base/base-configure.bst
new file mode 100644
index 0000000..5323004
--- /dev/null
+++ b/tests/elements/meson/elements/base/base-configure.bst
@@ -0,0 +1,28 @@
+kind: script
+depends:
+- filename: base/install-dpkg.bst
+  type: build
+
+variables:
+  install-root: /
+
+config:
+
+  commands:
+  - |
+    # Avoid some chowns which fail at dpkg configure time
+    #
+    mv /bin/chown /bin/chown.real
+    ln -s true /bin/chown
+
+  - |
+    # This is expected to fail, but will configure everything we need
+    # at least for the purpose of building, other dpkg scripts which
+    # require real root privileges will always fail here.
+    DEBIAN_FRONTEND=noninteractive dpkg --configure -a --abort-after=100000 || exit 0
+
+  - |
+    # Restore chown
+    #
+    rm -f /bin/chown
+    mv /bin/chown.real /bin/chown
diff --git a/tests/elements/meson/elements/base/install-dpkg.bst b/tests/elements/meson/elements/base/install-dpkg.bst
new file mode 100644
index 0000000..858044a
--- /dev/null
+++ b/tests/elements/meson/elements/base/install-dpkg.bst
@@ -0,0 +1,15 @@
+kind: manual
+depends:
+- filename: base/alpine-image.bst
+  type: build
+sources:
+- kind: git
+  url: https://gitlab.com/BuildStream/buildstream-sysroots.git
+  track: dpkg-build
+  ref: ecf14954e4298ce5495f701464339162fad73f30
+config:
+  install-commands:
+  - tar xf dpkg-build-sysroot.tar.xz -C %{install-root} --no-same-owner
+  strip-commands:
+    # For some reason, the strip commands were hanging...
+    - echo "none"
diff --git a/tests/elements/meson/elements/base/meson.bst b/tests/elements/meson/elements/base/meson.bst
new file mode 100644
index 0000000..bfeb945
--- /dev/null
+++ b/tests/elements/meson/elements/base/meson.bst
@@ -0,0 +1,9 @@
+kind: setuptools
+
+depends:
+- filename: base/alpine-image.bst
+
+sources:
+- kind: git
+  url: https://github.com/mesonbuild/meson.git
+  ref: 0.51.2-0-g6857936c592d6f9608add5a74a51ee405aaddc0d
diff --git a/tests/elements/meson/elements/base/ninja.bst b/tests/elements/meson/elements/base/ninja.bst
new file mode 100644
index 0000000..74afb16
--- /dev/null
+++ b/tests/elements/meson/elements/base/ninja.bst
@@ -0,0 +1,15 @@
+kind: manual
+
+depends:
+- filename: base/alpine-image.bst
+
+config:
+  install-commands:
+  - |
+    install -D -m 0755 ninja %{install-root}%{bindir}/ninja
+
+sources:
+- kind: zip
+  url: https://github.com/ninja-build/ninja/releases/download/v1.9.0/ninja-linux.zip
+  ref: 1b1235f2b0b4df55ac6d80bbe681ea3639c9d2c505c7ff2159a3daf63d196305
+  base-dir: ''
diff --git a/tests/elements/meson/elements/mesonconfroothello.bst b/tests/elements/meson/elements/mesonconfroothello.bst
new file mode 100644
index 0000000..3455f3b
--- /dev/null
+++ b/tests/elements/meson/elements/mesonconfroothello.bst
@@ -0,0 +1,15 @@
+kind: meson
+description: meson test
+
+depends:
+- base.bst
+
+sources:
+- kind: tar
+  directory: Source
+  url: project_dir:/files/mesonhello.tar.gz
+  ref: dbfa22f02c82c83493596cde465a7ed4c39d8d412da3d8ac3b24c3045781f3b2
+
+variables:
+  conf-root: "%{build-root}/Source"
+  command-subdir: build
diff --git a/tests/elements/meson/elements/mesonhello.bst b/tests/elements/meson/elements/mesonhello.bst
new file mode 100644
index 0000000..816c57c
--- /dev/null
+++ b/tests/elements/meson/elements/mesonhello.bst
@@ -0,0 +1,9 @@
+kind: meson
+
+depends:
+- filename: base.bst
+
+sources:
+- kind: tar
+  url: project_dir:/files/mesonhello.tar.gz
+  ref: dbfa22f02c82c83493596cde465a7ed4c39d8d412da3d8ac3b24c3045781f3b2
diff --git a/tests/elements/meson/files/mesonhello.tar.gz b/tests/elements/meson/files/mesonhello.tar.gz
new file mode 100644
index 0000000..32fe461
Binary files /dev/null and b/tests/elements/meson/files/mesonhello.tar.gz differ
diff --git a/tests/elements/meson/project.conf b/tests/elements/meson/project.conf
new file mode 100644
index 0000000..52ffbf0
--- /dev/null
+++ b/tests/elements/meson/project.conf
@@ -0,0 +1,16 @@
+# test project config
+name: test
+min-version: 2.0
+
+element-path: elements
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  elements:
+  - meson
+  - setuptools
+
+aliases:
+  alpine: https://bst-integration-test-images.ams3.cdn.digitaloceanspaces.com/
+  project_dir: file://{project_dir}


[buildstream-plugins] 19/49: setup.py: Adding initial setup.py

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 0071b707705fc71075ac3f5c075808b92659d989
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Mar 19 14:44:39 2022 +0900

    setup.py: Adding initial setup.py
---
 setup.py | 86 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 86 insertions(+)

diff --git a/setup.py b/setup.py
new file mode 100755
index 0000000..a988e40
--- /dev/null
+++ b/setup.py
@@ -0,0 +1,86 @@
+#!/usr/bin/env python3
+#
+#  Copyright (C) 2022 Codethink Limited
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+
+import os
+import sys
+
+try:
+    from setuptools import setup, find_packages
+except ImportError:
+    print(
+        "BuildStream requires setuptools in order to locate plugins. Install "
+        "it using your package manager (usually python3-setuptools) or via "
+        "pip (pip3 install setuptools)."
+    )
+    sys.exit(1)
+
+###############################################################################
+#                             Parse README                                    #
+###############################################################################
+with open(os.path.join(os.path.dirname(os.path.realpath(__file__)), "README.rst"), encoding="utf-8",) as readme:
+    long_description = readme.read()
+
+
+setup(
+    name="buildstream-plugins",
+    version="1.91.0",
+    author="BuildStream Developers",
+    author_email="dev@buildstream.apache.org",
+    classifiers=[
+        "Environment :: Console",
+        "Intended Audience :: Developers",
+        "License :: OSI Approved :: Apache Software License",
+        "Operating System :: POSIX",
+        "Programming Language :: Python :: 3",
+        "Programming Language :: Python :: 3.7",
+        "Programming Language :: Python :: 3.8",
+        "Programming Language :: Python :: 3.9",
+        "Programming Language :: Python :: 3.10",
+        "Topic :: Software Development :: Build Tools",
+    ],
+    description="A collection of plugins for BuildStream.",
+    long_description=long_description,
+    long_description_content_type="text/x-rst; charset=UTF-8",
+    license="Apache License Version 2.0",
+    url="https://buildstream.build",
+    project_urls={"Documentation": "https://docs.buildstream.build/",},
+    package_dir={"": "src"},
+    packages=find_packages(where="src"),
+    include_package_data=True,
+    entry_points={
+        "buildstream.plugins.elements": [
+            "autotools = buildstream_plugins.elements.autotools",
+            "cmake = buildstream_plugins.elements.cmake",
+            "make = buildstream_plugins.elements.make",
+            "meson = buildstream_plugins.elements.meson",
+            "pip = buildstream_plugins.elements.pip",
+            "setuptools = buildstream_plugins.elements.setuptools",
+        ],
+        "buildstream.plugins.sources": [
+            "bzr = buildstream_plugins.sources.bzr",
+            "cargo = buildstream_plugins.sources.cargo",
+            "docker = buildstream_plugins.sources.docker",
+            "git = buildstream_plugins.sources.git",
+            "patch = buildstream_plugins.sources.patch",
+            "pip = buildstream_plugins.sources.pip",
+            "zip = buildstream_plugins.sources.zip",
+        ],
+    },
+    extras_require={"cargo": ["toml"],},
+    zip_safe=False,
+)
+# eof setup()
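
The entry points registered above are what the "pip" plugin origin discovers
at load time; a project opts into individual plugins by naming the package
and the plugins it wants. A minimal sketch of a consuming project.conf,
mirroring the test project.conf files shown elsewhere in this series:

    plugins:
    - origin: pip
      package-name: buildstream-plugins
      elements:
      - make
      sources:
      - git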


[buildstream-plugins] 23/49: tox.ini: Adding initial tox.ini

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 8e16746b2df0a0a1e793860682b02c2f128c5e82
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Mar 19 15:46:21 2022 +0900

    tox.ini: Adding initial tox.ini
    
    For now only formatting and lint checks pass
---
 tox.ini | 104 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 104 insertions(+)

diff --git a/tox.ini b/tox.ini
new file mode 100644
index 0000000..44e0d0c
--- /dev/null
+++ b/tox.ini
@@ -0,0 +1,104 @@
+#
+# Tox global configuration
+#
+[tox]
+envlist = py{37,38,39,310}-{bst-fixed,bst-master}
+skip_missing_interpreters = true
+
+#
+# Defaults for all environments
+#
+# Anything specified here is inherited by the sections
+#
+[testenv]
+commands =
+    bst --version
+    pytest --basetemp {envtmpdir} {posargs}
+deps =
+    -rrequirements/test-requirements.txt
+    -rrequirements/plugin-requirements.txt
+    git+https://github.com/apache/buildstream@{env:BST_VERSION}
+
+passenv =
+    ARTIFACT_CACHE_SERVICE
+    BST_FORCE_BACKEND
+    BST_FORCE_SANDBOX
+    GI_TYPELIB_PATH
+    INTEGRATION_CACHE
+    http_proxy
+    HTTP_PROXY
+    https_proxy
+    HTTPS_PROXY
+    no_proxy
+    NO_PROXY
+    PYTEST_*
+    REMOTE_EXECUTION_SERVICE
+    SOURCE_CACHE_SERVICE
+    SSL_CERT_FILE
+
+#
+# These keys are not inherited by any other sections
+#
+setenv =
+    py{37,38,39,310}: XDG_CACHE_HOME = {envtmpdir}/cache
+    py{37,38,39,310}: XDG_CONFIG_HOME = {envtmpdir}/config
+    py{37,38,39,310}: XDG_DATA_HOME = {envtmpdir}/share
+    !master: BST_VERSION = 10be3784afa924d5af63958fea9354d0323eadad
+    master: BST_VERSION = master
+
+whitelist_externals =
+    py{37,38,39,310}:
+        mv
+        mkdir
+
+#
+# Code formatters
+#
+[testenv:format]
+skip_install = True
+deps =
+    black==19.10b0
+commands =
+    black {posargs: src tests setup.py}
+
+#
+# Code format checkers
+#
+[testenv:format-check]
+skip_install = True
+deps =
+    black==19.10b0
+commands =
+    black --check --diff {posargs: src tests setup.py}
+
+#
+# Running linters
+#
+[testenv:lint]
+commands_pre =
+    # Build C extensions to allow Pylint to analyse them
+    {envpython} setup.py build_ext --inplace
+
+commands =
+    pylint {posargs: buildstream_plugins tests setup.py}
+
+#
+# Building documentation
+#
+[testenv:docs]
+commands =
+    make -C doc
+# sphinx_rtd_theme < 0.4.2 breaks search functionality for Sphinx >= 1.8
+deps =
+    sphinx >= 1.8.5
+    sphinx_rtd_theme >= 0.4.2
+    -rrequirements/plugin-requirements.txt
+    git+https://github.com/apache/buildstream@{env:BST_VERSION}
+passenv =
+    BST_FORCE_SESSION_REBUILD
+    BST_SOURCE_CACHE
+    HOME
+    LANG
+    LC_ALL
+whitelist_externals =
+    make


[buildstream-plugins] 43/49: Merge pull request #2 from gtristan/tristan/fix-ci-ubuntu-version

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit ce223fc677bca2dddd35dacfb6074c731f23c5fd
Merge: 1c51f01 817e24b
Author: Tristan Van Berkom <tr...@codethink.co.uk>
AuthorDate: Tue Apr 5 14:05:01 2022 +0900

    Merge pull request #2 from gtristan/tristan/fix-ci-ubuntu-version
    
    .github/worflows: Use ubuntu 18.04 instead of 20.04

 .github/workflows/ci.yml                  |  4 +-
 .github/workflows/merge.yml               |  4 +-
 setup.py                                  | 13 ++++-
 src/buildstream_plugins/sources/bzr.py    | 43 +++++++++++---
 src/buildstream_plugins/sources/cargo.py  | 34 +++++++++--
 src/buildstream_plugins/sources/docker.py | 39 +++++++++----
 src/buildstream_plugins/sources/git.py    | 96 ++++++++++++++++++++++++++-----
 src/buildstream_plugins/sources/patch.py  | 12 +++-
 src/buildstream_plugins/sources/pip.py    |  5 +-
 tests/conftest.py                         |  5 +-
 tests/elements/cmake.py                   | 10 +++-
 tests/elements/make.py                    |  5 +-
 tests/elements/meson.py                   | 10 +++-
 tests/elements/pip.py                     | 26 +++++++--
 tests/elements/setuptools.py              |  5 +-
 tests/sources/git.py                      |  5 +-
 tests/sources/patch.py                    |  5 +-
 tests/sources/pip.py                      | 31 ++++++++--
 tests/sources/pip_build.py                | 25 ++++++--
 tests/testutils/python_repo.py            | 14 ++++-
 tests/testutils/repo/bzrrepo.py           | 12 +++-
 tests/testutils/repo/gitrepo.py           | 14 ++++-
 tox.ini                                   |  4 +-
 23 files changed, 341 insertions(+), 80 deletions(-)


[buildstream-plugins] 34/49: tests/elements/setuptools.py: Adding tests for setuptools element

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 810506e8cfb30508ef55ddcd3b257fe964f27ce3
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 17:04:54 2022 +0900

    tests/elements/setuptools.py: Adding tests for setuptools element
---
 tests/elements/setuptools.py                       |  46 +++++++++++++++++++++
 tests/elements/setuptools/elements/base.bst        |   3 ++
 .../setuptools/elements/base/alpine-image.bst      |   6 +++
 .../setuptools/elements/setuptoolshello.bst        |   9 ++++
 .../setuptools/files/setuptoolshello.tar.gz        | Bin 0 -> 1342 bytes
 tests/elements/setuptools/project.conf             |  15 +++++++
 6 files changed, 79 insertions(+)

diff --git a/tests/elements/setuptools.py b/tests/elements/setuptools.py
new file mode 100644
index 0000000..837aa2b
--- /dev/null
+++ b/tests/elements/setuptools.py
@@ -0,0 +1,46 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import pytest
+
+from buildstream._testing.runcli import cli_integration as cli  # pylint: disable=unused-import
+from buildstream._testing.integration import integration_cache  # pylint: disable=unused-import
+from buildstream._testing.integration import assert_contains
+from buildstream._testing._utils.site import HAVE_SANDBOX
+
+pytestmark = pytest.mark.integration
+
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "setuptools")
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_setuptools_build(cli, datafiles):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_name = "setuptoolshello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    assert result.exit_code == 0
+
+    assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_setuptools_run(cli, datafiles):
+    project = str(datafiles)
+    element_name = "setuptoolshello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["shell", element_name, "/usr/bin/hello"])
+    assert result.exit_code == 0
+
+    assert result.output == """Hello World!\n"""
diff --git a/tests/elements/setuptools/elements/base.bst b/tests/elements/setuptools/elements/base.bst
new file mode 100644
index 0000000..da7c70b
--- /dev/null
+++ b/tests/elements/setuptools/elements/base.bst
@@ -0,0 +1,3 @@
+kind: stack
+depends:
+- base/alpine-image.bst
diff --git a/tests/elements/setuptools/elements/base/alpine-image.bst b/tests/elements/setuptools/elements/base/alpine-image.bst
new file mode 100644
index 0000000..f8e00ba
--- /dev/null
+++ b/tests/elements/setuptools/elements/base/alpine-image.bst
@@ -0,0 +1,6 @@
+kind: import
+description: Import an alpine image as the platform
+sources:
+- kind: tar
+  url: alpine:integration-tests-base.v1.x86_64.tar.xz
+  ref: 3eb559250ba82b64a68d86d0636a6b127aa5f6d25d3601a79f79214dc9703639
diff --git a/tests/elements/setuptools/elements/setuptoolshello.bst b/tests/elements/setuptools/elements/setuptoolshello.bst
new file mode 100644
index 0000000..924664d
--- /dev/null
+++ b/tests/elements/setuptools/elements/setuptoolshello.bst
@@ -0,0 +1,9 @@
+kind: setuptools
+
+depends:
+- filename: base.bst
+
+sources:
+- kind: tar
+  url: project_dir:/files/setuptoolshello.tar.gz
+  ref: c281d5650a104b624c77676c30a456ba1c670654bcf6f24edccb9b9848513cae
diff --git a/tests/elements/setuptools/files/setuptoolshello.tar.gz b/tests/elements/setuptools/files/setuptoolshello.tar.gz
new file mode 100644
index 0000000..73eba4f
Binary files /dev/null and b/tests/elements/setuptools/files/setuptoolshello.tar.gz differ
diff --git a/tests/elements/setuptools/project.conf b/tests/elements/setuptools/project.conf
new file mode 100644
index 0000000..78b079a
--- /dev/null
+++ b/tests/elements/setuptools/project.conf
@@ -0,0 +1,15 @@
+# test project config
+name: test
+min-version: 2.0
+
+element-path: elements
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  elements:
+  - setuptools
+
+aliases:
+  alpine: https://bst-integration-test-images.ams3.cdn.digitaloceanspaces.com/
+  project_dir: file://{project_dir}


[buildstream-plugins] 11/49: Initially adding make element

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 2e26118765815f3592e8e5a2818e420dd5c10fc3
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 17:16:45 2022 +0900

    Initially adding make element
    
    From bst-plugins-experimental
---
 src/buildstream_plugins/elements/make.py   | 52 ++++++++++++++++++++++++++++++
 src/buildstream_plugins/elements/make.yaml | 48 +++++++++++++++++++++++++++
 2 files changed, 100 insertions(+)

diff --git a/src/buildstream_plugins/elements/make.py b/src/buildstream_plugins/elements/make.py
new file mode 100644
index 0000000..4a94132
--- /dev/null
+++ b/src/buildstream_plugins/elements/make.py
@@ -0,0 +1,52 @@
+#
+#  Copyright Bloomberg Finance LP
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Ed Baunton <eb...@bloomberg.net>
+
+"""
+make - Make build element
+=========================
+This is a `BuildElement
+<https://docs.buildstream.build/master/buildstream.buildelement.html#module-buildstream.buildelement>`_
+implementation for GNU make based builds.
+
+.. note::
+
+   The ``make`` element is available since `format version 9
+   <https://docs.buildstream.build/master/format_project.html#project-format-version>`_
+
+Here is the default configuration for the ``make`` element in full:
+
+  .. literalinclude:: ../../../src/buildstream_plugins/elements/make.yaml
+     :language: yaml
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.buildelement.html#core-buildelement-builtins>`_ for
+details on common configuration options for build elements.
+"""
+
+from buildstream import BuildElement
+
+
+# Element implementation for the 'make' kind.
+class MakeElement(BuildElement):
+
+    BST_MIN_VERSION = "2.0"
+
+
+# Plugin entry point
+def setup():
+    return MakeElement
diff --git a/src/buildstream_plugins/elements/make.yaml b/src/buildstream_plugins/elements/make.yaml
new file mode 100644
index 0000000..58e2da4
--- /dev/null
+++ b/src/buildstream_plugins/elements/make.yaml
@@ -0,0 +1,48 @@
+# make default configurations
+
+variables:
+  make-args: >-
+    PREFIX="%{prefix}"
+  make-install-args: >-
+    %{make-args}
+    DESTDIR="%{install-root}"
+    install
+  make: make %{make-args}
+  make-install: make -j1 %{make-install-args}
+
+  # Set this if the sources cannot handle parallelization.
+  #
+  # notparallel: True
+
+config:
+
+  # Commands for building the software
+  #
+  build-commands:
+  - |
+    %{make}
+
+  # Commands for installing the software into a
+  # destination folder
+  #
+  install-commands:
+  - |
+    %{make-install}
+
+  # Commands for stripping debugging information out of
+  # installed binaries
+  #
+  strip-commands:
+  - |
+    %{strip-binaries}
+
+# Use max-jobs CPUs for building and enable verbosity
+environment:
+  MAKEFLAGS: -j%{max-jobs}
+  V: 1
+
+# And don't consider MAKEFLAGS or V as something which may
+# affect build output.
+environment-nocache:
+- MAKEFLAGS
+- V


[buildstream-plugins] 47/49: tests/sources/patch: Use locally defined patch source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 0720a3735fe64ca159f3f2d489c63433dce677fa
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Tue Apr 5 18:36:17 2022 +0900

    tests/sources/patch: Use locally defined patch source
    
    Instead of the buildstream core patch source which will be removed
---
 tests/sources/patch/basic/project.conf                 | 6 ++++++
 tests/sources/patch/different-strip-level/project.conf | 6 ++++++
 tests/sources/patch/invalid-relative-path/project.conf | 6 ++++++
 tests/sources/patch/multiple-patches/project.conf      | 6 ++++++
 tests/sources/patch/separate-patch-dir/project.conf    | 6 ++++++
 5 files changed, 30 insertions(+)

diff --git a/tests/sources/patch/basic/project.conf b/tests/sources/patch/basic/project.conf
index dc34380..944e7f6 100644
--- a/tests/sources/patch/basic/project.conf
+++ b/tests/sources/patch/basic/project.conf
@@ -1,3 +1,9 @@
 # Basic project
 name: foo
 min-version: 2.0
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  sources:
+  - patch
diff --git a/tests/sources/patch/different-strip-level/project.conf b/tests/sources/patch/different-strip-level/project.conf
index dc34380..944e7f6 100644
--- a/tests/sources/patch/different-strip-level/project.conf
+++ b/tests/sources/patch/different-strip-level/project.conf
@@ -1,3 +1,9 @@
 # Basic project
 name: foo
 min-version: 2.0
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  sources:
+  - patch
diff --git a/tests/sources/patch/invalid-relative-path/project.conf b/tests/sources/patch/invalid-relative-path/project.conf
index dc34380..944e7f6 100644
--- a/tests/sources/patch/invalid-relative-path/project.conf
+++ b/tests/sources/patch/invalid-relative-path/project.conf
@@ -1,3 +1,9 @@
 # Basic project
 name: foo
 min-version: 2.0
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  sources:
+  - patch
diff --git a/tests/sources/patch/multiple-patches/project.conf b/tests/sources/patch/multiple-patches/project.conf
index dc34380..944e7f6 100644
--- a/tests/sources/patch/multiple-patches/project.conf
+++ b/tests/sources/patch/multiple-patches/project.conf
@@ -1,3 +1,9 @@
 # Basic project
 name: foo
 min-version: 2.0
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  sources:
+  - patch
diff --git a/tests/sources/patch/separate-patch-dir/project.conf b/tests/sources/patch/separate-patch-dir/project.conf
index dc34380..944e7f6 100644
--- a/tests/sources/patch/separate-patch-dir/project.conf
+++ b/tests/sources/patch/separate-patch-dir/project.conf
@@ -1,3 +1,9 @@
 # Basic project
 name: foo
 min-version: 2.0
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  sources:
+  - patch


[buildstream-plugins] 49/49: Merge pull request #3 from gtristan/tristan/use-zip-from-bst-plugins-experimental

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 96206318b2cade5329a64b0f15b362ed57222086
Merge: ce223fc 8f80351
Author: Tristan Van Berkom <tr...@codethink.co.uk>
AuthorDate: Tue Apr 5 19:02:14 2022 +0900

    Merge pull request #3 from gtristan/tristan/use-zip-from-bst-plugins-experimental
    
    tests/elements/{cmake,meson}: Use zip source from bst-plugins-experimental

 .../elements/cmake/elements/bst-plugins-experimental-junction.bst  | 5 +++++
 tests/elements/cmake/project.conf                                  | 7 +++++++
 .../elements/meson/elements/bst-plugins-experimental-junction.bst  | 5 +++++
 tests/elements/meson/project.conf                                  | 7 +++++++
 tests/sources/bzr/project.conf                                     | 6 ++++++
 tests/sources/git/project-override/project.conf                    | 7 +++++++
 tests/sources/git/template/project.conf                            | 6 ++++++
 tests/sources/patch/basic/project.conf                             | 6 ++++++
 tests/sources/patch/different-strip-level/project.conf             | 6 ++++++
 tests/sources/patch/invalid-relative-path/project.conf             | 6 ++++++
 tests/sources/patch/multiple-patches/project.conf                  | 6 ++++++
 tests/sources/patch/separate-patch-dir/project.conf                | 6 ++++++
 tox.ini                                                            | 2 +-
 13 files changed, 74 insertions(+), 1 deletion(-)


[buildstream-plugins] 18/49: README.rst: Adding initial README

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit f460744db34dabd525cf6e0454a5135b7e0ac7dd
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Mar 19 14:34:37 2022 +0900

    README.rst: Adding initial README
---
 README.rst | 10 ++++++++++
 1 file changed, 10 insertions(+)

diff --git a/README.rst b/README.rst
new file mode 100644
index 0000000..a8f751c
--- /dev/null
+++ b/README.rst
@@ -0,0 +1,10 @@
+BuildStream Plugins
+===================
+A collection of plugins for the BuildStream project.
+
+
+How to use plugins
+------------------
+Plugins must be declared in your BuildStream project.conf before use in your
+project. For instructions on how to load plugins in your BuildStream project,
+please consult the `plugin loading documentation <https://docs.buildstream.build/master/format_project.html#loading-plugins>`_.


[buildstream-plugins] 39/49: .github: Adding run-ci.sh and the ci.docker-compose.yml

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit d54c1efc677cb4911b14f3c934b443b11d61a8ec
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 25 14:35:50 2022 +0900

    .github: Adding run-ci.sh and the ci.docker-compose.yml
    
    This doesn't add CI workflows but adds the underlying mechanics needed to
    run the tox test suite under the various docker containers.
---
 .github/common.env                    |  5 +++
 .github/compose/ci.docker-compose.yml | 58 ++++++++++++++++++++++++
 .github/run-ci.sh                     | 83 +++++++++++++++++++++++++++++++++++
 3 files changed, 146 insertions(+)

diff --git a/.github/common.env b/.github/common.env
new file mode 100644
index 0000000..02eea32
--- /dev/null
+++ b/.github/common.env
@@ -0,0 +1,5 @@
+# Shared common variables
+
+CI_IMAGE_VERSION=master-488745436
+CI_TOXENV_ALL=py37,py38,py39,py310
+CI_TOXENV_MASTER=py37-bst-master,py38-bst-master,py39-bst-master,py310-bst-master
diff --git a/.github/compose/ci.docker-compose.yml b/.github/compose/ci.docker-compose.yml
new file mode 100644
index 0000000..1251079
--- /dev/null
+++ b/.github/compose/ci.docker-compose.yml
@@ -0,0 +1,58 @@
+version: '3.4'
+
+x-tests-template: &tests-template
+    image: registry.gitlab.com/buildstream/buildstream-docker-images/testsuite-fedora:35-${CI_IMAGE_VERSION:-latest}
+    command: tox -vvvvv -- --color=yes --integration
+    environment:
+      TOXENV: ${CI_TOXENV_ALL}
+
+    # Enable privileges to run the sandbox
+    #
+    privileged: true
+    devices:
+      - /dev/fuse:/dev/fuse
+
+    # Mount the local directory and set the working directory
+    # to run the tests from.
+    #
+    volumes:
+      - ../..:/home/testuser/buildstream
+    working_dir: /home/testuser/buildstream
+
+
+services:
+
+  fedora-34:
+    <<: *tests-template
+    image: registry.gitlab.com/buildstream/buildstream-docker-images/testsuite-fedora:34-${CI_IMAGE_VERSION:-latest}
+
+  fedora-35:
+    <<: *tests-template
+    image: registry.gitlab.com/buildstream/buildstream-docker-images/testsuite-fedora:35-${CI_IMAGE_VERSION:-latest}
+
+  debian-10:
+    <<: *tests-template
+    image: registry.gitlab.com/buildstream/buildstream-docker-images/testsuite-debian:10-${CI_IMAGE_VERSION:-latest}
+
+  # Ensure that tests also pass in the absence of a sandboxing tool
+  fedora-missing-deps:
+    <<: *tests-template
+    image: registry.gitlab.com/buildstream/buildstream-docker-images/testsuite-fedora:minimal-${CI_IMAGE_VERSION:-latest}
+
+  # Test against the master version of BuildStream
+  bst-master:
+    <<: *tests-template
+    environment:
+      TOXENV: ${CI_TOXENV_MASTER}
+
+  docs:
+    <<: *tests-template
+    command: tox -e docs
+
+  lint:
+    <<: *tests-template
+    command: tox -e lint,format-check
+
+  mypy:
+    <<: *tests-template
+    command: tox -e mypy
diff --git a/.github/run-ci.sh b/.github/run-ci.sh
new file mode 100755
index 0000000..d4f60e3
--- /dev/null
+++ b/.github/run-ci.sh
@@ -0,0 +1,83 @@
+#!/bin/bash
+
+topdir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
+
+function usage () {
+    echo "Usage: "
+    echo "  run-ci.sh [OPTIONS] [TEST NAME [TEST NAME...]]"
+    echo
+    echo "Runs the CI tests locally using docker"
+    echo
+    echo "The test names are based on the names of tests in the CI yaml files"
+    echo
+    echo "If no test names are specified, all tests will be run"
+    echo
+    echo "Options:"
+    echo
+    echo "  -h --help      Display this help message and exit"
+    echo "  "
+    exit 1;
+}
+
+arg_service=false
+
+while : ; do
+    case "$1" in 
+	-h|--help)
+	    usage;
+	    shift ;;
+	-s|--service)
+	    arg_service=true
+	    shift ;;
+	*)
+	    break ;;
+    esac
+done
+
+test_names="${@}"
+
+
+# We need to give ownership to the docker image user `testuser`,
+# chances are high that this will be the same UID as the primary
+# user on this host
+#
+user_uid="$(id -u)"
+user_gid="$(id -g)"
+if [ "${user_uid}" -ne "1000" ] || [ "${user_gid}" -ne "1000" ]; then
+    sudo chown -R 1000:1000 "${topdir}/.."
+fi
+
+
+# runTest()
+#
+#  $1 = test name
+#
+function runTest() {
+    test_name=$1
+
+    # Run docker-compose from its own directory, because it uses
+    # relative paths
+    cd "${topdir}/compose"
+    docker-compose \
+        --env-file ${topdir}/common.env \
+        --file ${topdir}/compose/ci.docker-compose.yml \
+        run "${test_name}"
+    return $?
+}
+
+
+if [ -z "${test_names}" ]; then
+    for test_name in "mypy debian-10 fedora-34 fedora-35 fedora-missing-deps"; do
+	if ! runTest "${test_name}"; then
+	    echo "Tests failed"
+	    exit 1
+	fi
+    done
+else
+    for test_name in "${test_names}"; do
+	if ! runTest "${test_name}"; then
+	    echo "Tests failed"
+	    exit 1
+	fi
+    done
+fi
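
New test environments follow the shared template in ci.docker-compose.yml, so
adding one is mostly a matter of pointing at another image. A minimal sketch
of a hypothetical fedora-36 service (the image tag is an assumption):

      fedora-36:
        <<: *tests-template
        image: registry.gitlab.com/buildstream/buildstream-docker-images/testsuite-fedora:36-${CI_IMAGE_VERSION:-latest}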


[buildstream-plugins] 32/49: tests/elements/make.py: Adding tests for make element

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 2ee84628a867f5ca15996a8d5533cb97dfc3a41a
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 16:29:39 2022 +0900

    tests/elements/make.py: Adding tests for make element
---
 tests/elements/make.py                             |  48 +++++++++++++++++++++
 tests/elements/make/elements/base.bst              |   3 ++
 tests/elements/make/elements/base/alpine-image.bst |   6 +++
 tests/elements/make/elements/makehello.bst         |  10 +++++
 tests/elements/make/files/makehello.tar.gz         | Bin 0 -> 432 bytes
 tests/elements/make/project.conf                   |  15 +++++++
 6 files changed, 82 insertions(+)

diff --git a/tests/elements/make.py b/tests/elements/make.py
new file mode 100644
index 0000000..4edb513
--- /dev/null
+++ b/tests/elements/make.py
@@ -0,0 +1,48 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import pytest
+
+from buildstream._testing.integration import assert_contains
+from buildstream._testing.integration import integration_cache  # pylint: disable=unused-import
+from buildstream._testing.runcli import cli_integration as cli  # pylint: disable=unused-import
+from buildstream._testing._utils.site import HAVE_SANDBOX
+
+pytestmark = pytest.mark.integration
+
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "make")
+
+
+# Test that a make build 'works' - we use the make sample
+# makehello project for this.
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_make_build(cli, datafiles):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_name = "makehello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    assert result.exit_code == 0
+
+    assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
+
+
+# Test running an executable built with make
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_make_run(cli, datafiles):
+    project = str(datafiles)
+    element_name = "makehello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["shell", element_name, "/usr/bin/hello"])
+    assert result.exit_code == 0
+    assert result.output == "Hello, world\n"
diff --git a/tests/elements/make/elements/base.bst b/tests/elements/make/elements/base.bst
new file mode 100644
index 0000000..da7c70b
--- /dev/null
+++ b/tests/elements/make/elements/base.bst
@@ -0,0 +1,3 @@
+kind: stack
+depends:
+- base/alpine-image.bst
diff --git a/tests/elements/make/elements/base/alpine-image.bst b/tests/elements/make/elements/base/alpine-image.bst
new file mode 100644
index 0000000..f8e00ba
--- /dev/null
+++ b/tests/elements/make/elements/base/alpine-image.bst
@@ -0,0 +1,6 @@
+kind: import
+description: Import an alpine image as the platform
+sources:
+- kind: tar
+  url: alpine:integration-tests-base.v1.x86_64.tar.xz
+  ref: 3eb559250ba82b64a68d86d0636a6b127aa5f6d25d3601a79f79214dc9703639
diff --git a/tests/elements/make/elements/makehello.bst b/tests/elements/make/elements/makehello.bst
new file mode 100644
index 0000000..4b5c5ac
--- /dev/null
+++ b/tests/elements/make/elements/makehello.bst
@@ -0,0 +1,10 @@
+kind: make
+description: make test
+
+depends:
+- base.bst
+
+sources:
+- kind: tar
+  url: project_dir:/files/makehello.tar.gz
+  ref: fd342a36503a0a0dd37b81ddb4d2b78bd398d912d813339e0de44a6b6c393b8e
diff --git a/tests/elements/make/files/makehello.tar.gz b/tests/elements/make/files/makehello.tar.gz
new file mode 100644
index 0000000..d0edcb2
Binary files /dev/null and b/tests/elements/make/files/makehello.tar.gz differ
diff --git a/tests/elements/make/project.conf b/tests/elements/make/project.conf
new file mode 100644
index 0000000..039d500
--- /dev/null
+++ b/tests/elements/make/project.conf
@@ -0,0 +1,15 @@
+# test project config
+name: test
+min-version: 2.0
+
+element-path: elements
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  elements:
+  - make
+
+aliases:
+  alpine: https://bst-integration-test-images.ams3.cdn.digitaloceanspaces.com/
+  project_dir: file://{project_dir}


[buildstream-plugins] 13/49: Initially adding setuptools element

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit c108d3ef4c7221f63acb6d93a652f4583b602c0b
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 17:23:49 2022 +0900

    Initially adding setuptools element
    
    Previously the distutils element from bst-plugins-experimental
---
 src/buildstream_plugins/elements/setuptools.py   | 46 +++++++++++++++++++++++
 src/buildstream_plugins/elements/setuptools.yaml | 47 ++++++++++++++++++++++++
 2 files changed, 93 insertions(+)

diff --git a/src/buildstream_plugins/elements/setuptools.py b/src/buildstream_plugins/elements/setuptools.py
new file mode 100644
index 0000000..e0de141
--- /dev/null
+++ b/src/buildstream_plugins/elements/setuptools.py
@@ -0,0 +1,46 @@
+#
+#  Copyright (C) 2016 Codethink Limited
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Tristan Van Berkom <tr...@codethink.co.uk>
+
+"""
+setuptools - Python setuptools element
+======================================
+A `BuildElement
+<https://docs.buildstream.build/master/buildstream.buildelement.html#module-buildstream.buildelement>`_
+implementation for using python setuptools
+
+The setuptools default configuration:
+  .. literalinclude:: ../../../src/buildstream_plugins/elements/setuptools.yaml
+     :language: yaml
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.buildelement.html#core-buildelement-builtins>`_ for
+details on common configuration options for build elements.
+"""
+
+from buildstream import BuildElement
+
+
+# Element implementation for the python 'setuptools' kind.
+class SetuptoolsElement(BuildElement):
+
+    BST_MIN_VERSION = "2.0"
+
+
+# Plugin entry point
+def setup():
+    return SetuptoolsElement
diff --git a/src/buildstream_plugins/elements/setuptools.yaml b/src/buildstream_plugins/elements/setuptools.yaml
new file mode 100644
index 0000000..c407937
--- /dev/null
+++ b/src/buildstream_plugins/elements/setuptools.yaml
@@ -0,0 +1,47 @@
+# Default python distutils configuration
+
+variables:
+
+  # When building for python2 distutils, simply
+  # override this in the element declaration
+  python: python3
+
+  python-build: |
+
+    %{python} %{conf-root}/setup.py build
+
+  install-args: |
+
+    --prefix "%{prefix}" \
+    --root "%{install-root}"
+
+  python-install: |
+
+    %{python} %{conf-root}/setup.py install %{install-args}
+
+
+config:
+
+  # Commands for configuring the software
+  #
+  configure-commands: []
+
+  # Commands for building the software
+  #
+  build-commands:
+  - |
+    %{python-build}
+
+  # Commands for installing the software into a
+  # destination folder
+  #
+  install-commands:
+  - |
+    %{python-install}
+
+  # Commands for stripping debugging information out of
+  # installed binaries
+  #
+  strip-commands:
+  - |
+    %{strip-binaries}
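
As a usage sketch (not part of this commit), an element consuming this plugin could look like the following; the source URL and ref are placeholders, and the python variable override follows the hint in the defaults above for building python2 distutils projects:

    kind: setuptools
    description: Hypothetical element building a setup.py project
    depends:
    - base.bst
    variables:
      # Default is python3; override per the comment in setuptools.yaml
      python: python2
    sources:
    - kind: tar
      url: project_dir:/files/hello.tar.gz   # placeholder
      ref: <sha256 of the tarball>           # placeholder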


[buildstream-plugins] 42/49: tox.ini: Updating black to version 22.3.0

commit 817e24b711f5bb220b83b9ddaee2042397d432d8
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Thu Mar 31 18:14:29 2022 +0900

    tox.ini: Updating black to version 22.3.0
    
    This is required to avoid an exception that recently started occurring when
    running black, as outlined here: https://github.com/psf/black/issues/2964
---
 setup.py                                  | 13 ++++-
 src/buildstream_plugins/sources/bzr.py    | 43 +++++++++++---
 src/buildstream_plugins/sources/cargo.py  | 34 +++++++++--
 src/buildstream_plugins/sources/docker.py | 39 +++++++++----
 src/buildstream_plugins/sources/git.py    | 96 ++++++++++++++++++++++++++-----
 src/buildstream_plugins/sources/patch.py  | 12 +++-
 src/buildstream_plugins/sources/pip.py    |  5 +-
 tests/conftest.py                         |  5 +-
 tests/elements/cmake.py                   | 10 +++-
 tests/elements/make.py                    |  5 +-
 tests/elements/meson.py                   | 10 +++-
 tests/elements/pip.py                     | 26 +++++++--
 tests/elements/setuptools.py              |  5 +-
 tests/sources/git.py                      |  5 +-
 tests/sources/patch.py                    |  5 +-
 tests/sources/pip.py                      | 31 ++++++++--
 tests/sources/pip_build.py                | 25 ++++++--
 tests/testutils/python_repo.py            | 14 ++++-
 tests/testutils/repo/bzrrepo.py           | 12 +++-
 tests/testutils/repo/gitrepo.py           | 14 ++++-
 tox.ini                                   |  4 +-
 21 files changed, 337 insertions(+), 76 deletions(-)
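
Most of the hunks below are mechanical: black 20 and later treats a trailing comma as "magic" and explodes such calls onto one argument per line. Taking one case from tests/elements/make.py in this very diff as an illustration:

    # Formatted by black 19.10b0:
    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)

    # Reformatted by black 22.3.0 (magic trailing comma):
    result = cli.run(
        project=project,
        args=["artifact", "checkout", element_name, "--directory", checkout],
    )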

diff --git a/setup.py b/setup.py
index a988e40..44d1b23 100755
--- a/setup.py
+++ b/setup.py
@@ -31,7 +31,10 @@ except ImportError:
 ###############################################################################
 #                             Parse README                                    #
 ###############################################################################
-with open(os.path.join(os.path.dirname(os.path.realpath(__file__)), "README.rst"), encoding="utf-8",) as readme:
+with open(
+    os.path.join(os.path.dirname(os.path.realpath(__file__)), "README.rst"),
+    encoding="utf-8",
+) as readme:
     long_description = readme.read()
 
 
@@ -57,7 +60,9 @@ setup(
     long_description_content_type="text/x-rst; charset=UTF-8",
     license="Apache License Version 2.0",
     url="https://buildstream.build",
-    project_urls={"Documentation": "https://docs.buildstream.build/",},
+    project_urls={
+        "Documentation": "https://docs.buildstream.build/",
+    },
     package_dir={"": "src"},
     packages=find_packages(where="src"),
     include_package_data=True,
@@ -80,7 +85,9 @@ setup(
             "zip = buildstream_plugins.sources.zip",
         ],
     },
-    extras_require={"cargo": ["toml"],},
+    extras_require={
+        "cargo": ["toml"],
+    },
     zip_safe=False,
 )
 # eof setup()
diff --git a/src/buildstream_plugins/sources/bzr.py b/src/buildstream_plugins/sources/bzr.py
index c768b35..b7de232 100644
--- a/src/buildstream_plugins/sources/bzr.py
+++ b/src/buildstream_plugins/sources/bzr.py
@@ -100,7 +100,13 @@ class BzrSource(Source):
         with self.timed_activity("Tracking {}".format(self.url), silent_nested=True), self._locked():
             self._ensure_mirror(skip_ref_check=True)
             ret, out = self.check_output(
-                [self.host_bzr, "version-info", "--custom", "--template={revno}", self._get_branch_dir(),],
+                [
+                    self.host_bzr,
+                    "version-info",
+                    "--custom",
+                    "--template={revno}",
+                    self._get_branch_dir(),
+                ],
                 fail="Failed to read the revision number at '{}'".format(self._get_branch_dir()),
             )
             if ret != 0:
@@ -148,7 +154,12 @@ class BzrSource(Source):
             )
             # Switch the parent branch to the source's origin
             self.call(
-                [self.host_bzr, "switch", "--directory={}".format(directory), url,],
+                [
+                    self.host_bzr,
+                    "switch",
+                    "--directory={}".format(directory),
+                    url,
+                ],
                 fail="Failed to switch workspace's parent branch to {}".format(url),
             )
 
@@ -175,21 +186,33 @@ class BzrSource(Source):
             return False
 
         return (
-            self.call([self.host_bzr, "revno", "--revision=revno:{}".format(self.ref), self._get_branch_dir(),]) == 0
+            self.call(
+                [
+                    self.host_bzr,
+                    "revno",
+                    "--revision=revno:{}".format(self.ref),
+                    self._get_branch_dir(),
+                ]
+            )
+            == 0
         )
 
     def _get_branch_dir(self):
         return os.path.join(self._get_mirror_dir(), self.tracking)
 
     def _get_mirror_dir(self):
-        return os.path.join(self.get_mirror_directory(), utils.url_directory_name(self.original_url),)
+        return os.path.join(
+            self.get_mirror_directory(),
+            utils.url_directory_name(self.original_url),
+        )
 
     def _ensure_mirror(self, skip_ref_check=False):
         mirror_dir = self._get_mirror_dir()
         bzr_metadata_dir = os.path.join(mirror_dir, ".bzr")
         if not os.path.exists(bzr_metadata_dir):
             self.call(
-                [self.host_bzr, "init-repo", "--no-trees", mirror_dir], fail="Failed to initialize bzr repository",
+                [self.host_bzr, "init-repo", "--no-trees", mirror_dir],
+                fail="Failed to initialize bzr repository",
             )
 
         branch_dir = os.path.join(mirror_dir, self.tracking)
@@ -206,13 +229,19 @@ class BzrSource(Source):
             # `bzr pull` the branch if it does exist
             # to get any changes to the upstream code
             self.call(
-                [self.host_bzr, "pull", "--directory={}".format(branch_dir), branch_url,],
+                [
+                    self.host_bzr,
+                    "pull",
+                    "--directory={}".format(branch_dir),
+                    branch_url,
+                ],
                 fail="Failed to pull new changes for {}".format(branch_dir),
             )
 
         if not skip_ref_check and not self._check_ref():
             raise SourceError(
-                "Failed to ensure ref '{}' was mirrored".format(self.ref), reason="ref-not-mirrored",
+                "Failed to ensure ref '{}' was mirrored".format(self.ref),
+                reason="ref-not-mirrored",
             )
 
 
diff --git a/src/buildstream_plugins/sources/cargo.py b/src/buildstream_plugins/sources/cargo.py
index bebc305..705f2a1 100644
--- a/src/buildstream_plugins/sources/cargo.py
+++ b/src/buildstream_plugins/sources/cargo.py
@@ -234,10 +234,20 @@ class Crate(SourceFetcher):
                 # Because we use etag only for matching sha, currently specified sha is what
                 # we would have downloaded.
                 return self.sha
-            raise SourceError("{}: Error mirroring {}: {}".format(self, url, e), temporary=True,) from e
+            raise SourceError(
+                "{}: Error mirroring {}: {}".format(self, url, e),
+                temporary=True,
+            ) from e
 
-        except (urllib.error.URLError, urllib.error.ContentTooShortError, OSError,) as e:
-            raise SourceError("{}: Error mirroring {}: {}".format(self, url, e), temporary=True,) from e
+        except (
+            urllib.error.URLError,
+            urllib.error.ContentTooShortError,
+            OSError,
+        ) as e:
+            raise SourceError(
+                "{}: Error mirroring {}: {}".format(self, url, e),
+                temporary=True,
+            ) from e
 
     # _get_url()
     #
@@ -292,7 +302,10 @@ class Crate(SourceFetcher):
     #
     def _get_mirror_dir(self):
         return os.path.join(
-            self.cargo.get_mirror_directory(), utils.url_directory_name(self.cargo.url), self.name, self.version,
+            self.cargo.get_mirror_directory(),
+            utils.url_directory_name(self.cargo.url),
+            self.name,
+            self.version,
         )
 
     # _get_mirror_file()
@@ -365,7 +378,8 @@ class CargoSource(Source):
                     lock = toml.load(f)
                 except toml.TomlDecodeError as e:
                     raise SourceError(
-                        "Malformed Cargo.lock file at: {}".format(self.cargo_lock), detail="{}".format(e),
+                        "Malformed Cargo.lock file at: {}".format(self.cargo_lock),
+                        detail="{}".format(e),
                     ) from e
         except FileNotFoundError as e:
             raise SourceError(
@@ -435,7 +449,15 @@ class CargoSource(Source):
         if refs is None:
             return []
 
-        return [Crate(self, crate["name"], crate["version"], sha=crate.get("sha", None),) for crate in refs]
+        return [
+            Crate(
+                self,
+                crate["name"],
+                crate["version"],
+                sha=crate.get("sha", None),
+            )
+            for crate in refs
+        ]
 
 
 def setup():
diff --git a/src/buildstream_plugins/sources/docker.py b/src/buildstream_plugins/sources/docker.py
index 4b2afde..eeaa1cf 100644
--- a/src/buildstream_plugins/sources/docker.py
+++ b/src/buildstream_plugins/sources/docker.py
@@ -207,7 +207,11 @@ class DockerRegistryV2Client:
     # Returns:
     #    (str, str): A tuple of the manifest content as text, and its content hash
     def manifest(
-        self, image_path, reference, architecture=default_architecture(), os_=default_os(),
+        self,
+        image_path,
+        reference,
+        architecture=default_architecture(),
+        os_=default_os(),
     ):
         # pylint: disable=too-many-locals
 
@@ -223,7 +227,8 @@ class DockerRegistryV2Client:
             manifest = json.loads(response.text)
         except json.JSONDecodeError as e:
             raise DockerManifestError(
-                "Server did not return a valid manifest: {}".format(e), manifest=response.text,
+                "Server did not return a valid manifest: {}".format(e),
+                manifest=response.text,
             ) from e
 
         schema_version = manifest.get("schemaVersion")
@@ -231,7 +236,8 @@ class DockerRegistryV2Client:
             raise DockerManifestError("Schema version 1 is unsupported.", manifest=response.text)
         if schema_version is None:
             raise DockerManifestError(
-                "Manifest did not include the schemaVersion key.", manifest=response.text,
+                "Manifest did not include the schemaVersion key.",
+                manifest=response.text,
             )
 
         our_digest = self.digest(response.text.encode("utf8"))
@@ -239,7 +245,8 @@ class DockerRegistryV2Client:
 
         if not their_digest:
             raise DockerManifestError(
-                "Server did not set the Docker-Content-Digest header.", manifest=response.text,
+                "Server did not set the Docker-Content-Digest header.",
+                manifest=response.text,
             )
         if our_digest != their_digest:
             raise DockerManifestError(
@@ -254,16 +261,23 @@ class DockerRegistryV2Client:
             for sub in manifest["manifests"]:
                 if sub["platform"]["architecture"] == architecture and sub["platform"]["os"]:
                     sub_digest = sub["digest"]
-                    return self.manifest(image_path, sub_digest, architecture=architecture, os_=os_,)
+                    return self.manifest(
+                        image_path,
+                        sub_digest,
+                        architecture=architecture,
+                        os_=os_,
+                    )
                 else:
                     raise DockerManifestError(
-                        "No images found for architecture {}, OS {}".format(architecture, os_), manifest=response.text,
+                        "No images found for architecture {}, OS {}".format(architecture, os_),
+                        manifest=response.text,
                     )
         elif manifest["mediaType"] == "application/vnd.docker.distribution.manifest.v2+json":
             return response.text, our_digest
         else:
             raise DockerManifestError(
-                "Unsupported manifest type {}".format(manifest["mediaType"]), manifest=response.text,
+                "Unsupported manifest type {}".format(manifest["mediaType"]),
+                manifest=response.text,
             )
 
     # blob():
@@ -440,7 +454,8 @@ class DockerSource(Source):
         # pylint: disable=arguments-differ
 
         with self.timed_activity(
-            "Fetching image {}:{} with digest {}".format(self.image, self.tag, self.digest), silent_nested=True,
+            "Fetching image {}:{} with digest {}".format(self.image, self.tag, self.digest),
+            silent_nested=True,
         ):
             with self.tempdir() as tmpdir:
                 # move all files to a tmpdir
@@ -487,7 +502,8 @@ class DockerSource(Source):
                 # a flat mirror directory. We check one-by-one if there is any need to copy a file out of the tmpdir.
                 for fetched_file in os.listdir(tmpdir):
                     move_atomic(
-                        os.path.join(tmpdir, fetched_file), os.path.join(self.get_mirror_directory(), fetched_file),
+                        os.path.join(tmpdir, fetched_file),
+                        os.path.join(self.get_mirror_directory(), fetched_file),
                     )
 
     def stage(self, directory):
@@ -504,7 +520,10 @@ class DockerSource(Source):
                 blob_path = os.path.join(mirror_dir, layer_digest + ".tar.gz")
 
                 self._verify_blob(blob_path, expected_digest=layer_digest)
-                (extract_fileset, white_out_fileset,) = self._get_extract_and_remove_files(blob_path)
+                (
+                    extract_fileset,
+                    white_out_fileset,
+                ) = self._get_extract_and_remove_files(blob_path)
 
                 # remove files associated with whiteouts
                 for white_out_file in white_out_fileset:
diff --git a/src/buildstream_plugins/sources/git.py b/src/buildstream_plugins/sources/git.py
index 5d7603d..decd2ec 100644
--- a/src/buildstream_plugins/sources/git.py
+++ b/src/buildstream_plugins/sources/git.py
@@ -226,7 +226,8 @@ class GitMirror(SourceFetcher):
         if not os.path.exists(self.mirror):
             with self.source.tempdir() as tmpdir:
                 self.source.call(
-                    [self.source.host_git, "init", "--bare", tmpdir], fail="Failed to initialise repository",
+                    [self.source.host_git, "init", "--bare", tmpdir],
+                    fail="Failed to initialise repository",
                 )
 
                 try:
@@ -374,7 +375,15 @@ class GitMirror(SourceFetcher):
     #
     def describe(self, rev):
         _, output = self.source.check_output(
-            [self.source.host_git, "describe", "--tags", "--abbrev=40", "--long", "--always", rev,],
+            [
+                self.source.host_git,
+                "describe",
+                "--tags",
+                "--abbrev=40",
+                "--long",
+                "--always",
+                rev,
+            ],
             fail="Unable to find revision {}".format(rev),
             cwd=self.mirror,
         )
@@ -400,7 +409,14 @@ class GitMirror(SourceFetcher):
             ["--tags", "--first-parent"],
         ]:
             exit_code, output = self.source.check_output(
-                [self.source.host_git, "describe", "--abbrev=0", rev, *options,], cwd=self.mirror,
+                [
+                    self.source.host_git,
+                    "describe",
+                    "--abbrev=0",
+                    rev,
+                    *options,
+                ],
+                cwd=self.mirror,
             )
             if exit_code == 0:
                 tag = output.strip()
@@ -409,7 +425,10 @@ class GitMirror(SourceFetcher):
                     fail="Unable to resolve tag '{}'".format(tag),
                     cwd=self.mirror,
                 )
-                exit_code = self.source.call([self.source.host_git, "cat-file", "tag", tag], cwd=self.mirror,)
+                exit_code = self.source.call(
+                    [self.source.host_git, "cat-file", "tag", tag],
+                    cwd=self.mirror,
+                )
                 annotated = exit_code == 0
 
                 tags.add((tag, commit_ref.strip(), annotated))
@@ -423,7 +442,14 @@ class GitMirror(SourceFetcher):
         # case we're just checking out a specific commit and then removing the .git/
         # directory.
         self.source.call(
-            [self.source.host_git, "clone", "--no-checkout", "--shared", self.mirror, fullpath,],
+            [
+                self.source.host_git,
+                "clone",
+                "--no-checkout",
+                "--shared",
+                self.mirror,
+                fullpath,
+            ],
             fail="Failed to create git mirror {} in directory: {}".format(self.mirror, fullpath),
             fail_temporarily=True,
         )
@@ -444,7 +470,13 @@ class GitMirror(SourceFetcher):
         url = self.source.translate_url(self.url)
 
         self.source.call(
-            [self.source.host_git, "clone", "--no-checkout", self.mirror, fullpath,],
+            [
+                self.source.host_git,
+                "clone",
+                "--no-checkout",
+                self.mirror,
+                fullpath,
+            ],
             fail="Failed to clone git mirror {} in directory: {}".format(self.mirror, fullpath),
             fail_temporarily=True,
         )
@@ -596,13 +628,24 @@ class GitMirror(SourceFetcher):
                     )
                     commit_file.seek(0, 0)
                     self.source.call(
-                        [self.source.host_git, "hash-object", "-w", "-t", "commit", "--stdin",],
+                        [
+                            self.source.host_git,
+                            "hash-object",
+                            "-w",
+                            "-t",
+                            "commit",
+                            "--stdin",
+                        ],
                         stdin=commit_file,
                         fail="Failed to add commit object {}".format(rev),
                         cwd=fullpath,
                     )
 
-            with open(os.path.join(fullpath, ".git", "shallow"), "w", encoding="utf-8",) as shallow_file:
+            with open(
+                os.path.join(fullpath, ".git", "shallow"),
+                "w",
+                encoding="utf-8",
+            ) as shallow_file:
                 for rev in shallow:
                     shallow_file.write("{}\n".format(rev))
 
@@ -613,7 +656,14 @@ class GitMirror(SourceFetcher):
                         tag_file.write(tag_data.encode("ascii"))
                         tag_file.seek(0, 0)
                         _, tag_ref = self.source.check_output(
-                            [self.source.host_git, "hash-object", "-w", "-t", "tag", "--stdin",],
+                            [
+                                self.source.host_git,
+                                "hash-object",
+                                "-w",
+                                "-t",
+                                "tag",
+                                "--stdin",
+                            ],
                             stdin=tag_file,
                             fail="Failed to add tag object {}".format(tag),
                             cwd=fullpath,
@@ -677,7 +727,8 @@ class GitSource(Source):
         # If it is missing both then we will be unable to track or build.
         if self.mirror.ref is None and self.tracking is None:
             raise SourceError(
-                "{}: Git sources require a ref and/or track".format(self), reason="missing-track-and-ref",
+                "{}: Git sources require a ref and/or track".format(self),
+                reason="missing-track-and-ref",
             )
 
         self.checkout_submodules = node.get_bool("checkout-submodules", default=True)
@@ -791,14 +842,17 @@ class GitSource(Source):
             if self.mirror.ref is None:
                 detail = "Without a tracking branch ref can not be updated. Please " + "provide a ref or a track."
                 raise SourceError(
-                    "{}: No track or ref".format(self), detail=detail, reason="track-attempt-no-track",
+                    "{}: No track or ref".format(self),
+                    detail=detail,
+                    reason="track-attempt-no-track",
                 )
             return None
 
         # Resolve the URL for the message
         resolved_url = self.translate_url(self.mirror.url)
         with self.timed_activity(
-            "Tracking {} from {}".format(self.tracking, resolved_url), silent_nested=True,
+            "Tracking {} from {}".format(self.tracking, resolved_url),
+            silent_nested=True,
         ):
             self.mirror._fetch(resolved_url, fetch_all=True)
 
@@ -884,14 +938,28 @@ class GitSource(Source):
         ref_in_track = False
         if not re.match(EXACT_TAG_PATTERN, self.mirror.ref) and self.tracking:
             _, branch = self.check_output(
-                [self.host_git, "branch", "--list", self.tracking, "--contains", self.mirror.ref,],
+                [
+                    self.host_git,
+                    "branch",
+                    "--list",
+                    self.tracking,
+                    "--contains",
+                    self.mirror.ref,
+                ],
                 cwd=self.mirror.mirror,
             )
             if branch:
                 ref_in_track = True
             else:
                 _, tag = self.check_output(
-                    [self.host_git, "tag", "--list", self.tracking, "--contains", self.mirror.ref,],
+                    [
+                        self.host_git,
+                        "tag",
+                        "--list",
+                        self.tracking,
+                        "--contains",
+                        self.mirror.ref,
+                    ],
                     cwd=self.mirror.mirror,
                 )
                 if tag:
diff --git a/src/buildstream_plugins/sources/patch.py b/src/buildstream_plugins/sources/patch.py
index 2228723..e664005 100644
--- a/src/buildstream_plugins/sources/patch.py
+++ b/src/buildstream_plugins/sources/patch.py
@@ -95,12 +95,20 @@ class PatchSource(Source):
             # Bail out with a comprehensive message if the target directory is empty
             if not os.listdir(directory):
                 raise SourceError(
-                    "Nothing to patch in directory '{}'".format(directory), reason="patch-no-files",
+                    "Nothing to patch in directory '{}'".format(directory),
+                    reason="patch-no-files",
                 )
 
             strip_level_option = "-p{}".format(self.strip_level)
             self.call(
-                [self.host_patch, strip_level_option, "-i", self.fullpath, "-d", directory,],
+                [
+                    self.host_patch,
+                    strip_level_option,
+                    "-i",
+                    self.fullpath,
+                    "-d",
+                    directory,
+                ],
                 fail="Failed to apply patch {}".format(self.path),
             )
 
diff --git a/src/buildstream_plugins/sources/pip.py b/src/buildstream_plugins/sources/pip.py
index 21e26a0..32384af 100644
--- a/src/buildstream_plugins/sources/pip.py
+++ b/src/buildstream_plugins/sources/pip.py
@@ -92,7 +92,10 @@ _PYTHON_VERSIONS = [
 # https://docs.python.org/3/distutils/sourcedist.html.
 # Names of source distribution archives must be of the form
 # '%{package-name}-%{version}.%{extension}'.
-_SDIST_RE = re.compile(r"^([\w.-]+?)-((?:[\d.]+){2,})\.(?:tar|tar.bz2|tar.gz|tar.xz|tar.Z|zip)$", re.IGNORECASE,)
+_SDIST_RE = re.compile(
+    r"^([\w.-]+?)-((?:[\d.]+){2,})\.(?:tar|tar.bz2|tar.gz|tar.xz|tar.Z|zip)$",
+    re.IGNORECASE,
+)
 
 
 class PipSource(Source):
diff --git a/tests/conftest.py b/tests/conftest.py
index 833c9bb..9f30476 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -11,7 +11,10 @@ from .testutils.repo import Bzr, Git
 #################################################
 def pytest_addoption(parser):
     parser.addoption(
-        "--integration", action="store_true", default=False, help="Run integration tests",
+        "--integration",
+        action="store_true",
+        default=False,
+        help="Run integration tests",
     )
 
 
diff --git a/tests/elements/cmake.py b/tests/elements/cmake.py
index a409ac1..1cd9230 100644
--- a/tests/elements/cmake.py
+++ b/tests/elements/cmake.py
@@ -25,7 +25,10 @@ def test_cmake_build(cli, datafiles):
     result = cli.run(project=project, args=["build", element_name])
     assert result.exit_code == 0
 
-    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    result = cli.run(
+        project=project,
+        args=["artifact", "checkout", element_name, "--directory", checkout],
+    )
     assert result.exit_code == 0
 
     assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
@@ -41,7 +44,10 @@ def test_cmake_confroot_build(cli, datafiles):
     result = cli.run(project=project, args=["build", element_name])
     assert result.exit_code == 0
 
-    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    result = cli.run(
+        project=project,
+        args=["artifact", "checkout", element_name, "--directory", checkout],
+    )
     assert result.exit_code == 0
 
     assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
diff --git a/tests/elements/make.py b/tests/elements/make.py
index 4edb513..15923d7 100644
--- a/tests/elements/make.py
+++ b/tests/elements/make.py
@@ -27,7 +27,10 @@ def test_make_build(cli, datafiles):
     result = cli.run(project=project, args=["build", element_name])
     assert result.exit_code == 0
 
-    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    result = cli.run(
+        project=project,
+        args=["artifact", "checkout", element_name, "--directory", checkout],
+    )
     assert result.exit_code == 0
 
     assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
diff --git a/tests/elements/meson.py b/tests/elements/meson.py
index ac11fab..46328d3 100644
--- a/tests/elements/meson.py
+++ b/tests/elements/meson.py
@@ -25,7 +25,10 @@ def test_meson_build(cli, datafiles):
     result = cli.run(project=project, args=["build", element_name])
     assert result.exit_code == 0
 
-    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    result = cli.run(
+        project=project,
+        args=["artifact", "checkout", element_name, "--directory", checkout],
+    )
     assert result.exit_code == 0
 
     assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
@@ -41,7 +44,10 @@ def test_meson_confroot_build(cli, datafiles):
     result = cli.run(project=project, args=["build", element_name])
     assert result.exit_code == 0
 
-    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    result = cli.run(
+        project=project,
+        args=["artifact", "checkout", element_name, "--directory", checkout],
+    )
     assert result.exit_code == 0
 
     assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
diff --git a/tests/elements/pip.py b/tests/elements/pip.py
index 1664063..700b9ca 100644
--- a/tests/elements/pip.py
+++ b/tests/elements/pip.py
@@ -42,18 +42,29 @@ def test_pip_build(cli, datafiles):
         ],
     }
     os.makedirs(
-        os.path.dirname(os.path.join(element_path, element_name)), exist_ok=True,
+        os.path.dirname(os.path.join(element_path, element_name)),
+        exist_ok=True,
     )
     _yaml.roundtrip_dump(element, os.path.join(element_path, element_name))
 
     result = cli.run(project=project, args=["build", element_name])
     assert result.exit_code == 0
 
-    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    result = cli.run(
+        project=project,
+        args=["artifact", "checkout", element_name, "--directory", checkout],
+    )
     assert result.exit_code == 0
 
     assert_contains(
-        checkout, ["/usr", "/usr/lib", "/usr/bin", "/usr/bin/hello", "/usr/lib/python3.6",],
+        checkout,
+        [
+            "/usr",
+            "/usr/lib",
+            "/usr/bin",
+            "/usr/bin/hello",
+            "/usr/lib/python3.6",
+        ],
     )
 
 
@@ -96,7 +107,8 @@ def test_pip_element_should_install_pip_deps(cli, datafiles, setup_pypi_repo):
     pypi_repo = os.path.join(project, "files", "pypi-repo")
     os.makedirs(pypi_repo, exist_ok=True)
     os.makedirs(
-        os.path.dirname(os.path.join(elements_path, element_name)), exist_ok=True,
+        os.path.dirname(os.path.join(elements_path, element_name)),
+        exist_ok=True,
     )
     setup_pypi_repo(mock_packages, pypi_repo)
 
@@ -112,7 +124,11 @@ def test_pip_element_should_install_pip_deps(cli, datafiles, setup_pypi_repo):
                 # FIXME: remove hardcoded ref once issue #1010 is closed
                 "ref": "ad96570b552498807abec33c06210bf68378d854ced6753b77916c5ed517610d",
             },
-            {"kind": "pip", "url": "file://{}".format(os.path.realpath(pypi_repo)), "packages": [myreqs_packages],},
+            {
+                "kind": "pip",
+                "url": "file://{}".format(os.path.realpath(pypi_repo)),
+                "packages": [myreqs_packages],
+            },
         ],
     }
     _yaml.roundtrip_dump(element, os.path.join(elements_path, element_name))
diff --git a/tests/elements/setuptools.py b/tests/elements/setuptools.py
index 837aa2b..1580e4d 100644
--- a/tests/elements/setuptools.py
+++ b/tests/elements/setuptools.py
@@ -25,7 +25,10 @@ def test_setuptools_build(cli, datafiles):
     result = cli.run(project=project, args=["build", element_name])
     assert result.exit_code == 0
 
-    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    result = cli.run(
+        project=project,
+        args=["artifact", "checkout", element_name, "--directory", checkout],
+    )
     assert result.exit_code == 0
 
     assert_contains(checkout, ["/usr", "/usr/bin", "/usr/bin/hello"])
diff --git a/tests/sources/git.py b/tests/sources/git.py
index beab0e6..385ea31 100644
--- a/tests/sources/git.py
+++ b/tests/sources/git.py
@@ -37,7 +37,10 @@ from buildstream._testing import create_repo
 
 from tests.testutils.site import HAVE_GIT, HAVE_OLD_GIT
 
-DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "git",)
+DATA_DIR = os.path.join(
+    os.path.dirname(os.path.realpath(__file__)),
+    "git",
+)
 
 
 @pytest.mark.skipif(HAVE_GIT is False, reason="git is not available")
diff --git a/tests/sources/patch.py b/tests/sources/patch.py
index 91848f1..4d1515a 100644
--- a/tests/sources/patch.py
+++ b/tests/sources/patch.py
@@ -8,7 +8,10 @@ import pytest
 from buildstream.exceptions import ErrorDomain, LoadErrorReason
 from buildstream._testing import cli  # pylint: disable=unused-import
 
-DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "patch",)
+DATA_DIR = os.path.join(
+    os.path.dirname(os.path.realpath(__file__)),
+    "patch",
+)
 
 
 # generate_file_types()
diff --git a/tests/sources/pip.py b/tests/sources/pip.py
index bf30b20..c316296 100644
--- a/tests/sources/pip.py
+++ b/tests/sources/pip.py
@@ -9,7 +9,10 @@ from buildstream.exceptions import ErrorDomain
 from buildstream._testing import cli  # pylint: disable=unused-import
 from buildstream_plugins.sources.pip import _match_package_name
 
-DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "pip",)
+DATA_DIR = os.path.join(
+    os.path.dirname(os.path.realpath(__file__)),
+    "pip",
+)
 
 
 def generate_project(project_dir):
@@ -18,7 +21,13 @@ def generate_project(project_dir):
         {
             "name": "foo",
             "min-version": "2.0",
-            "plugins": [{"origin": "pip", "package-name": "buildstream-plugins", "sources": ["pip"],}],
+            "plugins": [
+                {
+                    "origin": "pip",
+                    "package-name": "buildstream-plugins",
+                    "sources": ["pip"],
+                }
+            ],
         },
         project_file,
     )
@@ -59,9 +68,21 @@ def test_no_packages(cli, datafiles):
         ("hyphenated-package-2.6.0.tar.gz", "hyphenated-package", "2.6.0"),
         ("underscore_pkg-3.1.0.tar.gz", "underscore_pkg", "3.1.0"),
         ("numbers2and5-1.0.1.tar.gz", "numbers2and5", "1.0.1"),
-        ("multiple.dots.package-5.6.7.tar.gz", "multiple.dots.package", "5.6.7",),
-        ("multiple-hyphens-package-1.2.3.tar.gz", "multiple-hyphens-package", "1.2.3",),
-        ("multiple_underscore_pkg-3.4.5.tar.gz", "multiple_underscore_pkg", "3.4.5",),
+        (
+            "multiple.dots.package-5.6.7.tar.gz",
+            "multiple.dots.package",
+            "5.6.7",
+        ),
+        (
+            "multiple-hyphens-package-1.2.3.tar.gz",
+            "multiple-hyphens-package",
+            "1.2.3",
+        ),
+        (
+            "multiple_underscore_pkg-3.4.5.tar.gz",
+            "multiple_underscore_pkg",
+            "3.4.5",
+        ),
         ("shortversion-1.0.tar.gz", "shortversion", "1.0"),
         ("longversion-1.2.3.4.tar.gz", "longversion", "1.2.3.4"),
     ],
diff --git a/tests/sources/pip_build.py b/tests/sources/pip_build.py
index 35333c9..27084d9 100644
--- a/tests/sources/pip_build.py
+++ b/tests/sources/pip_build.py
@@ -49,11 +49,16 @@ def test_pip_source_import_packages(cli, datafiles, setup_pypi_repo):
         "kind": "import",
         "sources": [
             {"kind": "local", "path": "files/pip-source"},
-            {"kind": "pip", "url": "file://{}".format(os.path.realpath(pypi_repo)), "packages": [myreqs_packages],},
+            {
+                "kind": "pip",
+                "url": "file://{}".format(os.path.realpath(pypi_repo)),
+                "packages": [myreqs_packages],
+            },
         ],
     }
     os.makedirs(
-        os.path.dirname(os.path.join(element_path, element_name)), exist_ok=True,
+        os.path.dirname(os.path.join(element_path, element_name)),
+        exist_ok=True,
     )
     _yaml.roundtrip_dump(element, os.path.join(element_path, element_name))
 
@@ -63,7 +68,10 @@ def test_pip_source_import_packages(cli, datafiles, setup_pypi_repo):
     result = cli.run(project=project, args=["build", element_name])
     assert result.exit_code == 0
 
-    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    result = cli.run(
+        project=project,
+        args=["artifact", "checkout", element_name, "--directory", checkout],
+    )
     assert result.exit_code == 0
 
     assert_contains(
@@ -119,7 +127,8 @@ def test_pip_source_import_requirements_files(cli, datafiles, setup_pypi_repo):
         ],
     }
     os.makedirs(
-        os.path.dirname(os.path.join(element_path, element_name)), exist_ok=True,
+        os.path.dirname(os.path.join(element_path, element_name)),
+        exist_ok=True,
     )
     _yaml.roundtrip_dump(element, os.path.join(element_path, element_name))
 
@@ -129,7 +138,10 @@ def test_pip_source_import_requirements_files(cli, datafiles, setup_pypi_repo):
     result = cli.run(project=project, args=["build", element_name])
     assert result.exit_code == 0
 
-    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    result = cli.run(
+        project=project,
+        args=["artifact", "checkout", element_name, "--directory", checkout],
+    )
     assert result.exit_code == 0
 
     assert_contains(
@@ -193,7 +205,8 @@ def test_pip_source_build(cli, datafiles, setup_pypi_repo):
         },
     }
     os.makedirs(
-        os.path.dirname(os.path.join(element_path, element_name)), exist_ok=True,
+        os.path.dirname(os.path.join(element_path, element_name)),
+        exist_ok=True,
     )
     _yaml.roundtrip_dump(element, os.path.join(element_path, element_name))
 
diff --git a/tests/testutils/python_repo.py b/tests/testutils/python_repo.py
index 52dd8a6..870de0c 100644
--- a/tests/testutils/python_repo.py
+++ b/tests/testutils/python_repo.py
@@ -76,7 +76,14 @@ def generate_pip_package(tmpdir, pypi, name, version="0.1", dependencies=None):
     setup_file = os.path.join(tmpdir, "setup.py")
     pkgdirname = re.sub("[^0-9a-zA-Z]+", "", name)
     with open(setup_file, "w", encoding="utf-8") as f:
-        f.write(SETUP_TEMPLATE.format(name=name, version=version, pkgdirname=pkgdirname, pkgdeps=dependencies,))
+        f.write(
+            SETUP_TEMPLATE.format(
+                name=name,
+                version=version,
+                pkgdirname=pkgdirname,
+                pkgdeps=dependencies,
+            )
+        )
     os.chmod(setup_file, 0o755)
 
     package = os.path.join(tmpdir, pkgdirname)
@@ -124,7 +131,10 @@ def setup_pypi_repo(tmpdir):
         for package, dependencies in packages.items():
             pkgdir = create_pkgdir(package)
             generate_pip_package(
-                pkgdir, pypi_repo, package, dependencies=list(dependencies.keys()),
+                pkgdir,
+                pypi_repo,
+                package,
+                dependencies=list(dependencies.keys()),
             )
             for dependency, dependency_dependencies in dependencies.items():
                 add_packages({dependency: dependency_dependencies}, pypi_repo)
diff --git a/tests/testutils/repo/bzrrepo.py b/tests/testutils/repo/bzrrepo.py
index 324244c..506dff1 100644
--- a/tests/testutils/repo/bzrrepo.py
+++ b/tests/testutils/repo/bzrrepo.py
@@ -30,7 +30,9 @@ class Bzr(Repo):
         self.copy_directory(directory, branch_dir)
         subprocess.call([self.bzr, "add", "."], env=self.env, cwd=branch_dir)
         subprocess.call(
-            [self.bzr, "commit", '--message="Initial commit"'], env=self.env, cwd=branch_dir,
+            [self.bzr, "commit", '--message="Initial commit"'],
+            env=self.env,
+            cwd=branch_dir,
         )
 
         return self.latest_commit()
@@ -48,7 +50,13 @@ class Bzr(Repo):
 
     def latest_commit(self):
         return subprocess.check_output(
-            [self.bzr, "version-info", "--custom", "--template={revno}", os.path.join(self.repo, "trunk"),],
+            [
+                self.bzr,
+                "version-info",
+                "--custom",
+                "--template={revno}",
+                os.path.join(self.repo, "trunk"),
+            ],
             env=self.env,
             universal_newlines=True,
         ).strip()
diff --git a/tests/testutils/repo/gitrepo.py b/tests/testutils/repo/gitrepo.py
index 9cb3c9a..a4bf6e4 100644
--- a/tests/testutils/repo/gitrepo.py
+++ b/tests/testutils/repo/gitrepo.py
@@ -95,7 +95,12 @@ class Git(Repo):
         return config
 
     def latest_commit(self):
-        return self._run_git("rev-parse", "HEAD", stdout=subprocess.PIPE, universal_newlines=True,).stdout.strip()
+        return self._run_git(
+            "rev-parse",
+            "HEAD",
+            stdout=subprocess.PIPE,
+            universal_newlines=True,
+        ).stdout.strip()
 
     def branch(self, branch_name):
         self._run_git("checkout", "-b", branch_name)
@@ -111,4 +116,9 @@ class Git(Repo):
         return self.latest_commit()
 
     def rev_parse(self, rev):
-        return self._run_git("rev-parse", rev, stdout=subprocess.PIPE, universal_newlines=True,).stdout.strip()
+        return self._run_git(
+            "rev-parse",
+            rev,
+            stdout=subprocess.PIPE,
+            universal_newlines=True,
+        ).stdout.strip()
diff --git a/tox.ini b/tox.ini
index 840c174..4a3f1ff 100644
--- a/tox.ini
+++ b/tox.ini
@@ -57,7 +57,7 @@ whitelist_externals =
 [testenv:format]
 skip_install = True
 deps =
-    black==19.10b0
+    black==22.3.0
 commands =
     black {posargs: src tests setup.py}
 
@@ -67,7 +67,7 @@ commands =
 [testenv:format-check]
 skip_install = True
 deps =
-    black==19.10b0
+    black==22.3.0
 commands =
     black --check --diff {posargs: src tests setup.py}
 


[buildstream-plugins] 33/49: tests/elements/pip.py: Adding tests for pip element

commit 8b030e9eb33cad067011db90a3da91fab35518bd
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 16:54:35 2022 +0900

    tests/elements/pip.py: Adding tests for pip element
---
 tests/elements/pip.py                             | 132 ++++++++++++++++++++++
 tests/elements/pip/elements/base.bst              |   3 +
 tests/elements/pip/elements/base/alpine-image.bst |   6 +
 tests/elements/pip/files/piphello.tar.xz          | Bin 0 -> 628 bytes
 tests/elements/pip/project.conf                   |  15 +++
 tests/testutils/python_repo.py                    | 132 ++++++++++++++++++++++
 6 files changed, 288 insertions(+)

diff --git a/tests/elements/pip.py b/tests/elements/pip.py
new file mode 100644
index 0000000..1664063
--- /dev/null
+++ b/tests/elements/pip.py
@@ -0,0 +1,132 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+
+import pytest
+
+from buildstream import _yaml
+
+from buildstream._testing import cli_integration as cli  # pylint: disable=unused-import
+from buildstream._testing.integration import assert_contains
+from buildstream._testing.integration import integration_cache  # pylint: disable=unused-import
+from buildstream._testing._utils.site import HAVE_SANDBOX
+
+from tests.testutils.python_repo import setup_pypi_repo  # pylint: disable=unused-import
+
+
+pytestmark = pytest.mark.integration
+
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "pip")
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_pip_build(cli, datafiles):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_path = os.path.join(project, "elements")
+    element_name = "pip/hello.bst"
+
+    element = {
+        "kind": "pip",
+        "variables": {"pip": "pip3"},
+        "depends": [{"filename": "base.bst"}],
+        "sources": [
+            {
+                "kind": "tar",
+                "url": "file://{}/files/piphello.tar.xz".format(project),
+                "ref": "ad96570b552498807abec33c06210bf68378d854ced6753b77916c5ed517610d",
+            }
+        ],
+    }
+    os.makedirs(
+        os.path.dirname(os.path.join(element_path, element_name)), exist_ok=True,
+    )
+    _yaml.roundtrip_dump(element, os.path.join(element_path, element_name))
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    assert result.exit_code == 0
+
+    assert_contains(
+        checkout, ["/usr", "/usr/lib", "/usr/bin", "/usr/bin/hello", "/usr/lib/python3.6",],
+    )
+
+
+# Test running an executable built with pip
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_pip_run(cli, datafiles):
+    # Create and build our test element
+    test_pip_build(cli, datafiles)
+
+    project = str(datafiles)
+    element_name = "pip/hello.bst"
+
+    result = cli.run(project=project, args=["shell", element_name, "/usr/bin/hello"])
+    assert result.exit_code == 0
+    assert result.output == "Hello, world!\n"
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_pip_element_should_install_pip_deps(cli, datafiles, setup_pypi_repo):
+    project = str(datafiles)
+    elements_path = os.path.join(project, "elements")
+    element_name = "pip/hello.bst"
+
+    # check that exotically named packages are imported correctly
+    myreqs_packages = "alohalib"
+    dependencies = [
+        "app2",
+        "app.3",
+        "app-4",
+        "app_5",
+        "app.no.6",
+        "app-no-7",
+        "app_no_8",
+    ]
+    mock_packages = {myreqs_packages: {package: {} for package in dependencies}}
+
+    # set up directories
+    pypi_repo = os.path.join(project, "files", "pypi-repo")
+    os.makedirs(pypi_repo, exist_ok=True)
+    os.makedirs(
+        os.path.dirname(os.path.join(elements_path, element_name)), exist_ok=True,
+    )
+    setup_pypi_repo(mock_packages, pypi_repo)
+
+    # create pip element
+    element = {
+        "kind": "pip",
+        "variables": {"pip": "pip3"},
+        "depends": [{"filename": "base.bst"}],
+        "sources": [
+            {
+                "kind": "tar",
+                "url": "file://{}/files/piphello.tar.xz".format(project),
+                # FIXME: remove hardcoded ref once issue #1010 is closed
+                "ref": "ad96570b552498807abec33c06210bf68378d854ced6753b77916c5ed517610d",
+            },
+            {"kind": "pip", "url": "file://{}".format(os.path.realpath(pypi_repo)), "packages": [myreqs_packages],},
+        ],
+    }
+    _yaml.roundtrip_dump(element, os.path.join(elements_path, element_name))
+
+    result = cli.run(project=project, args=["source", "track", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    # get installed packages in sandbox
+    installed_packages = set(
+        cli.run(project=project, args=["shell", element_name, "pip3", "freeze"]).output.split("\n")
+    )
+    # compare with packages that are expected to be installed
+    pip_source_packages = {package.replace("_", "-") + "==0.1" for package in dependencies + [myreqs_packages]}
+    assert pip_source_packages.issubset(installed_packages)
diff --git a/tests/elements/pip/elements/base.bst b/tests/elements/pip/elements/base.bst
new file mode 100644
index 0000000..da7c70b
--- /dev/null
+++ b/tests/elements/pip/elements/base.bst
@@ -0,0 +1,3 @@
+kind: stack
+depends:
+- base/alpine-image.bst
diff --git a/tests/elements/pip/elements/base/alpine-image.bst b/tests/elements/pip/elements/base/alpine-image.bst
new file mode 100644
index 0000000..f8e00ba
--- /dev/null
+++ b/tests/elements/pip/elements/base/alpine-image.bst
@@ -0,0 +1,6 @@
+kind: import
+description: Import an alpine image as the platform
+sources:
+- kind: tar
+  url: alpine:integration-tests-base.v1.x86_64.tar.xz
+  ref: 3eb559250ba82b64a68d86d0636a6b127aa5f6d25d3601a79f79214dc9703639
diff --git a/tests/elements/pip/files/piphello.tar.xz b/tests/elements/pip/files/piphello.tar.xz
new file mode 100644
index 0000000..72ec9b3
Binary files /dev/null and b/tests/elements/pip/files/piphello.tar.xz differ
diff --git a/tests/elements/pip/project.conf b/tests/elements/pip/project.conf
new file mode 100644
index 0000000..f04cd98
--- /dev/null
+++ b/tests/elements/pip/project.conf
@@ -0,0 +1,15 @@
+# test project config
+name: test
+min-version: 2.0
+
+element-path: elements
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  elements:
+  - pip
+
+aliases:
+  alpine: https://bst-integration-test-images.ams3.cdn.digitaloceanspaces.com/
+  project_dir: file://{project_dir}
diff --git a/tests/testutils/python_repo.py b/tests/testutils/python_repo.py
new file mode 100644
index 0000000..52dd8a6
--- /dev/null
+++ b/tests/testutils/python_repo.py
@@ -0,0 +1,132 @@
+import os
+import re
+import shutil
+import subprocess
+import sys
+
+import pytest
+
+
+SETUP_TEMPLATE = """\
+from setuptools import setup
+
+setup(
+    name='{name}',
+    version='{version}',
+    description='{name}',
+    packages=['{pkgdirname}'],
+    install_requires={pkgdeps},
+    entry_points={{
+        'console_scripts': [
+            '{pkgdirname}={pkgdirname}:main'
+        ]
+    }}
+)
+"""
+
+# All packages generated via generate_pip_package will have the functions below
+INIT_TEMPLATE = """\
+def main():
+    print('This is {name}')
+
+def hello(actor='world'):
+    print('Hello {{}}! This is {name}'.format(actor))
+"""
+
+HTML_TEMPLATE = """\
+<html>
+  <head>
+    <title>Links for {name}</title>
+  </head>
+  <body>
+    <a href='{name}-{version}.tar.gz'>{name}-{version}.tar.gz</a><br />
+  </body>
+</html>
+"""
+
+
+# Creates a simple python source distribution and copies this into a specified
+# directory which is to serve as a mock python repository
+#
+# Args:
+#    tmpdir (str): Directory in which the source files will be created
+#    pypi (str): Directory serving as a mock python repository
+#    name (str): The name of the package to be created
+#    version (str): The version of the package to be created
+#
+# Returns:
+#    None
+#
+def generate_pip_package(tmpdir, pypi, name, version="0.1", dependencies=None):
+    if dependencies is None:
+        dependencies = []
+    # check if package already exists in pypi
+    pypi_package = os.path.join(pypi, re.sub("[^0-9a-zA-Z]+", "-", name))
+    if os.path.exists(pypi_package):
+        return
+
+    # create the package source files in tmpdir resulting in a directory
+    # tree resembling the following structure:
+    #
+    # tmpdir
+    # |-- setup.py
+    # `-- package
+    #     `-- __init__.py
+    #
+    setup_file = os.path.join(tmpdir, "setup.py")
+    pkgdirname = re.sub("[^0-9a-zA-Z]+", "", name)
+    with open(setup_file, "w", encoding="utf-8") as f:
+        f.write(SETUP_TEMPLATE.format(name=name, version=version, pkgdirname=pkgdirname, pkgdeps=dependencies,))
+    os.chmod(setup_file, 0o755)
+
+    package = os.path.join(tmpdir, pkgdirname)
+    os.makedirs(package)
+
+    main_file = os.path.join(package, "__init__.py")
+    with open(main_file, "w", encoding="utf-8") as f:
+        f.write(INIT_TEMPLATE.format(name=name))
+    os.chmod(main_file, 0o644)
+
+    # Run sdist with a fresh process
+    subprocess.run([sys.executable, "setup.py", "sdist"], cwd=tmpdir, check=True)
+
+    # create directory for this package in pypi resulting in a directory
+    # tree resembling the following structure:
+    #
+    # pypi
+    # `-- pypi_package
+    #     |-- index.html
+    #     `-- foo-0.1.tar.gz
+    #
+    os.makedirs(pypi_package)
+
+    # add an index html page
+    index_html = os.path.join(pypi_package, "index.html")
+    with open(index_html, "w", encoding="utf-8") as f:
+        f.write(HTML_TEMPLATE.format(name=name, version=version))
+
+    # copy generated tarfile to pypi package
+    dist_dir = os.path.join(tmpdir, "dist")
+    for tar in os.listdir(dist_dir):
+        tarpath = os.path.join(dist_dir, tar)
+        shutil.copy(tarpath, pypi_package)
+
+
+@pytest.fixture
+def setup_pypi_repo(tmpdir):
+    def create_pkgdir(package):
+        pkgdirname = re.sub("[^0-9a-zA-Z]+", "", package)
+        pkgdir = os.path.join(str(tmpdir), pkgdirname)
+        os.makedirs(pkgdir)
+        return pkgdir
+
+    def add_packages(packages, pypi_repo):
+        for package, dependencies in packages.items():
+            pkgdir = create_pkgdir(package)
+            generate_pip_package(
+                pkgdir, pypi_repo, package, dependencies=list(dependencies.keys()),
+            )
+            for dependency, dependency_dependencies in dependencies.items():
+                add_packages({dependency: dependency_dependencies}, pypi_repo)
+
+    return add_packages
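
For orientation (not part of the commit), a test consuming the setup_pypi_repo fixture might populate a mock repository like this; the package names are hypothetical and the cli/datafiles fixtures come from buildstream._testing as in the tests above:

    import os

    from tests.testutils.python_repo import setup_pypi_repo  # pylint: disable=unused-import


    def test_example(cli, datafiles, setup_pypi_repo):
        project = str(datafiles)
        pypi_repo = os.path.join(project, "files", "pypi-repo")
        os.makedirs(pypi_repo, exist_ok=True)

        # "mypkg" depends on "mydep"; both are generated as sdists
        # with index.html pages under pypi_repo
        setup_pypi_repo({"mypkg": {"mydep": {}}}, pypi_repo)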


[buildstream-plugins] 44/49: tests/elements/{cmake,meson}: Use zip source from bst-plugins-experimental

commit 921d1a1f74ed4061bba06fefff6d2b48a929e54f
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Tue Apr 5 18:01:06 2022 +0900

    tests/elements/{cmake,meson}: Use zip source from bst-plugins-experimental
    
    We need the zip source in order to download and use ninja, but the zip source
    has been removed from buildstream and buildstream-plugins, so we use it from
    the latest release of bst-plugins-experimental.
    
    Also update tox.ini to use the latest version of buildstream
---
 tests/elements/cmake/elements/bst-plugins-experimental-junction.bst | 5 +++++
 tests/elements/cmake/project.conf                                   | 5 +++++
 tests/elements/meson/elements/bst-plugins-experimental-junction.bst | 5 +++++
 tests/elements/meson/project.conf                                   | 5 +++++
 tox.ini                                                             | 2 +-
 5 files changed, 21 insertions(+), 1 deletion(-)

diff --git a/tests/elements/cmake/elements/bst-plugins-experimental-junction.bst b/tests/elements/cmake/elements/bst-plugins-experimental-junction.bst
new file mode 100644
index 0000000..769de4b
--- /dev/null
+++ b/tests/elements/cmake/elements/bst-plugins-experimental-junction.bst
@@ -0,0 +1,5 @@
+kind: junction
+sources:
+- kind: tar
+  url: pypi:0c/dd/c2afff7697104f37fd67d98931c402153409bdd2b35442e088460c452f9d/bst-plugins-experimental-1.93.7.tar.gz
+  ref: 0646cf740cdc049c6343059816d36d2181d31aa0d1632107159c737a4332c83c
diff --git a/tests/elements/cmake/project.conf b/tests/elements/cmake/project.conf
index bdcf99b..bd8d972 100644
--- a/tests/elements/cmake/project.conf
+++ b/tests/elements/cmake/project.conf
@@ -9,7 +9,12 @@ plugins:
   package-name: buildstream-plugins
   elements:
   - cmake
+- origin: junction
+  junction: bst-plugins-experimental-junction.bst
+  sources:
+  - zip
 
 aliases:
   alpine: https://bst-integration-test-images.ams3.cdn.digitaloceanspaces.com/
+  pypi: https://files.pythonhosted.org/packages/
   project_dir: file://{project_dir}
diff --git a/tests/elements/meson/elements/bst-plugins-experimental-junction.bst b/tests/elements/meson/elements/bst-plugins-experimental-junction.bst
new file mode 100644
index 0000000..769de4b
--- /dev/null
+++ b/tests/elements/meson/elements/bst-plugins-experimental-junction.bst
@@ -0,0 +1,5 @@
+kind: junction
+sources:
+- kind: tar
+  url: pypi:0c/dd/c2afff7697104f37fd67d98931c402153409bdd2b35442e088460c452f9d/bst-plugins-experimental-1.93.7.tar.gz
+  ref: 0646cf740cdc049c6343059816d36d2181d31aa0d1632107159c737a4332c83c
diff --git a/tests/elements/meson/project.conf b/tests/elements/meson/project.conf
index 52ffbf0..9ca5151 100644
--- a/tests/elements/meson/project.conf
+++ b/tests/elements/meson/project.conf
@@ -10,7 +10,12 @@ plugins:
   elements:
   - meson
   - setuptools
+- origin: junction
+  junction: bst-plugins-experimental-junction.bst
+  sources:
+  - zip
 
 aliases:
   alpine: https://bst-integration-test-images.ams3.cdn.digitaloceanspaces.com/
+  pypi: https://files.pythonhosted.org/packages/
   project_dir: file://{project_dir}
diff --git a/tox.ini b/tox.ini
index 4a3f1ff..8b1cf96 100644
--- a/tox.ini
+++ b/tox.ini
@@ -43,7 +43,7 @@ setenv =
     py{37,38,39,310}: XDG_CACHE_HOME = {envtmpdir}/cache
     py{37,38,39,310}: XDG_CONFIG_HOME = {envtmpdir}/config
     py{37,38,39,310}: XDG_DATA_HOME = {envtmpdir}/share
-    !master: BST_VERSION = 10be3784afa924d5af63958fea9354d0323eadad
+    !master: BST_VERSION = 1a3c707a6c46573ab159de64ac9cd92e7f6027e6
     master: BST_VERSION = master
 
 whitelist_externals =

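With the junction and the origin: junction declaration in place, elements in these test projects can use the zip source kind as if it were a local plugin. A hypothetical element sketch; the file name is illustrative, and the ref would be resolved with `bst source track`:

    kind: import
    sources:
    - kind: zip
      url: project_dir:files/ninja-binary.zip
      # ref: a sha256, filled in by `bst source track`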

[buildstream-plugins] 46/49: tests/sources/git: Use locally defined git source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 5ade411781c47d9745f837776663bcdee6995f73
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Tue Apr 5 18:35:48 2022 +0900

    tests/sources/git: Use locally defined git source
    
    Instead of accidentally testing the buildstream core git source, which will
    be removed.
---
 tests/sources/git/project-override/project.conf | 7 +++++++
 tests/sources/git/template/project.conf         | 6 ++++++
 2 files changed, 13 insertions(+)

diff --git a/tests/sources/git/project-override/project.conf b/tests/sources/git/project-override/project.conf
index 01c9016..7ced79e 100644
--- a/tests/sources/git/project-override/project.conf
+++ b/tests/sources/git/project-override/project.conf
@@ -1,6 +1,13 @@
 # Basic project
 name: foo
 min-version: 2.0
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  sources:
+  - git
+
 sources:
   git:
     config:
diff --git a/tests/sources/git/template/project.conf b/tests/sources/git/template/project.conf
index dc34380..a92b56b 100644
--- a/tests/sources/git/template/project.conf
+++ b/tests/sources/git/template/project.conf
@@ -1,3 +1,9 @@
 # Basic project
 name: foo
 min-version: 2.0
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  sources:
+  - git

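With the origin: pip declaration above, these projects now resolve the git source kind from the installed buildstream-plugins package instead of the buildstream core. A sketch of an element consuming it; the repository URL is illustrative:

    kind: import
    sources:
    - kind: git
      url: https://example.com/repo.git
      track: master
      # ref: a commit sha, filled in by `bst source track`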

[buildstream-plugins] 10/49: Initially adding meson element

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 52c3ab049165e0b6970f63b333898a3c0b544479
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 17:15:55 2022 +0900

    Initially adding meson element
    
    From bst-plugins-experimental
---
 src/buildstream_plugins/elements/meson.py   | 66 ++++++++++++++++++++++++
 src/buildstream_plugins/elements/meson.yaml | 79 +++++++++++++++++++++++++++++
 2 files changed, 145 insertions(+)

diff --git a/src/buildstream_plugins/elements/meson.py b/src/buildstream_plugins/elements/meson.py
new file mode 100644
index 0000000..e9d1d8d
--- /dev/null
+++ b/src/buildstream_plugins/elements/meson.py
@@ -0,0 +1,66 @@
+#  Copyright (C) 2017 Patrick Griffis
+#  Copyright (C) 2018 Codethink Ltd.
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+
+"""
+meson - Meson build element
+===========================
+This is a `BuildElement
+<https://docs.buildstream.build/master/buildstream.buildelement.html#module-buildstream.buildelement>`_
+implementation for using `Meson <http://mesonbuild.com/>`_ build scripts.
+
+You will often want to pass additional arguments to ``meson``. This should
+be done on a per-element basis by setting the ``meson-local`` variable.  Here is
+an example:
+
+.. code:: yaml
+
+   variables:
+     meson-local: |
+       -Dmonkeys=yes
+
+If you want to pass extra options to ``meson`` for every element in your
+project, set the ``meson-global`` variable in your project.conf file. Here is
+an example of that:
+
+.. code:: yaml
+
+   elements:
+     meson:
+       variables:
+         meson-global: |
+           -Dmonkeys=always
+
+Here is the default configuration for the ``meson`` element in full:
+
+  .. literalinclude:: ../../../src/buildstream_plugins/elements/meson.yaml
+     :language: yaml
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.buildelement.html#core-buildelement-builtins>`_ for
+details on common configuration options for build elements.
+"""
+
+from buildstream import BuildElement
+
+
+# Element implementation for the 'meson' kind.
+class MesonElement(BuildElement):
+
+    BST_MIN_VERSION = "2.0"
+
+
+# Plugin entry point
+def setup():
+    return MesonElement
diff --git a/src/buildstream_plugins/elements/meson.yaml b/src/buildstream_plugins/elements/meson.yaml
new file mode 100644
index 0000000..2172cb3
--- /dev/null
+++ b/src/buildstream_plugins/elements/meson.yaml
@@ -0,0 +1,79 @@
+# Meson default configuration
+
+variables:
+
+  build-dir: _builddir
+
+  # Project-wide extra arguments to be passed to `meson`
+  meson-global: ''
+
+  # Element-specific extra arguments to be passed to `meson`.
+  meson-local: ''
+
+  # For backwards compatibility only, do not use.
+  meson-extra: ''
+
+  meson-args: |
+
+    --prefix=%{prefix} \
+    --bindir=%{bindir} \
+    --sbindir=%{sbindir} \
+    --sysconfdir=%{sysconfdir} \
+    --datadir=%{datadir} \
+    --includedir=%{includedir} \
+    --libdir=%{libdir} \
+    --libexecdir=%{libexecdir} \
+    --localstatedir=%{localstatedir} \
+    --sharedstatedir=%{sharedstatedir} \
+    --mandir=%{mandir} \
+    --infodir=%{infodir} %{meson-extra} %{meson-global} %{meson-local}
+
+  meson: meson %{conf-root} %{build-dir} %{meson-args}
+
+  ninja: |
+    ninja -j ${NINJAJOBS} -C %{build-dir}
+
+  ninja-install: |
+    env DESTDIR="%{install-root}" ninja -C %{build-dir} install
+
+  # Set this if the sources cannot handle parallelization.
+  #
+  # notparallel: True
+
+config:
+
+  # Commands for configuring the software
+  #
+  configure-commands:
+  - |
+    %{meson}
+
+  # Commands for building the software
+  #
+  build-commands:
+  - |
+    %{ninja}
+
+  # Commands for installing the software into a
+  # destination folder
+  #
+  install-commands:
+  - |
+    %{ninja-install}
+
+  # Commands for stripping debugging information out of
+  # installed binaries
+  #
+  strip-commands:
+  - |
+    %{strip-binaries}
+
+# Use max-jobs CPUs for building
+environment:
+  NINJAJOBS: |
+    %{max-jobs}
+
+# And don't consider NINJAJOBS as something which may
+# affect build output.
+environment-nocache:
+- NINJAJOBS

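Putting the docstring and the defaults together, a hypothetical element using this plugin might look as follows; the element names, URL alias and meson option are illustrative assumptions:

    kind: meson
    depends:
    - base.bst

    sources:
    - kind: git
      url: upstream:libfoo.git

    variables:
      meson-local: |
        -Dwerror=false

The default configuration then runs %{meson}, %{ninja} and %{ninja-install} in turn, with NINJAJOBS derived from %{max-jobs}.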

[buildstream-plugins] 12/49: Initially adding pip element

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 6c4a5f69128bf806182289dd961765e3d6fa168b
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 17:21:03 2022 +0900

    Initially adding pip element
    
    From bst-plugins-experimental
---
 src/buildstream_plugins/elements/pip.py   | 46 +++++++++++++++++++++++++++++++
 src/buildstream_plugins/elements/pip.yaml | 34 +++++++++++++++++++++++
 2 files changed, 80 insertions(+)

diff --git a/src/buildstream_plugins/elements/pip.py b/src/buildstream_plugins/elements/pip.py
new file mode 100644
index 0000000..072681a
--- /dev/null
+++ b/src/buildstream_plugins/elements/pip.py
@@ -0,0 +1,46 @@
+#
+#  Copyright (C) 2017 Mathieu Bridon
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Mathieu Bridon <bo...@daitauha.fr>
+
+"""
+pip - Pip build element
+=======================
+A `BuildElement
+<https://docs.buildstream.build/master/buildstream.buildelement.html#module-buildstream.buildelement>`_
+implementation for installing Python modules with pip
+
+The pip default configuration:
+  .. literalinclude:: ../../../src/buildstream_plugins/elements/pip.yaml
+     :language: yaml
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.buildelement.html#core-buildelement-builtins>`_ for
+details on common configuration options for build elements.
+"""
+
+from buildstream import BuildElement
+
+
+# Element implementation for the 'pip' kind.
+class PipElement(BuildElement):
+
+    BST_MIN_VERSION = "2.0"
+
+
+# Plugin entry point
+def setup():
+    return PipElement
diff --git a/src/buildstream_plugins/elements/pip.yaml b/src/buildstream_plugins/elements/pip.yaml
new file mode 100644
index 0000000..e128d50
--- /dev/null
+++ b/src/buildstream_plugins/elements/pip.yaml
@@ -0,0 +1,34 @@
+# Pip default configurations
+
+variables:
+
+  pip: pip
+  pip-flags: |
+    %{pip} install --no-deps --root=%{install-root} --prefix=%{prefix}
+  pip-install-package: |
+    %{pip-flags} %{conf-root}
+  pip-download-dir: |
+    .bst_pip_downloads
+  pip-install-dependencies: |
+    if [ -e %{pip-download-dir} ]; then %{pip-flags} %{pip-download-dir}/*; fi
+
+config:
+
+  configure-commands: []
+  build-commands: []
+
+  # Commands for installing the software into a
+  # destination folder
+  #
+  install-commands:
+  - |
+    %{pip-install-package}
+  - |
+    %{pip-install-dependencies}
+
+  # Commands for stripping debugging information out of
+  # installed binaries
+  #
+  strip-commands:
+  - |
+    %{strip-binaries}

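The install commands first install the element's own sources from %{conf-root} and then anything staged under .bst_pip_downloads, which a preceding pip source would typically provide. A minimal sketch of an element using this plugin; the names are illustrative assumptions:

    kind: pip
    depends:
    - base/python3.bst

    sources:
    - kind: git
      url: upstream:mypackage.git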

[buildstream-plugins] 16/49: requirements/: Adding initial requirements.txt files

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 325b77e4267c53dd31ff8621003bb6bddaf5c86c
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Mar 19 14:47:58 2022 +0900

    requirements/: Adding initial requirements.txt files
---
 requirements/plugin-requirements.txt | 8 ++++++++
 requirements/test-requirements.txt   | 9 +++++++++
 2 files changed, 17 insertions(+)

diff --git a/requirements/plugin-requirements.txt b/requirements/plugin-requirements.txt
new file mode 100644
index 0000000..b967ebb
--- /dev/null
+++ b/requirements/plugin-requirements.txt
@@ -0,0 +1,8 @@
+# The dependencies listed here are necessary for specific plugins, but
+# aren't required for buildstream-plugins to be installed.
+
+# Cargo source
+toml
+
+# Docker source
+requests
diff --git a/requirements/test-requirements.txt b/requirements/test-requirements.txt
new file mode 100644
index 0000000..ff91d56
--- /dev/null
+++ b/requirements/test-requirements.txt
@@ -0,0 +1,9 @@
+pytest-env
+# Provides the option to run tests in parallel (less reliable)
+pytest-xdist
+pytest >= 6.0.1
+pytest-datafiles >= 2.0
+pylint
+pycodestyle
+pyftpdlib
+


[buildstream-plugins] 03/49: Initially adding cargo source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit fee253c97b0dfb785826c29ecae64c84dfa52e99
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 16:28:06 2022 +0900

    Initially adding cargo source
    
    From bst-plugins-experimental
---
 src/buildstream_plugins/sources/cargo.py | 442 +++++++++++++++++++++++++++++++
 1 file changed, 442 insertions(+)

diff --git a/src/buildstream_plugins/sources/cargo.py b/src/buildstream_plugins/sources/cargo.py
new file mode 100644
index 0000000..bebc305
--- /dev/null
+++ b/src/buildstream_plugins/sources/cargo.py
@@ -0,0 +1,442 @@
+#
+#  Copyright (C) 2019 Codethink Limited
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Tristan Van Berkom <tr...@codethink.co.uk>
+
+"""
+cargo - Automatically stage crate dependencies
+==============================================
+A convenience Source element for vendoring rust project dependencies.
+
+Placing this source in the source list, after a source which stages a
+Cargo.lock file, will allow this source to read the Cargo.lock file and
+obtain the crates automatically into %{vendordir}.
+
+**Usage:**
+
+.. code:: yaml
+
+   # Specify the cargo source kind
+   kind: cargo
+
+   # URL of the crates repository to download from (default: https://static.crates.io/crates)
+   url: https://static.crates.io/crates
+
+   # Internal source reference, this is a list of dictionaries
+   # which store the crate names and versions.
+   #
+   # This will be automatically updated with `bst track`
+   ref:
+   - name: packagename
+     version: 1.2.1
+   - name: packagename
+     version: 1.3.0
+
+   # Specify a directory for the vendored crates (defaults to ./crates)
+   vendor-dir: crates
+
+   # Optionally specify the name of the lock file to use (defaults to Cargo.lock)
+   cargo-lock: Cargo.lock
+
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.source.html#core-source-builtins>`_ for
+details on common configuration options for sources.
+"""
+
+import contextlib
+import json
+import os.path
+import shutil
+import tarfile
+import urllib.error
+import urllib.request
+
+import toml
+from buildstream import Source, SourceFetcher, SourceError
+from buildstream import utils
+
+
+# This automatically goes into .cargo/config
+#
+_default_vendor_config_template = (
+    "[source.crates-io]\n"
+    + 'registry = "{vendorurl}"\n'
+    + 'replace-with = "vendored-sources"\n'
+    + "[source.vendored-sources]\n"
+    + 'directory = "{vendordir}"\n'
+)
+
+
+# Crate()
+#
+# Use a SourceFetcher class to be the per crate helper
+#
+# Args:
+#    cargo (Cargo): The main Source implementation
+#    name (str): The name of the crate to depend on
+#    version (str): The version of the crate to depend on
+#    sha (str|None): The sha256 checksum of the downloaded crate
+#
+class Crate(SourceFetcher):
+    def __init__(self, cargo, name, version, sha=None):
+        super().__init__()
+
+        self.cargo = cargo
+        self.name = name
+        self.version = str(version)
+        self.sha = sha
+        self.mark_download_url(self._get_url())
+
+    ########################################################
+    #     SourceFetcher API method implementations         #
+    ########################################################
+
+    def fetch(self, alias_override=None, **kwargs):
+
+        # Just a defensive check; it is impossible for the
+        # file to be already cached because Source.fetch() will
+        # not be called if the source is already cached.
+        #
+        if os.path.isfile(self._get_mirror_file()):
+            return  # pragma: nocover
+
+        # Download the crate
+        crate_url = self._get_url(alias_override)
+        with self.cargo.timed_activity("Downloading: {}".format(crate_url), silent_nested=True):
+            sha256 = self._download(crate_url)
+            if self.sha is not None and sha256 != self.sha:
+                raise SourceError(
+                    "File downloaded from {} has sha256sum '{}', not '{}'!".format(crate_url, sha256, self.sha)
+                )
+
+    ########################################################
+    #        Helper APIs for the Cargo Source to use       #
+    ########################################################
+
+    # stage()
+    #
+    # A delegate method to do the work for a single crate
+    # in Source.stage().
+    #
+    # Args:
+    #    (directory): The vendor subdirectory to stage to
+    #
+    def stage(self, directory):
+        try:
+            mirror_file = self._get_mirror_file()
+            with tarfile.open(mirror_file) as tar:
+                tar.extractall(path=directory)
+                members = tar.getmembers()
+
+            if members:
+                dirname = members[0].name.split("/")[0]
+                package_dir = os.path.join(directory, dirname)
+                checksum_file = os.path.join(package_dir, ".cargo-checksum.json")
+                with open(checksum_file, "w", encoding="utf-8") as f:
+                    checksum_data = {"package": self.sha, "files": {}}
+                    json.dump(checksum_data, f)
+
+        except (tarfile.TarError, OSError) as e:
+            raise SourceError("{}: Error staging source: {}".format(self, e)) from e
+
+    # is_cached()
+    #
+    # Get whether we have a local cached version of the source
+    #
+    # Returns:
+    #   (bool): Whether we are cached or not
+    #
+    def is_cached(self):
+        return os.path.isfile(self._get_mirror_file())
+
+    # is_resolved()
+    #
+    # Get whether the current crate is resolved
+    #
+    # Returns:
+    #   (bool): Whether we have a sha or not
+    #
+    def is_resolved(self):
+        return self.sha is not None
+
+    ########################################################
+    #                   Private helpers                    #
+    ########################################################
+
+    # _download()
+    #
+    # Downloads the crate from the url and caches it.
+    #
+    # Args:
+    #    url (str): The url to download from
+    #
+    # Returns:
+    #    (str): The sha256 checksum of the downloaded crate
+    #
+    def _download(self, url):
+
+        try:
+            with self.cargo.tempdir() as td:
+                default_name = os.path.basename(url)
+                request = urllib.request.Request(url)
+                request.add_header("Accept", "*/*")
+                request.add_header("User-Agent", "BuildStream/2")
+
+                # We do not use the etag when what we have in cache does not
+                # match the ref, so that we can recover from a corrupted
+                # download.
+                if self.sha:
+                    etag = self._get_etag(self.sha)
+                    if etag and self.is_cached():
+                        request.add_header("If-None-Match", etag)
+
+                with contextlib.closing(urllib.request.urlopen(request)) as response:
+                    info = response.info()
+
+                    etag = info["ETag"] if "ETag" in info else None
+
+                    filename = info.get_filename(default_name)
+                    filename = os.path.basename(filename)
+                    local_file = os.path.join(td, filename)
+                    with open(local_file, "wb") as dest:
+                        shutil.copyfileobj(response, dest)
+
+                # Make sure url-specific mirror dir exists.
+                os.makedirs(self._get_mirror_dir(), exist_ok=True)
+
+                # Store by sha256sum
+                sha256 = utils.sha256sum(local_file)
+                # Even if the file already exists, move the new file over.
+                # In case the old file was corrupted somehow.
+                os.rename(local_file, self._get_mirror_file(sha256))
+
+                if etag:
+                    self._store_etag(sha256, etag)
+                return sha256
+
+        except urllib.error.HTTPError as e:
+            if e.code == 304:
+                # 304 Not Modified.
+                # Because we only use the etag for a matching sha, the currently
+                # specified sha is what we would have downloaded.
+                return self.sha
+            raise SourceError("{}: Error mirroring {}: {}".format(self, url, e), temporary=True,) from e
+
+        except (urllib.error.URLError, urllib.error.ContentTooShortError, OSError,) as e:
+            raise SourceError("{}: Error mirroring {}: {}".format(self, url, e), temporary=True,) from e
+
+    # _get_url()
+    #
+    # Fetches the URL to download this crate from
+    #
+    # Args:
+    #    alias (str|None): The URL alias to apply, if any
+    #
+    # Returns:
+    #    (str): The URL for this crate
+    #
+    def _get_url(self, alias=None):
+        url = self.cargo.translate_url(self.cargo.url, alias_override=alias)
+        return "{url}/{name}/{name}-{version}.crate".format(url=url, name=self.name, version=self.version)
+
+    # _get_etag()
+    #
+    # Fetches the locally stored ETag information for this
+    # crate's download.
+    #
+    # Args:
+    #    sha (str): The sha256 checksum of the downloaded crate
+    #
+    # Returns:
+    #    (str|None): The ETag to use for requests, or None if nothing is
+    #                locally downloaded
+    #
+    def _get_etag(self, sha):
+        etagfilename = os.path.join(self._get_mirror_dir(), "{}.etag".format(sha))
+        if os.path.exists(etagfilename):
+            with open(etagfilename, "r", encoding="utf-8") as etagfile:
+                return etagfile.read()
+
+        return None
+
+    # _store_etag()
+    #
+    # Stores the locally cached ETag information for this crate.
+    #
+    # Args:
+    #    sha (str): The sha256 checksum of the downloaded crate
+    #    etag (str): The ETag to use for requests of this crate
+    #
+    def _store_etag(self, sha, etag):
+        etagfilename = os.path.join(self._get_mirror_dir(), "{}.etag".format(sha))
+        with utils.save_file_atomic(etagfilename) as etagfile:
+            etagfile.write(etag)
+
+    # _get_mirror_dir()
+    #
+    # Gets the local mirror directory for this upstream cargo repository
+    #
+    def _get_mirror_dir(self):
+        return os.path.join(
+            self.cargo.get_mirror_directory(), utils.url_directory_name(self.cargo.url), self.name, self.version,
+        )
+
+    # _get_mirror_file()
+    #
+    # Gets the local mirror filename for this crate
+    #
+    # Args:
+    #    sha (str|None): The sha256 checksum of the downloaded crate
+    #
+    def _get_mirror_file(self, sha=None):
+        return os.path.join(self._get_mirror_dir(), sha or self.sha)
+
+
+class CargoSource(Source):
+    BST_MIN_VERSION = "2.0"
+
+    # We need the Cargo.lock file to construct our ref at track time
+    BST_REQUIRES_PREVIOUS_SOURCES_TRACK = True
+
+    ########################################################
+    #       Plugin/Source API method implementations       #
+    ########################################################
+    def configure(self, node):
+
+        # The url before any aliasing
+        #
+        self.url = node.get_str("url", "https://static.crates.io/crates")
+        # XXX: should we use get_sequence here?
+        self.ref = node.get_sequence("ref", None)
+        if self.ref is not None:
+            self.ref = self.ref.strip_node_info()
+        self.cargo_lock = node.get_str("cargo-lock", "Cargo.lock")
+        self.vendor_dir = node.get_str("vendor-dir", "crates")
+
+        node.validate_keys(Source.COMMON_CONFIG_KEYS + ["url", "ref", "cargo-lock", "vendor-dir"])
+
+        self.crates = self._parse_crates(self.ref)
+
+    def preflight(self):
+        return
+
+    def get_unique_key(self):
+        return [self.url, self.cargo_lock, self.vendor_dir, self.ref]
+
+    def is_resolved(self):
+        return (self.ref is not None) and all(crate.is_resolved() for crate in self.crates)
+
+    def is_cached(self):
+        return all(crate.is_cached() for crate in self.crates)
+
+    def load_ref(self, node):
+        # XXX: this should be get_sequence, and parse_crate should expect nodes
+        self.ref = node.get_sequence("ref", None)
+        self.crates = self._parse_crates(self.ref)
+
+    def get_ref(self):
+        return self.ref
+
+    def set_ref(self, ref, node):
+        node["ref"] = self.ref = ref
+        self.crates = self._parse_crates(self.ref)
+
+    def track(self, *, previous_sources_dir):
+        new_ref = []
+        lockfile = os.path.join(previous_sources_dir, self.cargo_lock)
+
+        try:
+            with open(lockfile, "r", encoding="utf-8") as f:
+                try:
+                    lock = toml.load(f)
+                except toml.TomlDecodeError as e:
+                    raise SourceError(
+                        "Malformed Cargo.lock file at: {}".format(self.cargo_lock), detail="{}".format(e),
+                    ) from e
+        except FileNotFoundError as e:
+            raise SourceError(
+                "Failed to find Cargo.lock file at: {}".format(self.cargo_lock),
+                detail="The cargo plugin expects to find a Cargo.lock file in\n"
+                + "the sources staged before it in the source list, but none was found.",
+            ) from e
+
+        # FIXME: Better validation would be good here, so we can raise more
+        #        useful error messages in the case of a malformed Cargo.lock file.
+        #
+        for package in lock["package"]:
+            if "source" not in package:
+                continue
+            new_ref += [{"name": package["name"], "version": str(package["version"])}]
+
+        # Make sure the order we set it at track time is deterministic
+        new_ref = sorted(new_ref, key=lambda c: (c["name"], c["version"]))
+
+        # Download the crates and get their shas
+        for crate_obj in new_ref:
+            crate = Crate(self, crate_obj["name"], crate_obj["version"])
+
+            crate_url = crate._get_url()
+            with self.timed_activity("Downloading: {}".format(crate_url), silent_nested=True):
+                crate_obj["sha"] = crate._download(crate_url)
+
+        return new_ref
+
+    def stage(self, directory):
+
+        # Stage the crates into the vendor directory
+        vendor_dir = os.path.join(directory, self.vendor_dir)
+        for crate in self.crates:
+            crate.stage(vendor_dir)
+
+        # Stage our vendor config
+        vendor_config = _default_vendor_config_template.format(
+            vendorurl=self.translate_url(self.url), vendordir=self.vendor_dir
+        )
+        conf_dir = os.path.join(directory, ".cargo")
+        conf_file = os.path.join(conf_dir, "config")
+        os.makedirs(conf_dir, exist_ok=True)
+        with open(conf_file, "w", encoding="utf-8") as f:
+            f.write(vendor_config)
+
+    def get_source_fetchers(self):
+        return self.crates
+
+    ########################################################
+    #                   Private helpers                    #
+    ########################################################
+
+    # _parse_crates():
+    #
+    # Generates a list of crates based on the passed ref
+    #
+    # Args:
+    #    (list|None) refs: The list of name/version dictionaries
+    #
+    # Returns:
+    #    (list): A list of Crate objects
+    #
+    def _parse_crates(self, refs):
+
+        # Return an empty list for no ref
+        if refs is None:
+            return []
+
+        return [Crate(self, crate["name"], crate["version"], sha=crate.get("sha", None),) for crate in refs]
+
+
+def setup():
+    return CargoSource

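Because the source sets BST_REQUIRES_PREVIOUS_SOURCES_TRACK, `bst source track` reads the Cargo.lock staged by the sources listed before it and records a sorted name/version/sha list as the ref. A sketch of a rust element combining a git source with this plugin; the URL alias and build command are illustrative assumptions:

    kind: manual
    sources:
    - kind: git
      url: upstream:hello-rs.git
    - kind: cargo
      # ref: the crate list, filled in by `bst source track`

    config:
      build-commands:
      - cargo build --offline

At stage time the crates are unpacked into the vendor directory (./crates by default) and the generated .cargo/config redirects crates-io to those vendored sources, which is what makes an offline build possible.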

[buildstream-plugins] 15/49: Added project.conf to allow using these plugins with a junction

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit eac4b69b8f8b07b224c4fbadc2dc2215662c45fd
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 17:53:52 2022 +0900

    Added project.conf to allow using these plugins with a junction
---
 project.conf | 27 +++++++++++++++++++++++++++
 1 file changed, 27 insertions(+)

diff --git a/project.conf b/project.conf
new file mode 100644
index 0000000..a89e915
--- /dev/null
+++ b/project.conf
@@ -0,0 +1,27 @@
+#
+# This project.conf exposes the plugins as a buildstream project so
+# that plugins can be loaded via junctions.
+#
+name: buildstream-plugins
+min-version: 2.0
+
+plugins:
+- origin: local
+  path: src/buildstream_plugins/elements
+  elements:
+  - autotools
+  - cmake
+  - make
+  - meson
+  - pip
+  - setuptools
+
+- origin: local
+  path: src/buildstream_plugins/sources
+  sources:
+  - bzr
+  - cargo
+  - docker
+  - git
+  - patch
+  - pip

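A downstream project can then consume these plugins by junctioning this repository and declaring an origin: junction entry. A sketch; the element file name and tracked branch are illustrative:

    # elements/buildstream-plugins.bst
    kind: junction
    sources:
    - kind: git
      url: https://gitbox.apache.org/repos/asf/buildstream-plugins.git
      track: master

    # project.conf of the downstream project
    plugins:
    - origin: junction
      junction: buildstream-plugins.bst
      elements:
      - cmake
      sources:
      - git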

[buildstream-plugins] 14/49: Initially adding autotools element

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit f803536ac2fbf8365edc248128221a1dd42607c2
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 17:27:38 2022 +0900

    Initially adding autotools element
    
    From buildstream core plugins
---
 src/buildstream_plugins/elements/autotools.py   |  70 +++++++++++++
 src/buildstream_plugins/elements/autotools.yaml | 129 ++++++++++++++++++++++++
 2 files changed, 199 insertions(+)

diff --git a/src/buildstream_plugins/elements/autotools.py b/src/buildstream_plugins/elements/autotools.py
new file mode 100644
index 0000000..7f9e6d0
--- /dev/null
+++ b/src/buildstream_plugins/elements/autotools.py
@@ -0,0 +1,70 @@
+#
+#  Copyright (C) 2016, 2018 Codethink Limited
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Tristan Van Berkom <tr...@codethink.co.uk>
+
+"""
+autotools - Autotools build element
+===================================
+This is a :mod:`BuildElement <buildstream.buildelement>` implementation for
+using Autotools build scripts (also known as the `GNU Build System
+<https://en.wikipedia.org/wiki/GNU_Build_System>`_).
+
+You will often want to pass additional arguments to ``configure``. This should
+be done on a per-element basis by setting the ``conf-local`` variable.  Here is
+an example:
+
+.. code:: yaml
+
+   variables:
+     conf-local: |
+       --disable-foo --enable-bar
+
+If you want to pass extra options to ``configure`` for every element in your
+project, set the ``conf-global`` variable in your project.conf file. Here is
+an example of that:
+
+.. code:: yaml
+
+   elements:
+     autotools:
+       variables:
+         conf-global: |
+           --disable-gtk-doc --disable-static
+
+Here is the default configuration for the ``autotools`` element in full:
+
+  .. literalinclude:: ../../../src/buildstream_plugins/elements/autotools.yaml
+     :language: yaml
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.buildelement.html#core-buildelement-builtins>`_ for
+details on common configuration options for build elements.
+"""
+
+from buildstream import BuildElement
+
+
+# Element implementation for the 'autotools' kind.
+class AutotoolsElement(BuildElement):
+    # pylint: disable=attribute-defined-outside-init
+
+    BST_MIN_VERSION = "2.0"
+
+
+# Plugin entry point
+def setup():
+    return AutotoolsElement
diff --git a/src/buildstream_plugins/elements/autotools.yaml b/src/buildstream_plugins/elements/autotools.yaml
new file mode 100644
index 0000000..85f7393
--- /dev/null
+++ b/src/buildstream_plugins/elements/autotools.yaml
@@ -0,0 +1,129 @@
+# Autotools default configurations
+
+variables:
+
+  autogen: |
+    export NOCONFIGURE=1;
+
+    if [ -x %{conf-cmd} ]; then true;
+    elif [ -x %{conf-root}/autogen ]; then %{conf-root}/autogen;
+    elif [ -x %{conf-root}/autogen.sh ]; then %{conf-root}/autogen.sh;
+    elif [ -x %{conf-root}/bootstrap ]; then %{conf-root}/bootstrap;
+    elif [ -x %{conf-root}/bootstrap.sh ]; then %{conf-root}/bootstrap.sh;
+    else autoreconf -ivf %{conf-root};
+    fi
+
+  # Project-wide extra arguments to be passed to `configure`
+  conf-global: ''
+
+  # Element-specific extra arguments to be passed to `configure`.
+  conf-local: ''
+
+  # For backwards compatibility only, do not use.
+  conf-extra: ''
+
+  conf-cmd: "%{conf-root}/configure"
+  
+  conf-args: |
+
+    --prefix=%{prefix} \
+    --exec-prefix=%{exec_prefix} \
+    --bindir=%{bindir} \
+    --sbindir=%{sbindir} \
+    --sysconfdir=%{sysconfdir} \
+    --datadir=%{datadir} \
+    --includedir=%{includedir} \
+    --libdir=%{libdir} \
+    --libexecdir=%{libexecdir} \
+    --localstatedir=%{localstatedir} \
+    --sharedstatedir=%{sharedstatedir} \
+    --mandir=%{mandir} \
+    --infodir=%{infodir} %{conf-extra} %{conf-global} %{conf-local}
+
+  configure: |
+
+    %{conf-cmd} %{conf-args}
+
+  make: make
+  make-install: make -j1 DESTDIR="%{install-root}" install
+
+  # Set this if the sources cannot handle parallelization.
+  #
+  # notparallel: True
+
+
+  # Automatically remove libtool archive files
+  #
+  # Set remove-libtool-modules to "true" to remove .la files for 
+  # modules intended to be opened with lt_dlopen()
+  #
+  # Set remove-libtool-libraries to "true" to remove .la files for
+  # libraries
+  #
+  # Value must be "true" or "false"
+  remove-libtool-modules: "false"  
+  remove-libtool-libraries: "false"
+
+  delete-libtool-archives: |
+    if %{remove-libtool-modules} || %{remove-libtool-libraries}; then
+      find "%{install-root}" -name "*.la" -print0 | while read -d '' -r file; do
+        if grep '^shouldnotlink=yes$' "${file}" &>/dev/null; then
+          if %{remove-libtool-modules}; then
+            echo "Removing ${file}."
+            rm "${file}"
+          else
+            echo "Not removing ${file}."
+          fi
+        else
+          if %{remove-libtool-libraries}; then
+            echo "Removing ${file}."
+            rm "${file}"
+          else
+            echo "Not removing ${file}."
+          fi
+        fi
+      done
+    fi
+
+config:
+
+  # Commands for configuring the software
+  #
+  configure-commands:
+  - |
+    %{autogen}
+  - |
+    %{configure}
+
+  # Commands for building the software
+  #
+  build-commands:
+  - |
+    %{make}
+
+  # Commands for installing the software into a
+  # destination folder
+  #
+  install-commands:
+  - |
+    %{make-install}
+  - |
+    %{delete-libtool-archives}
+
+  # Commands for stripping debugging information out of
+  # installed binaries
+  #
+  strip-commands:
+  - |
+    %{strip-binaries}
+
+# Use max-jobs CPUs for building and enable verbosity
+environment:
+  MAKEFLAGS: -j%{max-jobs}
+  V: 1
+
+# And don't consider MAKEFLAGS or V as something which may
+# affect build output.
+environment-nocache:
+- MAKEFLAGS
+- V

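A hypothetical element built with this plugin, opting into the libtool archive removal described above; the names and configure option are illustrative assumptions:

    kind: autotools
    depends:
    - base.bst

    sources:
    - kind: git
      url: upstream:hello.git

    variables:
      conf-local: |
        --disable-static
      remove-libtool-libraries: "true"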

[buildstream-plugins] 09/49: Initially adding cmake element

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 6e51cbca386c787090b7c561b87a6f89997d77da
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 17:14:29 2022 +0900

    Initially adding cmake element
    
    From bst-plugins-experimental
---
 src/buildstream_plugins/elements/cmake.py   | 69 +++++++++++++++++++++++++++
 src/buildstream_plugins/elements/cmake.yaml | 72 +++++++++++++++++++++++++++++
 2 files changed, 141 insertions(+)

diff --git a/src/buildstream_plugins/elements/cmake.py b/src/buildstream_plugins/elements/cmake.py
new file mode 100644
index 0000000..ccecaa5
--- /dev/null
+++ b/src/buildstream_plugins/elements/cmake.py
@@ -0,0 +1,69 @@
+#
+#  Copyright (C) 2018 Codethink Limited
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Tristan Van Berkom <tr...@codethink.co.uk>
+
+"""
+cmake - CMake build element
+===========================
+This is a `BuildElement
+<https://docs.buildstream.build/master/buildstream.buildelement.html#module-buildstream.buildelement>`_
+implementation for using the `CMake <https://cmake.org/>`_ build system.
+
+You will often want to pass additional arguments to the ``cmake`` program for
+specific configuration options. This should be done on a per-element basis by
+setting the ``cmake-local`` variable.  Here is an example:
+
+.. code:: yaml
+
+   variables:
+     cmake-local: |
+       -DCMAKE_BUILD_TYPE=Debug
+
+If you want to pass extra options to ``cmake`` for every element in your
+project, set the ``cmake-global`` variable in your project.conf file. Here is
+an example of that:
+
+.. code:: yaml
+
+   elements:
+     cmake:
+       variables:
+         cmake-global: |
+           -DCMAKE_BUILD_TYPE=Release
+
+Here is the default configuration for the ``cmake`` element in full:
+
+  .. literalinclude:: ../../../src/buildstream_plugins/elements/cmake.yaml
+     :language: yaml
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.buildelement.html#core-buildelement-builtins>`_ for
+details on common configuration options for build elements.
+"""
+
+from buildstream import BuildElement
+
+
+# Element implementation for the 'cmake' kind.
+class CMakeElement(BuildElement):
+
+    BST_MIN_VERSION = "2.0"
+
+
+# Plugin entry point
+def setup():
+    return CMakeElement
diff --git a/src/buildstream_plugins/elements/cmake.yaml b/src/buildstream_plugins/elements/cmake.yaml
new file mode 100644
index 0000000..cffa201
--- /dev/null
+++ b/src/buildstream_plugins/elements/cmake.yaml
@@ -0,0 +1,72 @@
+# CMake default configuration
+
+variables:
+
+  build-dir: _builddir
+
+  # Project-wide extra arguments to be passed to `cmake`
+  cmake-global: ''
+
+  # Element-specific extra arguments to be passed to `cmake`.
+  cmake-local: ''
+
+  # For backwards compatibility only, do not use.
+  cmake-extra: ''
+
+  # The cmake generator to use
+  generator: Ninja
+
+  cmake-args: |
+
+    -DCMAKE_INSTALL_PREFIX:PATH="%{prefix}" \
+    -DCMAKE_INSTALL_LIBDIR:PATH="%{lib}" %{cmake-extra} %{cmake-global} %{cmake-local}
+
+  cmake: |
+
+    cmake -B%{build-dir} -H"%{conf-root}" -G"%{generator}" %{cmake-args}
+
+  make: cmake --build %{build-dir} -- ${JOBS}
+  make-install: env DESTDIR="%{install-root}" cmake --build %{build-dir} --target install
+
+  # Set this if the sources cannot handle parallelization.
+  #
+  # notparallel: True
+
+config:
+
+  # Commands for configuring the software
+  #
+  configure-commands:
+  - |
+    %{cmake}
+
+  # Commands for building the software
+  #
+  build-commands:
+  - |
+    %{make}
+
+  # Commands for installing the software into a
+  # destination folder
+  #
+  install-commands:
+  - |
+    %{make-install}
+
+  # Commands for stripping debugging information out of
+  # installed binaries
+  #
+  strip-commands:
+  - |
+    %{strip-binaries}
+
+# Use max-jobs CPUs for building and enable verbosity
+environment:
+  JOBS: -j%{max-jobs}
+  V: 1
+
+# And don't consider JOBS or V as something which may
+# affect build output.
+environment-nocache:
+- JOBS
+- V

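Since the default generator is Ninja, the build sandbox must provide ninja in addition to cmake, which is why the test projects fetch ninja via the zip source junction added in commit 44/49. A sketch of an element using this plugin; the names and option are illustrative assumptions:

    kind: cmake
    depends:
    - base.bst

    sources:
    - kind: git
      url: upstream:libbar.git

    variables:
      cmake-local: |
        -DBUILD_TESTING=OFF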

[buildstream-plugins] 06/49: Initially adding patch source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 3ad513fa2d94be52e63f66263227e6b355555ef6
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 16:40:11 2022 +0900

    Initially adding patch source
    
    From buildstream core plugins
---
 src/buildstream_plugins/sources/patch.py | 110 +++++++++++++++++++++++++++++++
 1 file changed, 110 insertions(+)

diff --git a/src/buildstream_plugins/sources/patch.py b/src/buildstream_plugins/sources/patch.py
new file mode 100644
index 0000000..2228723
--- /dev/null
+++ b/src/buildstream_plugins/sources/patch.py
@@ -0,0 +1,110 @@
+#
+#  Copyright Bloomberg Finance LP
+#  Copyright (C) 2018 Codethink Limited
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Chandan Singh <cs...@bloomberg.net>
+#        Tiago Gomes <ti...@codethink.co.uk>
+
+"""
+patch - apply locally stored patches
+====================================
+
+**Host dependencies:**
+
+  * patch
+
+**Usage:**
+
+.. code:: yaml
+
+   # Specify the local source kind
+   kind: patch
+
+   # Specify the project relative path to a patch file
+   path: files/somefile.diff
+
+   # Optionally specify the strip level, defaults to 1
+   strip-level: 1
+
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.source.html#core-source-builtins>`_ for
+details on common configuration options for sources.
+"""
+
+import os
+from buildstream import Source, SourceError
+from buildstream import utils
+
+
+class PatchSource(Source):
+    # pylint: disable=attribute-defined-outside-init
+
+    BST_MIN_VERSION = "2.0"
+
+    BST_REQUIRES_PREVIOUS_SOURCES_STAGE = True
+
+    def configure(self, node):
+        node.validate_keys(["path", "strip-level", *Source.COMMON_CONFIG_KEYS])
+        self.path = self.node_get_project_path(node.get_scalar("path"), check_is_file=True)
+        self.strip_level = node.get_int("strip-level", default=1)
+        self.fullpath = os.path.join(self.get_project_directory(), self.path)
+
+    def preflight(self):
+        # Check if patch is installed, get the binary at the same time
+        self.host_patch = utils.get_host_tool("patch")
+
+    def get_unique_key(self):
+        return [self.path, utils.sha256sum(self.fullpath), self.strip_level]
+
+    def is_resolved(self):
+        return True
+
+    def is_cached(self):
+        return True
+
+    def load_ref(self, node):
+        pass
+
+    def get_ref(self):
+        return None  # pragma: nocover
+
+    def set_ref(self, ref, node):
+        pass  # pragma: nocover
+
+    def fetch(self):  # pylint: disable=arguments-differ
+        # Nothing to do here for a local source
+        pass  # pragma: nocover
+
+    def stage(self, directory):
+        with self.timed_activity("Applying local patch: {}".format(self.path)):
+
+            # Bail out with a comprehensive message if the target directory is empty
+            if not os.listdir(directory):
+                raise SourceError(
+                    "Nothing to patch in directory '{}'".format(directory), reason="patch-no-files",
+                )
+
+            strip_level_option = "-p{}".format(self.strip_level)
+            self.call(
+                [self.host_patch, strip_level_option, "-i", self.fullpath, "-d", directory,],
+                fail="Failed to apply patch {}".format(self.path),
+            )
+
+
+# Plugin entry point
+def setup():
+    return PatchSource

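Because the plugin sets BST_REQUIRES_PREVIOUS_SOURCES_STAGE, the patch is applied on top of whatever the earlier sources staged, so it must come after them in the source list. A sketch; the paths are illustrative assumptions:

    kind: import
    sources:
    - kind: local
      path: files/hello
    - kind: patch
      path: files/hello-fix.diff
      strip-level: 1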

[buildstream-plugins] 21/49: Adding pyproject.toml

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 10f5d1cbb4f7f3bb3bf703a71bc5211536957c10
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 13:51:50 2022 +0900

    Adding pyproject.toml
---
 pyproject.toml | 26 ++++++++++++++++++++++++++
 1 file changed, 26 insertions(+)

diff --git a/pyproject.toml b/pyproject.toml
new file mode 100644
index 0000000..fefbbec
--- /dev/null
+++ b/pyproject.toml
@@ -0,0 +1,26 @@
+[build-system]
+requires = [
+    # We need at least version 36.6.0 that introduced "build_meta"
+    "setuptools>=36.6.0",
+    # In order to build wheels, and as required by PEP 517
+    "wheel",
+    "Cython"
+]
+build-backend = "setuptools.build_meta"
+
+[tool.black]
+line-length = 119
+exclude = '''
+(
+  /(
+      \.eggs
+    | \.git
+    | \.mypy_cache
+    | \.tox
+    | _build
+    | build
+    | dist
+  )/
+  | src/buildstream/_protos
+)
+'''


[buildstream-plugins] 40/49: .github/workflows: Adding the ci/merge/release workflows

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 1c51f01175b245c3e1d3fb38af3618df0db1a3c7
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 25 14:53:30 2022 +0900

    .github/workflows: Adding the ci/merge/release workflows
---
 .github/workflows/ci.yml    | 81 ++++++++++++++++++++++++++++++++++++++++++++
 .github/workflows/merge.yml | 82 +++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 163 insertions(+)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
new file mode 100644
index 0000000..7017a26
--- /dev/null
+++ b/.github/workflows/ci.yml
@@ -0,0 +1,81 @@
+name: PR Checks
+
+# Pre-merge CI to run on push and pull_request events; even if this seems
+# redundant, the concurrency configuration below avoids duplicate runs.
+#
+on:
+  pull_request:
+  workflow_dispatch:
+
+# Use the concurrency feature to ensure we don't run redundant workflows
+#
+concurrency:
+  group: ${{ github.repository }}-${{ github.ref }}-${{ github.workflow }}
+  cancel-in-progress: true
+
+# Left to-do:
+# - coverage
+# - publishing docs to gh-pages
+# - persistent artifact cache
+# - overnight jobs
+# - wsl tasks (TODO: Check if GitHub's Windows runners allow WSL)
+#
+# New opportunities:
+# - run tests on mac (GitHub provides MacOS runners)
+# - standardize WSL tasks by using GitHub-provided runners
+
+jobs:
+  tests:
+    runs-on: ubuntu-20.04
+    continue-on-error: ${{ matrix.allow-failure || false }}
+
+    strategy:
+      fail-fast: false
+      matrix:
+
+        # The names here should map to a valid service defined in
+        # "../compose/ci.docker-compose.yml"
+        test-name:
+          - debian-10
+          - fedora-34
+          - fedora-35
+          - fedora-missing-deps
+          - lint
+          - mypy
+
+        include:
+          - test-name: bst-master
+            allow-failure: true
+
+    steps:
+      - name: Check out repository
+        uses: actions/checkout@v2
+        # BuildStream requires tags to be able to find its version.
+        with:
+          fetch-depth: 0
+
+      - name: Run tests with Docker Compose
+        run: |
+          ${GITHUB_WORKSPACE}/.github/run-ci.sh ${{ matrix.test-name }}
+
+  docs:
+    runs-on: ubuntu-20.04
+    steps:
+      - name: Check out repository
+        uses: actions/checkout@v2
+        # BuildStream requires tags to be able to find its version.
+        with:
+          fetch-depth: 0
+
+      - name: Give `testuser` ownership of the source directory
+        run: sudo chown -R 1000:1000 ${GITHUB_WORKSPACE}
+
+      - name: Build documentation using Docker Compose
+        run: |
+          ${GITHUB_WORKSPACE}/.github/run-ci.sh docs
+
+      - name: Upload artifacts
+        uses: actions/upload-artifact@v2
+        with:
+          name: docs
+          path: doc/build/html
diff --git a/.github/workflows/merge.yml b/.github/workflows/merge.yml
new file mode 100644
index 0000000..8bb492e
--- /dev/null
+++ b/.github/workflows/merge.yml
@@ -0,0 +1,82 @@
+name: Merge actions
+
+on:
+  push:
+    branches:
+    - master
+
+jobs:
+  build:
+    name: Build documentation
+    runs-on: ubuntu-20.04
+    steps:
+    - name: Checkout code
+      uses: actions/checkout@v2
+      # BuildStream requires tags to be able to find its version.
+      with:
+        fetch-depth: 0
+
+    - name: Give `testuser` ownership of the source directory
+      run: sudo chown -R 1000:1000 ${GITHUB_WORKSPACE}
+
+    - name: Build documentation using Docker Compose
+      run: |
+        ${GITHUB_WORKSPACE}/.github/run-ci.sh docs
+
+        # Restore permissions to the current user
+        sudo chown -R ${USER} ${GITHUB_WORKSPACE}
+
+        # Include a tarball in the published docs, allowing for
+        # easy re-publishing of master docs on docs.buildstream.build
+        tar -C doc/build/html -zcf docs.tgz .
+
+    - name: Upload artifacts
+      uses: actions/upload-artifact@v2
+      with:
+        name: docs
+        path: |
+          doc/build/html
+          docs.tgz
+
+  publish:
+    needs: build
+    runs-on: ubuntu-20.04
+    steps:
+
+    - name: Download artifact
+      uses: actions/download-artifact@v2
+      with:
+        name: docs
+        path: docs
+
+    - name: Checkout code
+      uses: actions/checkout@v2
+      with:
+        ref: gh-pages
+        path: pages
+        fetch-depth: 0
+
+    - name: Update repo
+      run: |
+
+        # First reset the branch state to the initial commit, this ensures that
+        # we do not pollute the repository with the history of every docs package
+        # we've ever published (history of docs packages for major releases is
+        # also stored as GitHub release assets)
+        #
+        cd pages/
+        git reset --hard GH_PAGES_FIRST_COMMIT
+
+        # Copy the docs asset over and overwrite the orphan gh-pages branch, ensure
+        # that we disable GitHub's jekyll by creating the .nojekyll file, otherwise
+        # it will interfere with the rendering of the site.
+        #
+        cp -a ../docs/doc/build/html/* .
+        cp -a ../docs/docs.tgz .
+        touch .nojekyll
+
+        git add .
+        git config --local user.email "merge-ci@ponyland"
+        git config --local user.name  "Github Actions Nightly Job"
+        git commit -m "Update repo for docs build $GITHUB_RUN_NUMBER"
+        git push --force "https://$GITHUB_ACTOR:$GITHUB_TOKEN@github.com/$GITHUB_REPOSITORY.git" gh-pages


[buildstream-plugins] 08/49: Initially adding docker source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit d848a2426f442fc256b3710f0677554e8cb8ea20
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 16:56:11 2022 +0900

    Initially adding docker source
    
    From bst-plugins-container
---
 src/buildstream_plugins/sources/docker.py | 568 ++++++++++++++++++++++++++++++
 1 file changed, 568 insertions(+)

diff --git a/src/buildstream_plugins/sources/docker.py b/src/buildstream_plugins/sources/docker.py
new file mode 100644
index 0000000..4b2afde
--- /dev/null
+++ b/src/buildstream_plugins/sources/docker.py
@@ -0,0 +1,568 @@
+#
+#  Copyright (C) 2017 Codethink Limited
+#  Copyright (C) 2018 Bloomberg Finance LP
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Sam Thursfield <sa...@codethink.co.uk>
+#        Chandan Singh <cs...@bloomberg.net>
+
+"""
+docker - stage files from Docker images
+=======================================
+
+**Usage:**
+
+.. code:: yaml
+
+   # Specify the docker source kind
+   kind: docker
+
+   # Specify the registry endpoint, defaults to Docker Hub (optional)
+   registry-url: https://registry.hub.docker.com
+
+   # Image path (required)
+   image: library/alpine
+
+   # Image tag to follow (optional)
+   track: latest
+
+   # Specify the digest of the exact image to use (required)
+   ref: 6c9f6f68a131ec6381da82f2bff978083ed7f4f7991d931bfa767b7965ebc94b
+
+   # Some images are built for multiple platforms. When tracking a tag, we
+   # will choose which image to use based on these settings. Default values
+   # are chosen based on the output of `uname -m` and `uname -s`, but you
+   # can override them.
+   #architecture: arm64
+   #os: linux
+
+Note that Docker images may contain device nodes. BuildStream elements cannot
+contain device nodes so those will be dropped. Any regular files in the /dev
+directory will also be dropped.
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.source.html#core-source-builtins>`_ for
+details on common configuration options for sources.
+"""
+
+import hashlib
+import json
+import os
+import platform
+import shutil
+import tarfile
+import urllib.parse
+
+import requests
+
+from buildstream import Source, SourceError
+from buildstream.utils import (
+    save_file_atomic,
+    sha256sum,
+    link_files,
+    move_atomic,
+)
+
+_DOCKER_HUB_URL = "https://registry.hub.docker.com"
+
+
+def parse_bearer_authorization_challenge(text):
+    # Hand-written and probably broken parsing of the Www-Authenticate
+    # response. I can't find a built-in way to parse this, but I probably
+    # didn't look hard enough.
+    if not text.startswith("Bearer "):
+        raise SourceError("Unexpected Www-Authenticate response: %{}".format(text))
+
+    pairs = {}
+    text = text[len("Bearer ") :]
+    for pair in text.split(","):
+        key, value = pair.split("=")
+        pairs[key] = value[1:-1]
+    return pairs
+
+
+def default_architecture():
+    machine = platform.machine()
+    if machine == "x86_64":
+        return "amd64"
+    elif machine == "aarch64":
+        return "arm64"
+    else:
+        return machine
+
+
+def default_os():
+    return platform.system().lower()
+
+
+# Variant of urllib.parse.urljoin() allowing multiple path components at once.
+def urljoin(url, *args):
+    for arg in args:
+        if not url.endswith("/"):
+            url += "/"
+        url = urllib.parse.urljoin(url, arg.lstrip("/"))
+    return url
+
+
+# DockerManifestError
+#
+# Raised if something goes wrong while querying an image manifest from a remote
+# registry.
+#
+class DockerManifestError(SourceError):
+    def __init__(self, message, manifest=None):
+        super().__init__(message)
+        self.manifest = manifest
+
+
+class DockerRegistryV2Client:
+    def __init__(self, endpoint, api_timeout=3):
+        self.endpoint = endpoint
+        self.api_timeout = api_timeout
+
+        self.token = None
+
+    def _request(self, subpath, extra_headers=None, stream=False, _reauthorized=False):
+        if not extra_headers:
+            extra_headers = {}
+
+        headers = {"content-type": "application/json"}
+        headers.update(extra_headers)
+
+        if self.token:
+            headers["Authorization"] = "Bearer {}".format(self.token)
+
+        url = urljoin(self.endpoint, "v2", subpath)
+        response = requests.get(url, headers=headers, stream=stream, timeout=self.api_timeout)
+
+        if response.status_code == requests.codes["unauthorized"] and not _reauthorized:
+            # This request requires (re)authorization. See:
+            # https://docs.docker.com/registry/spec/auth/token/
+            auth_challenge = response.headers["Www-Authenticate"]
+            auth_vars = parse_bearer_authorization_challenge(auth_challenge)
+            self._auth(auth_vars["realm"], auth_vars["service"], auth_vars["scope"])
+            return self._request(subpath, extra_headers=extra_headers, _reauthorized=True)
+        else:
+            response.raise_for_status()
+
+            return response
+
+    def _auth(self, realm, service, scope):
+        # Respond to an Www-Authenticate challenge by requesting the necessary
+        # token from the 'realm' (endpoint) that we were given in the challenge.
+        request_url = "{}?service={}&scope={}".format(realm, service, scope)
+        response = requests.get(request_url, timeout=self.api_timeout)
+        response.raise_for_status()
+        self.token = response.json()["token"]
+
+    # digest():
+    #
+    # Calculate a Docker-compatible digest of an arbitrary string of bytes.
+    #
+    # Args:
+    #    content (bytes): Content to hash
+    #
+    # Returns:
+    #    (str) A Docker-compatible digest of 'content'
+    @staticmethod
+    def digest(content):
+        digest_hash = hashlib.sha256()
+        digest_hash.update(content)
+        return "sha256:" + digest_hash.hexdigest()
+
+    # manifest():
+    #
+    # Fetches the image manifest for a given image from the remote registry.
+    #
+    # If this is a "fat" (multiplatform) image, the 'artitecture' and 'os'
+    # parameters control which of the available images is chosen.
+    #
+    # The manifest is returned verbatim, so you need to parse it yourself
+    # with json.loads() to get at its contents. The verbatim text can be used
+    # to recalculate the content digest, just encode it and pass to .digest().
+    # If we returned only the parsed JSON data you wouldn't be able to do this.
+    #
+    # Args:
+    #    image_path (str): Relative path to the image, e.g. library/alpine
+    #    reference (str): Either a tag name (such as 'latest') or the content
+    #                     digest of an exact version of the image.
+    #    architecture (str): Architecture name (amd64, arm64, etc.)
+    #    os_ (str): OS name (e.g. linux)
+    #
+    # Raises:
+    #    requests.RequestException, if network errors occur
+    #
+    # Returns:
+    #    (str, str): A tuple of the manifest content as text, and its content hash
+    def manifest(
+        self, image_path, reference, architecture=default_architecture(), os_=default_os(),
+    ):
+        # pylint: disable=too-many-locals
+
+        accept_types = [
+            "application/vnd.docker.distribution.manifest.v2+json",
+            "application/vnd.docker.distribution.manifest.list.v2+json",
+        ]
+
+        manifest_url = urljoin(image_path, "manifests", urllib.parse.quote(reference))
+        response = self._request(manifest_url, extra_headers={"Accept": ",".join(accept_types)})
+
+        try:
+            manifest = json.loads(response.text)
+        except json.JSONDecodeError as e:
+            raise DockerManifestError(
+                "Server did not return a valid manifest: {}".format(e), manifest=response.text,
+            ) from e
+
+        schema_version = manifest.get("schemaVersion")
+        if schema_version == 1:
+            raise DockerManifestError("Schema version 1 is unsupported.", manifest=response.text)
+        if schema_version is None:
+            raise DockerManifestError(
+                "Manifest did not include the schemaVersion key.", manifest=response.text,
+            )
+
+        our_digest = self.digest(response.text.encode("utf8"))
+        their_digest = response.headers.get("Docker-Content-Digest")
+
+        if not their_digest:
+            raise DockerManifestError(
+                "Server did not set the Docker-Content-Digest header.", manifest=response.text,
+            )
+        if our_digest != their_digest:
+            raise DockerManifestError(
+                "Server returned a non-matching content digest. "
+                "Our digest: {}, their digest: {}".format(our_digest, their_digest),
+                manifest=response.text,
+            )
+
+        if manifest["mediaType"] == "application/vnd.docker.distribution.manifest.list.v2+json":
+            # This is a "fat manifest", we need to narrow down to a specific
+            # architecture.
+            for sub in manifest["manifests"]:
+                if sub["platform"]["architecture"] == architecture and sub["platform"]["os"] == os_:
+                    sub_digest = sub["digest"]
+                    return self.manifest(image_path, sub_digest, architecture=architecture, os_=os_,)
+
+            raise DockerManifestError(
+                "No images found for architecture {}, OS {}".format(architecture, os_), manifest=response.text,
+            )
+        elif manifest["mediaType"] == "application/vnd.docker.distribution.manifest.v2+json":
+            return response.text, our_digest
+        else:
+            raise DockerManifestError(
+                "Unsupported manifest type {}".format(manifest["mediaType"]), manifest=response.text,
+            )
+
+    # blob():
+    #
+    # Fetch a blob from the remote registry. This is used for getting each
+    # layer of an image in tar.gz format.
+    #
+    # Raises:
+    #    requests.RequestException, if network errors occur
+    #
+    # Args:
+    #    image_path (str): Relative path to the image, e.g. library/alpine
+    #    blob_digest (str): Content hash of the blob.
+    #    download_to (str): Path to a file where the content will be written.
+    def blob(self, image_path, blob_digest, download_to):
+        blob_url = urljoin(image_path, "blobs", urllib.parse.quote(blob_digest))
+
+        response = self._request(blob_url, stream=True)
+
+        with save_file_atomic(download_to, "wb") as f:
+            shutil.copyfileobj(response.raw, f)
+
+
+class ReadableTarInfo(tarfile.TarInfo):
+    """
+    The goal is to override `TarFile`'s `extractall` semantics by ensuring that, on
+    extraction, the files are readable by the owner of the file. This is done by
+    overriding the accessor for the `mode` attribute in `TarInfo`, the class that
+    encapsulates the internal metadata of the tarball, so that the owner-read bit
+    is always set.
+    """
+
+    # The mode attribute is not declared as a property and so
+    # this trips up the static type checker, mark this as "ignore"
+    #
+    @property  # type: ignore
+    def mode(self):
+        # ensure file is readable by owner
+        return self.__permission | 0o400
+
+    @mode.setter
+    def mode(self, permission):
+        self.__permission = permission
+
+
+class DockerSource(Source):
+    # pylint: disable=too-many-instance-attributes
+
+    BST_MIN_VERSION = "2.0"
+
+    # Docker identifies images by a content digest calculated from the image's
+    # manifest. This corresponds well with the concept of a 'ref' in
+    # BuildStream. However, Docker theoretically supports multiple hash
+    # methods while BuildStream does not. Right now every Docker registry
+    # uses sha256 so let's ignore that issue for the time being.
+    @staticmethod
+    def _digest_to_ref(digest):
+        if digest.startswith("sha256:"):
+            return digest[len("sha256:") :]
+        else:
+            method = digest.split(":")[0]
+            raise SourceError("Unsupported digest method: {}".format(method))
+
+    @staticmethod
+    def _ref_to_digest(ref):
+        return "sha256:" + ref
+
+    def configure(self, node):
+        # url is deprecated, but accept it as a valid key so that we can raise
+        # a more helpful error.
+        node.validate_keys(["registry-url", "image", "ref", "track", "url"] + Source.COMMON_CONFIG_KEYS)
+
+        if "url" in node:
+            raise SourceError(
+                "{}: 'url' parameter is now deprecated, " "use 'registry-url' and 'image' instead.".format(self)
+            )
+
+        self.image = node.get_str("image")
+        self.original_registry_url = node.get_str("registry-url", _DOCKER_HUB_URL)
+        self.registry_url = self.translate_url(self.original_registry_url)
+
+        if "ref" in node:
+            self.digest = self._ref_to_digest(node.get_str("ref"))
+        else:
+            self.digest = None
+        self.tag = node.get_str("track", "") or None
+
+        self.architecture = node.get_str("architecture", "") or default_architecture()
+        self.os = node.get_str("os", "") or default_os()
+
+        if not (self.digest or self.tag):
+            raise SourceError("{}: Must specify either 'ref' or 'track' parameters".format(self))
+
+        self.client = DockerRegistryV2Client(self.registry_url)
+
+        self.manifest = None
+
+    def preflight(self):
+        return
+
+    def get_unique_key(self):
+        return [self.original_registry_url, self.image, self.digest]
+
+    def get_ref(self):
+        return None if self.digest is None else self._digest_to_ref(self.digest)
+
+    def set_ref(self, ref, node):
+        node["ref"] = ref
+        self.digest = self._ref_to_digest(ref)
+
+    def track(self):
+        # pylint: disable=arguments-differ
+
+        # If the tracking ref is not specified it's not an error, just silently return
+        if not self.tag:
+            return None
+
+        with self.timed_activity(
+            "Fetching image manifest for image: '{}:{}' from: {}".format(self.image, self.tag, self.registry_url)
+        ):
+            try:
+                _, digest = self.client.manifest(self.image, self.tag)
+            except DockerManifestError as e:
+                self.log("Problem downloading manifest", detail=e.manifest)
+                raise
+            except (OSError, requests.RequestException) as e:
+                raise SourceError(e) from e
+
+        return self._digest_to_ref(digest)
+
+    def is_resolved(self):
+        return self.digest is not None
+
+    def is_cached(self):
+        mirror_dir = self.get_mirror_directory()
+        try:
+            manifest = self._load_manifest()
+
+            for layer in manifest["layers"]:
+                layer_digest = layer["digest"]
+                blob_path = os.path.join(mirror_dir, layer_digest + ".tar.gz")
+                try:
+                    self._verify_blob(blob_path, expected_digest=layer_digest)
+                except FileNotFoundError:
+                    # digest fetched, but some layer blob not fetched
+                    return False
+            return True
+        except (FileNotFoundError, SourceError):
+            return False
+
+    def _load_manifest(self):
+        manifest_file = os.path.join(self.get_mirror_directory(), self.digest + ".manifest.json")
+
+        with open(manifest_file, "rb") as f:
+            text = f.read()
+
+        real_digest = self.client.digest(text)
+        if real_digest != self.digest:
+            raise SourceError("Manifest {} is corrupt; got content hash of {}".format(manifest_file, real_digest))
+
+        return json.loads(text.decode("utf-8"))
+
+    def _save_manifest(self, text, path):
+        manifest_file = os.path.join(path, self.digest + ".manifest.json")
+        with save_file_atomic(manifest_file, "wb") as f:
+            f.write(text.encode("utf-8"))
+
+    @staticmethod
+    def _verify_blob(path, expected_digest):
+        blob_digest = "sha256:" + sha256sum(path)
+        if expected_digest != blob_digest:
+            raise SourceError("Blob {} is corrupt; got content hash of {}.".format(path, blob_digest))
+
+    def fetch(self):
+        # pylint: disable=arguments-differ
+
+        with self.timed_activity(
+            "Fetching image {}:{} with digest {}".format(self.image, self.tag, self.digest), silent_nested=True,
+        ):
+            with self.tempdir() as tmpdir:
+                # move all files to a tmpdir
+                try:
+                    manifest = self._load_manifest()
+                except FileNotFoundError as e:
+                    try:
+                        manifest_text, digest = self.client.manifest(self.image, self.digest)
+                    except requests.RequestException as ee:
+                        raise SourceError(ee) from ee
+
+                    if digest != self.digest:
+                        raise SourceError(
+                            "Requested image {}, got manifest with digest {}".format(self.digest, digest)
+                        ) from e
+                    self._save_manifest(manifest_text, tmpdir)
+                    manifest = json.loads(manifest_text)
+                except DockerManifestError as e:
+                    self.log("Unexpected manifest", detail=e.manifest)
+                    raise
+                except (OSError, requests.RequestException) as e:
+                    raise SourceError(e) from e
+
+                for layer in manifest["layers"]:
+                    if layer["mediaType"] != "application/vnd.docker.image.rootfs.diff.tar.gzip":
+                        raise SourceError("Unsupported layer type: {}".format(layer["mediaType"]))
+
+                    layer_digest = layer["digest"]
+                    blob_path = os.path.join(tmpdir, layer_digest + ".tar.gz")
+
+                    if not os.path.exists(blob_path):
+                        try:
+                            self.client.blob(self.image, layer_digest, download_to=blob_path)
+                        except (OSError, requests.RequestException) as e:
+                            if os.path.exists(blob_path):
+                                os.remove(blob_path)
+                            raise SourceError(e) from e
+
+                    self._verify_blob(blob_path, expected_digest=layer_digest)
+
+                # Only if all sources are successfully fetched, move files to staging directory
+
+                # As both the manifest and blobs are content addressable, we can optimize space by having
+                # a flat mirror directory. We check one-by-one if there is any need to copy a file out of the tmpdir.
+                for fetched_file in os.listdir(tmpdir):
+                    move_atomic(
+                        os.path.join(tmpdir, fetched_file), os.path.join(self.get_mirror_directory(), fetched_file),
+                    )
+
+    def stage(self, directory):
+        mirror_dir = self.get_mirror_directory()
+
+        try:
+            manifest = self._load_manifest()
+        except (OSError, SourceError) as e:
+            raise SourceError("Unable to load manifest: {}".format(e)) from e
+
+        try:
+            for layer in manifest["layers"]:
+                layer_digest = layer["digest"]
+                blob_path = os.path.join(mirror_dir, layer_digest + ".tar.gz")
+
+                self._verify_blob(blob_path, expected_digest=layer_digest)
+                (extract_fileset, white_out_fileset,) = self._get_extract_and_remove_files(blob_path)
+
+                # remove files associated with whiteouts
+                for white_out_file in white_out_fileset:
+                    white_out_file = os.path.join(directory, white_out_file)
+                    os.remove(white_out_file)
+
+                # extract files for the current layer
+                with tarfile.open(blob_path, tarinfo=ReadableTarInfo) as tar:
+                    with self.tempdir() as td:
+                        tar.extractall(path=td, members=extract_fileset)
+                        link_files(td, directory)
+
+        except (OSError, SourceError, tarfile.TarError) as e:
+            raise SourceError("{}: Error staging source: {}".format(self, e)) from e
+
+    @staticmethod
+    def _get_extract_and_remove_files(layer_tar_path):
+        """Return the set of files to remove and extract for a given layer
+
+        :param layer_tar_path: The path to the layer's tarball
+        :return: Tuple of filesets
+          - extract_fileset: files to extract into staging directory
+          - delete_fileset: files to remove from staging directory as the current layer
+            contains a whiteout corresponding to a staged file in the previous layers
+
+        """
+
+        def strip_wh(white_out_file):
+            """Strip the prefixing .wh. for given file
+
+            :param white_out_file: path of file
+            :return: path without white-out prefix
+            """
+            # whiteout files have the syntax of `*/.wh.*`
+            file_name = os.path.basename(white_out_file)
+            path = os.path.join(os.path.dirname(white_out_file), file_name.split(".wh.")[1])
+            return path
+
+        def is_regular_file(info):
+            """Check if file is a non-device file
+
+            :param info: tar member metadata
+            :return: if the file is a non-device file
+            """
+            return not (info.name.startswith("dev/") or info.isdev())
+
+        with tarfile.open(layer_tar_path) as tar:
+            extract_fileset = []
+            delete_fileset = []
+            for member in tar.getmembers():
+                if os.path.basename(member.name).startswith(".wh."):
+                    delete_fileset.append(strip_wh(member.name))
+                elif is_regular_file(member):
+                    extract_fileset.append(member)
+
+        return extract_fileset, delete_fileset
+
+
+# Plugin entry point
+def setup():
+    return DockerSource
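
As a companion to the manifest handling above, here is a minimal, self-contained
sketch of the digest check that DockerSource relies on: the manifest is fetched
verbatim, re-hashed with sha256, and compared against the registry's
Docker-Content-Digest header. The registry URL and image name below are
illustrative examples, not values required by the plugin, and authenticated
registries would additionally need the bearer-token exchange shown in _auth().

    import hashlib
    import requests

    def fetch_and_verify_manifest(registry_url, image, reference):
        # Request a v2 manifest, accepting the same media type as the plugin
        headers = {"Accept": "application/vnd.docker.distribution.manifest.v2+json"}
        url = "{}/v2/{}/manifests/{}".format(registry_url, image, reference)
        response = requests.get(url, headers=headers, timeout=3)
        response.raise_for_status()

        # Recompute the content digest from the verbatim manifest text; parsing
        # and re-serializing the JSON would change the bytes and break the check
        our_digest = "sha256:" + hashlib.sha256(response.text.encode("utf8")).hexdigest()
        their_digest = response.headers.get("Docker-Content-Digest")
        if their_digest and our_digest != their_digest:
            raise RuntimeError("Digest mismatch: {} != {}".format(our_digest, their_digest))
        return response.text, our_digest

    # e.g. fetch_and_verify_manifest("https://registry.hub.docker.com", "library/alpine", "latest")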


[buildstream-plugins] 04/49: Initially adding pip source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 8ab4718dd37355fcefc7b9590cbc7e3808cbdd5c
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 16:28:53 2022 +0900

    Initially adding pip source
    
    From bst-plugins-experimental
---
 src/buildstream_plugins/sources/pip.py | 262 +++++++++++++++++++++++++++++++++
 1 file changed, 262 insertions(+)

diff --git a/src/buildstream_plugins/sources/pip.py b/src/buildstream_plugins/sources/pip.py
new file mode 100644
index 0000000..21e26a0
--- /dev/null
+++ b/src/buildstream_plugins/sources/pip.py
@@ -0,0 +1,262 @@
+#
+#  Copyright 2018 Bloomberg Finance LP
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Chandan Singh <cs...@bloomberg.net>
+
+"""
+pip - stage python packages using pip
+=====================================
+
+**Host dependencies:**
+
+  * ``pip`` python module
+
+This plugin will download source distributions for specified packages using
+``pip`` but will not install them. It is expected that the elements using this
+source will install the downloaded packages.
+
+Downloaded tarballs will be stored in a directory called ".bst_pip_downloads".
+
+**Usage:**
+
+.. code:: yaml
+
+   # Specify the pip source kind
+   kind: pip
+
+   # Optionally specify the index URL, defaults to PyPI
+   # This URL is used to discover new versions of packages and download them
+   # Projects intending to mirror their sources to a permanent location should
+   # use an aliased url, and declare the alias in the project configuration
+   url: https://mypypi.example.com/simple
+
+   # Optionally specify the path to requirements files
+   # Note that either 'requirements-files' or 'packages' must be defined
+   requirements-files:
+   - requirements.txt
+
+   # Optionally specify a list of additional packages
+   # Note that either 'requirements-files' or 'packages' must be defined
+   packages:
+   - flake8
+
+   # Specify the ref. It is a list of strings of format
+   # "<package-name>==<version>", separated by "\\n".
+   # Usually this will be contents of a requirements.txt file where all
+   # package versions have been frozen.
+   ref: "flake8==3.5.0\\nmccabe==0.6.1\\npkg-resources==0.0.0\\npycodestyle==2.3.1\\npyflakes==1.6.0"
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.source.html#core-source-builtins>`_ for
+details on common configuration options for sources.
+"""
+
+import hashlib
+import os
+import re
+
+from buildstream import Source, SourceError, utils
+
+_OUTPUT_DIRNAME = ".bst_pip_downloads"
+_PYPI_INDEX_URL = "https://pypi.org/simple/"
+
+# Used only for finding pip command
+_PYTHON_VERSIONS = [
+    "python",  # when running in a venv, we might not have the exact version
+    "python2.7",
+    "python3.0",
+    "python3.1",
+    "python3.2",
+    "python3.3",
+    "python3.4",
+    "python3.5",
+    "python3.6",
+    "python3.7",
+    "python3.8",
+]
+
+# List of allowed extensions taken from
+# https://docs.python.org/3/distutils/sourcedist.html.
+# Names of source distribution archives must be of the form
+# '%{package-name}-%{version}.%{extension}'.
+_SDIST_RE = re.compile(r"^([\w.-]+?)-((?:[\d.]+){2,})\.(?:tar|tar.bz2|tar.gz|tar.xz|tar.Z|zip)$", re.IGNORECASE,)
+
+
+class PipSource(Source):
+    # pylint: disable=attribute-defined-outside-init
+    BST_MIN_VERSION = "2.0"
+
+    # We need access to previous sources at track time to use requirements.txt
+    # but not at fetch time as self.ref should contain sufficient information
+    # for this plugin
+    BST_REQUIRES_PREVIOUS_SOURCES_TRACK = True
+
+    def configure(self, node):
+        node.validate_keys(["url", "packages", "ref", "requirements-files"] + Source.COMMON_CONFIG_KEYS)
+        self.ref = node.get_str("ref", None)
+        self.original_url = node.get_str("url", _PYPI_INDEX_URL)
+        self.index_url = self.translate_url(self.original_url)
+        self.packages = node.get_str_list("packages", [])
+        self.requirements_files = node.get_str_list("requirements-files", [])
+
+        if not (self.packages or self.requirements_files):
+            raise SourceError("{}: Either 'packages' or 'requirements-files' must be specified".format(self))
+
+    def preflight(self):
+        # Try to find a pip version that supports the download command
+        self.host_pip = None
+        for python in reversed(_PYTHON_VERSIONS):
+            try:
+                host_python = utils.get_host_tool(python)
+                rc = self.call([host_python, "-m", "pip", "download", "--help"])
+                if rc == 0:
+                    self.host_pip = [host_python, "-m", "pip"]
+                    break
+            except utils.ProgramNotFoundError:
+                pass
+
+        if self.host_pip is None:
+            raise SourceError("{}: Unable to find a suitable pip command".format(self))
+
+    def get_unique_key(self):
+        return [self.original_url, self.ref]
+
+    def is_cached(self):
+        return os.path.exists(self._mirror) and os.listdir(self._mirror)
+
+    def get_ref(self):
+        return self.ref
+
+    def load_ref(self, node):
+        self.ref = node.get_str("ref", None)
+
+    def set_ref(self, ref, node):
+        node["ref"] = self.ref = ref
+
+    def track(self, previous_sources_dir):  # pylint: disable=arguments-differ
+        # XXX pip does not offer any public API other than the CLI tool so it
+        # is not feasible to correctly parse the requirements file or to check
+        # which package versions pip is going to install.
+        # See https://pip.pypa.io/en/stable/user_guide/#using-pip-from-your-program
+        # for details.
+        # As a result, we have to wastefully install the packages during track.
+        with self.tempdir() as tmpdir:
+            install_args = self.host_pip + [
+                "download",
+                "--no-binary",
+                ":all:",
+                "--index-url",
+                self.index_url,
+                "--dest",
+                tmpdir,
+            ]
+            for requirement_file in self.requirements_files:
+                fpath = os.path.join(previous_sources_dir, requirement_file)
+                install_args += ["-r", fpath]
+            install_args += self.packages
+
+            self.call(install_args, fail="Failed to install python packages")
+            reqs = self._parse_sdist_names(tmpdir)
+
+        return "\n".join(["{}=={}".format(pkg, ver) for pkg, ver in reqs])
+
+    def fetch(self):  # pylint: disable=arguments-differ
+        with self.tempdir() as tmpdir:
+            packages = self.ref.strip().split("\n")
+            package_dir = os.path.join(tmpdir, "packages")
+            os.makedirs(package_dir)
+            self.call(
+                [
+                    *self.host_pip,
+                    "download",
+                    "--no-binary",
+                    ":all:",
+                    "--index-url",
+                    self.index_url,
+                    "--dest",
+                    package_dir,
+                    *packages,
+                ],
+                fail="Failed to install python packages: {}".format(packages),
+            )
+
+            # If the mirror directory already exists, assume that some other
+            # process has fetched the sources before us and ensure that we do
+            # not raise an error in that case.
+            try:
+                utils.move_atomic(package_dir, self._mirror)
+            except utils.DirectoryExistsError:
+                # Another process has beaten us and has fetched the sources
+                # before us.
+                pass
+            except OSError as e:
+                raise SourceError(
+                    "{}: Failed to move downloaded pip packages from '{}' to '{}': {}".format(
+                        self, package_dir, self._mirror, e
+                    )
+                ) from e
+
+    def stage(self, directory):
+        with self.timed_activity("Staging Python packages", silent_nested=True):
+            utils.copy_files(self._mirror, os.path.join(directory, _OUTPUT_DIRNAME))
+
+    # Directory where this source should stage its files
+    #
+    @property
+    def _mirror(self):
+        if not self.ref:
+            return None
+        return os.path.join(
+            self.get_mirror_directory(),
+            utils.url_directory_name(self.original_url),
+            hashlib.sha256(self.ref.encode()).hexdigest(),
+        )
+
+    # Parse names of downloaded source distributions
+    #
+    # Args:
+    #    basedir (str): Directory containing source distribution archives
+    #
+    # Returns:
+    #    (list): List of (package_name, version) tuples in sorted order
+    #
+    def _parse_sdist_names(self, basedir):
+        reqs = []
+        for f in os.listdir(basedir):
+            pkg = _match_package_name(f)
+            if pkg is not None:
+                reqs.append(pkg)
+
+        return sorted(reqs)
+
+
+# Extract the package name and version of a source distribution
+#
+# Args:
+#    filename (str): Filename of the source distribution
+#
+# Returns:
+#    (tuple): A tuple of (package_name, version)
+#
+def _match_package_name(filename):
+    pkg_match = _SDIST_RE.match(filename)
+    if pkg_match is None:
+        return None
+    return pkg_match.groups()
+
+
+def setup():
+    return PipSource
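
To make the ref format concrete: the ref is simply the sorted, frozen package
list recovered from the filenames of the downloaded source distributions. Below
is a small standalone sketch of that step, reusing the same filename pattern as
_SDIST_RE above; the directory name in the usage comment is hypothetical.

    import os
    import re

    _SDIST_RE = re.compile(
        r"^([\w.-]+?)-((?:[\d.]+){2,})\.(?:tar|tar.bz2|tar.gz|tar.xz|tar.Z|zip)$", re.IGNORECASE,
    )

    def ref_from_downloads(basedir):
        # Collect (package, version) pairs from sdist archive names
        reqs = []
        for filename in os.listdir(basedir):
            match = _SDIST_RE.match(filename)
            if match is not None:
                reqs.append(match.groups())
        # Join into the "<package>==<version>" lines used as the ref
        return "\n".join("{}=={}".format(pkg, ver) for pkg, ver in sorted(reqs))

    # e.g. a directory containing "flake8-3.5.0.tar.gz" and "mccabe-0.6.1.tar.gz"
    # yields the ref "flake8==3.5.0\nmccabe==0.6.1"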


[buildstream-plugins] 35/49: tests/sources/pip_build.py: Adding test which builds using the pip source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 1a11f454e8ff81d62033b5d235327bac4c75eb31
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Mon Mar 21 17:46:42 2022 +0900

    tests/sources/pip_build.py: Adding test which builds using the pip source
---
 setup.cfg                                          |   2 +-
 tests/sources/pip-build/elements/base.bst          |   3 +
 .../pip-build/elements/base/alpine-image.bst       |   6 +
 tests/sources/pip-build/files/pip-source/app1.py   |  11 ++
 .../sources/pip-build/files/pip-source/myreqs.txt  |   1 +
 tests/sources/pip-build/project.conf               |  15 ++
 tests/sources/pip_build.py                         | 208 +++++++++++++++++++++
 7 files changed, 245 insertions(+), 1 deletion(-)

diff --git a/setup.cfg b/setup.cfg
index c324a67..2796004 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -3,7 +3,7 @@ test=pytest
 
 [tool:pytest]
 addopts = --verbose --basetemp ./tmp --durations=20
-norecursedirs = integration-cache tmp __pycache__ .eggs
+norecursedirs = tests/sources/pip-build integration-cache tmp __pycache__ .eggs
 python_files = tests/*.py
 markers =
     integration: run test only if --integration option is specified
diff --git a/tests/sources/pip-build/elements/base.bst b/tests/sources/pip-build/elements/base.bst
new file mode 100644
index 0000000..da7c70b
--- /dev/null
+++ b/tests/sources/pip-build/elements/base.bst
@@ -0,0 +1,3 @@
+kind: stack
+depends:
+- base/alpine-image.bst
diff --git a/tests/sources/pip-build/elements/base/alpine-image.bst b/tests/sources/pip-build/elements/base/alpine-image.bst
new file mode 100644
index 0000000..f8e00ba
--- /dev/null
+++ b/tests/sources/pip-build/elements/base/alpine-image.bst
@@ -0,0 +1,6 @@
+kind: import
+description: Import an alpine image as the platform
+sources:
+- kind: tar
+  url: alpine:integration-tests-base.v1.x86_64.tar.xz
+  ref: 3eb559250ba82b64a68d86d0636a6b127aa5f6d25d3601a79f79214dc9703639
diff --git a/tests/sources/pip-build/files/pip-source/app1.py b/tests/sources/pip-build/files/pip-source/app1.py
new file mode 100644
index 0000000..b96d14b
--- /dev/null
+++ b/tests/sources/pip-build/files/pip-source/app1.py
@@ -0,0 +1,11 @@
+#!/usr/bin/env python3
+
+from hellolib import hello
+
+
+def main():
+    hello("App1")
+
+
+if __name__ == "__main__":
+    main()
diff --git a/tests/sources/pip-build/files/pip-source/myreqs.txt b/tests/sources/pip-build/files/pip-source/myreqs.txt
new file mode 100644
index 0000000..c805aae
--- /dev/null
+++ b/tests/sources/pip-build/files/pip-source/myreqs.txt
@@ -0,0 +1 @@
+hellolib
diff --git a/tests/sources/pip-build/project.conf b/tests/sources/pip-build/project.conf
new file mode 100644
index 0000000..f04cd98
--- /dev/null
+++ b/tests/sources/pip-build/project.conf
@@ -0,0 +1,15 @@
+# test project config
+name: test
+min-version: 2.0
+
+element-path: elements
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  elements:
+  - pip
+
+aliases:
+  alpine: https://bst-integration-test-images.ams3.cdn.digitaloceanspaces.com/
+  project_dir: file://{project_dir}
diff --git a/tests/sources/pip_build.py b/tests/sources/pip_build.py
new file mode 100644
index 0000000..35333c9
--- /dev/null
+++ b/tests/sources/pip_build.py
@@ -0,0 +1,208 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import pytest
+
+from buildstream import _yaml
+
+from buildstream._testing import cli_integration as cli  # pylint: disable=unused-import
+from buildstream._testing.integration import assert_contains
+from buildstream._testing.integration import integration_cache  # pylint: disable=unused-import
+from buildstream._testing._utils.site import HAVE_SANDBOX
+
+from tests.testutils.python_repo import setup_pypi_repo  # pylint: disable=unused-import
+
+
+pytestmark = pytest.mark.integration
+
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "pip-build")
+
+
+@pytest.mark.datafiles(DATA_DIR)
+def test_pip_source_import_packages(cli, datafiles, setup_pypi_repo):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_path = os.path.join(project, "elements")
+    element_name = "pip/hello.bst"
+
+    # check that exotically named packages are imported correctly
+    myreqs_packages = "hellolib"
+    dependencies = [
+        "app2",
+        "app.3",
+        "app-4",
+        "app_5",
+        "app.no.6",
+        "app-no-7",
+        "app_no_8",
+    ]
+    mock_packages = {myreqs_packages: {package: {} for package in dependencies}}
+
+    # create mock pypi repository
+    pypi_repo = os.path.join(project, "files", "pypi-repo")
+    os.makedirs(pypi_repo, exist_ok=True)
+    setup_pypi_repo(mock_packages, pypi_repo)
+
+    element = {
+        "kind": "import",
+        "sources": [
+            {"kind": "local", "path": "files/pip-source"},
+            {"kind": "pip", "url": "file://{}".format(os.path.realpath(pypi_repo)), "packages": [myreqs_packages],},
+        ],
+    }
+    os.makedirs(
+        os.path.dirname(os.path.join(element_path, element_name)), exist_ok=True,
+    )
+    _yaml.roundtrip_dump(element, os.path.join(element_path, element_name))
+
+    result = cli.run(project=project, args=["source", "track", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    assert result.exit_code == 0
+
+    assert_contains(
+        checkout,
+        [
+            "/.bst_pip_downloads",
+            "/.bst_pip_downloads/hellolib-0.1.tar.gz",
+            "/.bst_pip_downloads/app2-0.1.tar.gz",
+            "/.bst_pip_downloads/app.3-0.1.tar.gz",
+            "/.bst_pip_downloads/app-4-0.1.tar.gz",
+            "/.bst_pip_downloads/app_5-0.1.tar.gz",
+            "/.bst_pip_downloads/app.no.6-0.1.tar.gz",
+            "/.bst_pip_downloads/app-no-7-0.1.tar.gz",
+            "/.bst_pip_downloads/app_no_8-0.1.tar.gz",
+        ],
+    )
+
+
+@pytest.mark.datafiles(DATA_DIR)
+def test_pip_source_import_requirements_files(cli, datafiles, setup_pypi_repo):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_path = os.path.join(project, "elements")
+    element_name = "pip/hello.bst"
+
+    # check that exotically named packages are imported correctly
+    myreqs_packages = "hellolib"
+    dependencies = [
+        "app2",
+        "app.3",
+        "app-4",
+        "app_5",
+        "app.no.6",
+        "app-no-7",
+        "app_no_8",
+    ]
+    mock_packages = {myreqs_packages: {package: {} for package in dependencies}}
+
+    # create mock pypi repository
+    pypi_repo = os.path.join(project, "files", "pypi-repo")
+    os.makedirs(pypi_repo, exist_ok=True)
+    setup_pypi_repo(mock_packages, pypi_repo)
+
+    element = {
+        "kind": "import",
+        "sources": [
+            {"kind": "local", "path": "files/pip-source"},
+            {
+                "kind": "pip",
+                "url": "file://{}".format(os.path.realpath(pypi_repo)),
+                "requirements-files": ["myreqs.txt"],
+            },
+        ],
+    }
+    os.makedirs(
+        os.path.dirname(os.path.join(element_path, element_name)), exist_ok=True,
+    )
+    _yaml.roundtrip_dump(element, os.path.join(element_path, element_name))
+
+    result = cli.run(project=project, args=["source", "track", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout],)
+    assert result.exit_code == 0
+
+    assert_contains(
+        checkout,
+        [
+            "/.bst_pip_downloads",
+            "/.bst_pip_downloads/hellolib-0.1.tar.gz",
+            "/.bst_pip_downloads/app2-0.1.tar.gz",
+            "/.bst_pip_downloads/app.3-0.1.tar.gz",
+            "/.bst_pip_downloads/app-4-0.1.tar.gz",
+            "/.bst_pip_downloads/app_5-0.1.tar.gz",
+            "/.bst_pip_downloads/app.no.6-0.1.tar.gz",
+            "/.bst_pip_downloads/app-no-7-0.1.tar.gz",
+            "/.bst_pip_downloads/app_no_8-0.1.tar.gz",
+        ],
+    )
+
+
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_pip_source_build(cli, datafiles, setup_pypi_repo):
+    project = str(datafiles)
+    element_path = os.path.join(project, "elements")
+    element_name = "pip/hello.bst"
+
+    # check that exotically named packages are imported correctly
+    myreqs_packages = "hellolib"
+    dependencies = [
+        "app2",
+        "app.3",
+        "app-4",
+        "app_5",
+        "app.no.6",
+        "app-no-7",
+        "app_no_8",
+    ]
+    mock_packages = {myreqs_packages: {package: {} for package in dependencies}}
+
+    # create mock pypi repository
+    pypi_repo = os.path.join(project, "files", "pypi-repo")
+    os.makedirs(pypi_repo, exist_ok=True)
+    setup_pypi_repo(mock_packages, pypi_repo)
+
+    element = {
+        "kind": "manual",
+        "depends": ["base.bst"],
+        "sources": [
+            {"kind": "local", "path": "files/pip-source"},
+            {
+                "kind": "pip",
+                "url": "file://{}".format(os.path.realpath(pypi_repo)),
+                "requirements-files": ["myreqs.txt"],
+                "packages": dependencies,
+            },
+        ],
+        "config": {
+            "install-commands": [
+                "pip3 install --no-index --prefix %{install-root}/usr .bst_pip_downloads/*.tar.gz",
+                "install app1.py %{install-root}/usr/bin/",
+            ]
+        },
+    }
+    os.makedirs(
+        os.path.dirname(os.path.join(element_path, element_name)), exist_ok=True,
+    )
+    _yaml.roundtrip_dump(element, os.path.join(element_path, element_name))
+
+    result = cli.run(project=project, args=["source", "track", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["shell", element_name, "/usr/bin/app1.py"])
+    assert result.exit_code == 0
+    assert result.output == "Hello App1! This is hellolib\n"


[buildstream-plugins] 29/49: tests/elements/autotools.py: Adding autotools element test

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 091af01090b6b6013dd44628af27dc7a31be9c84
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Thu Mar 24 15:29:45 2022 +0900

    tests/elements/autotools.py: Adding autotools element test
---
 tests/elements/autotools.py                        |  99 +++++++++++++++++++++
 tests/elements/autotools/elements/amhello.bst      |  10 +++
 .../autotools/elements/amhelloconfroot.bst         |  15 ++++
 tests/elements/autotools/elements/base.bst         |   3 +
 .../autotools/elements/base/alpine-image.bst       |   6 ++
 tests/elements/autotools/files/amhello.tar.gz      | Bin 0 -> 30555 bytes
 tests/elements/autotools/project.conf              |  15 ++++
 7 files changed, 148 insertions(+)

diff --git a/tests/elements/autotools.py b/tests/elements/autotools.py
new file mode 100644
index 0000000..2b4ce65
--- /dev/null
+++ b/tests/elements/autotools.py
@@ -0,0 +1,99 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import pytest
+
+from buildstream._testing.integration import integration_cache  # pylint: disable=unused-import
+from buildstream._testing import cli_integration as cli  # pylint: disable=unused-import
+from buildstream._testing.integration import assert_contains
+from buildstream._testing._utils.site import HAVE_SANDBOX
+
+
+pytestmark = pytest.mark.integration
+
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "autotools")
+
+
+# Test that an autotools build 'works' - we use the autotools sample
+# amhello project for this.
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_autotools_build(cli, datafiles):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_name = "amhello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout])
+    assert result.exit_code == 0
+
+    assert_contains(
+        checkout,
+        [
+            "/usr",
+            "/usr/lib",
+            "/usr/bin",
+            "/usr/share",
+            "/usr/bin/hello",
+            "/usr/share/doc",
+            "/usr/share/doc/amhello",
+            "/usr/share/doc/amhello/README",
+        ],
+    )
+
+    # Check the log
+    result = cli.run(project=project, args=["artifact", "log", element_name])
+    assert result.exit_code == 0
+    log = result.output
+
+    # Verify we get expected output exactly once
+    assert log.count("Making all in src") == 1
+
+
+# Test that an autotools build 'works' - we use the autotools sample
+# amhello project for this.
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_autotools_confroot_build(cli, datafiles):
+    project = str(datafiles)
+    checkout = os.path.join(cli.directory, "checkout")
+    element_name = "amhelloconfroot.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["artifact", "checkout", element_name, "--directory", checkout])
+    assert result.exit_code == 0
+
+    assert_contains(
+        checkout,
+        [
+            "/usr",
+            "/usr/lib",
+            "/usr/bin",
+            "/usr/share",
+            "/usr/bin/hello",
+            "/usr/share/doc",
+            "/usr/share/doc/amhello",
+            "/usr/share/doc/amhello/README",
+        ],
+    )
+
+
+# Test running an executable built with autotools
+@pytest.mark.datafiles(DATA_DIR)
+@pytest.mark.skipif(not HAVE_SANDBOX, reason="Only available with a functioning sandbox")
+def test_autotools_run(cli, datafiles):
+    project = str(datafiles)
+    element_name = "amhello.bst"
+
+    result = cli.run(project=project, args=["build", element_name])
+    assert result.exit_code == 0
+
+    result = cli.run(project=project, args=["shell", element_name, "/usr/bin/hello"])
+    assert result.exit_code == 0
+    assert result.output == "Hello World!\nThis is amhello 1.0.\n"
diff --git a/tests/elements/autotools/elements/amhello.bst b/tests/elements/autotools/elements/amhello.bst
new file mode 100644
index 0000000..ee3a029
--- /dev/null
+++ b/tests/elements/autotools/elements/amhello.bst
@@ -0,0 +1,10 @@
+kind: autotools
+description: Autotools test
+
+depends:
+- base.bst
+
+sources:
+- kind: tar
+  url: project_dir:/files/amhello.tar.gz
+  ref: 9ba123fa4e660929e9a0aa99f0c487b7eee59c5e7594f3284d015640b90f5590
diff --git a/tests/elements/autotools/elements/amhelloconfroot.bst b/tests/elements/autotools/elements/amhelloconfroot.bst
new file mode 100644
index 0000000..2892644
--- /dev/null
+++ b/tests/elements/autotools/elements/amhelloconfroot.bst
@@ -0,0 +1,15 @@
+kind: autotools
+description: Autotools test
+
+depends:
+- base.bst
+
+sources:
+- kind: tar
+  url: project_dir:/files/amhello.tar.gz
+  ref: 9ba123fa4e660929e9a0aa99f0c487b7eee59c5e7594f3284d015640b90f5590
+  directory: SourceFile
+
+variables:
+  conf-root: "%{build-root}/SourceFile"
+  command-subdir: build
diff --git a/tests/elements/autotools/elements/base.bst b/tests/elements/autotools/elements/base.bst
new file mode 100644
index 0000000..da7c70b
--- /dev/null
+++ b/tests/elements/autotools/elements/base.bst
@@ -0,0 +1,3 @@
+kind: stack
+depends:
+- base/alpine-image.bst
diff --git a/tests/elements/autotools/elements/base/alpine-image.bst b/tests/elements/autotools/elements/base/alpine-image.bst
new file mode 100644
index 0000000..f8e00ba
--- /dev/null
+++ b/tests/elements/autotools/elements/base/alpine-image.bst
@@ -0,0 +1,6 @@
+kind: import
+description: Import an alpine image as the platform
+sources:
+- kind: tar
+  url: alpine:integration-tests-base.v1.x86_64.tar.xz
+  ref: 3eb559250ba82b64a68d86d0636a6b127aa5f6d25d3601a79f79214dc9703639
diff --git a/tests/elements/autotools/files/amhello.tar.gz b/tests/elements/autotools/files/amhello.tar.gz
new file mode 100644
index 0000000..afe1899
Binary files /dev/null and b/tests/elements/autotools/files/amhello.tar.gz differ
diff --git a/tests/elements/autotools/project.conf b/tests/elements/autotools/project.conf
new file mode 100644
index 0000000..76ce388
--- /dev/null
+++ b/tests/elements/autotools/project.conf
@@ -0,0 +1,15 @@
+# test project config
+name: test
+min-version: 2.0
+
+element-path: elements
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  elements:
+  - autotools
+
+aliases:
+  alpine: https://bst-integration-test-images.ams3.cdn.digitaloceanspaces.com/
+  project_dir: file://{project_dir}


[buildstream-plugins] 17/49: MANIFEST.in: Adding initial manifest

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 98baba4bd8f0b3e546dadefd86bdc99f56e1d5c4
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Mar 19 14:48:39 2022 +0900

    MANIFEST.in: Adding initial manifest
---
 MANIFEST.in | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/MANIFEST.in b/MANIFEST.in
new file mode 100644
index 0000000..0ce98c8
--- /dev/null
+++ b/MANIFEST.in
@@ -0,0 +1,3 @@
+global-include *.yaml
+include requirements/*
+include project.conf


[buildstream-plugins] 20/49: setup.cfg: Adding initial setup.cfg

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 511ddc4d444b5ce3f03721c4b2614a7d0f930fb0
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Mar 19 15:45:02 2022 +0900

    setup.cfg: Adding initial setup.cfg
---
 setup.cfg | 12 ++++++++++++
 1 file changed, 12 insertions(+)

diff --git a/setup.cfg b/setup.cfg
new file mode 100644
index 0000000..c324a67
--- /dev/null
+++ b/setup.cfg
@@ -0,0 +1,12 @@
+[aliases]
+test=pytest
+
+[tool:pytest]
+addopts = --verbose --basetemp ./tmp --durations=20
+norecursedirs = integration-cache tmp __pycache__ .eggs
+python_files = tests/*.py
+markers =
+    integration: run test only if --integration option is specified
+    datafiles: share datafiles in tests
+env =
+    D:BST_TEST_SUITE=True
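
With this configuration, a plain pytest run skips the integration tests; per the
marker description above, they only run when the --integration option is passed
explicitly (the option itself is provided by the project's test fixtures, not by
pytest), for example: pytest --integration tests/elements/autotools.py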


[buildstream-plugins] 41/49: .github/worflows: Use ubuntu 18.04 instead of 20.04

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 2ca29ca42b1bdb83ae387a84badd31fd7a5ccfe4
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Thu Mar 31 16:07:52 2022 +0900

    .github/worflows: Use ubuntu 18.04 instead of 20.04
    
    This host VM/kernel apparently doesn't break when running buildstream.
---
 .github/workflows/ci.yml    | 4 ++--
 .github/workflows/merge.yml | 4 ++--
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
index 7017a26..4b78f21 100644
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -26,7 +26,7 @@ concurrency:
 
 jobs:
   tests:
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-18.04
     continue-on-error: ${{ matrix.allow-failure || false }}
 
     strategy:
@@ -59,7 +59,7 @@ jobs:
           ${GITHUB_WORKSPACE}/.github/run-ci.sh ${{ matrix.test-name }}
 
   docs:
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-18.04
     steps:
       - name: Check out repository
         uses: actions/checkout@v2
diff --git a/.github/workflows/merge.yml b/.github/workflows/merge.yml
index 8bb492e..0921a0f 100644
--- a/.github/workflows/merge.yml
+++ b/.github/workflows/merge.yml
@@ -8,7 +8,7 @@ on:
 jobs:
   build:
     name: Build documentation
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-18.04
     steps:
     - name: Checkout code
       uses: actions/checkout@v2
@@ -40,7 +40,7 @@ jobs:
 
   publish:
     needs: build
-    runs-on: ubuntu-20.04
+    runs-on: ubuntu-18.04
     steps:
 
     - name: Download artifact


[buildstream-plugins] 07/49: Initially adding git source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 2d439b596aa166ff32592c7a8225b7d01c1ce06e
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 18 16:52:27 2022 +0900

    Initially adding git source
    
    From buildstream core plugins - merged directly with GitSourceBase class,
    there is no need for this split anymore.
---
 src/buildstream_plugins/sources/git.py | 991 +++++++++++++++++++++++++++++++++
 1 file changed, 991 insertions(+)

diff --git a/src/buildstream_plugins/sources/git.py b/src/buildstream_plugins/sources/git.py
new file mode 100644
index 0000000..5d7603d
--- /dev/null
+++ b/src/buildstream_plugins/sources/git.py
@@ -0,0 +1,991 @@
+#
+#  Copyright (C) 2016 Codethink Limited
+#  Copyright (C) 2018 Bloomberg Finance LP
+#
+#  Licensed under the Apache License, Version 2.0 (the "License");
+#  you may not use this file except in compliance with the License.
+#  You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+#  Unless required by applicable law or agreed to in writing, software
+#  distributed under the License is distributed on an "AS IS" BASIS,
+#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+#  See the License for the specific language governing permissions and
+#  limitations under the License.
+#
+#  Authors:
+#        Tristan Van Berkom <tr...@codethink.co.uk>
+#        Chandan Singh <cs...@bloomberg.net>
+#        Tom Mewett <to...@codethink.co.uk>
+
+"""
+git - stage files from a git repository
+=======================================
+
+**Host dependencies:**
+
+  * git
+
+.. attention::
+
+    Note that this plugin **will check out git submodules by default**, even if
+    they are not specified in the `.bst` file.
+
+**Usage:**
+
+.. code:: yaml
+
+   # Specify the git source kind
+   kind: git
+
+   # Specify the repository url, using an alias defined
+   # in your project configuration is recommended.
+   url: upstream:foo.git
+
+   # Optionally specify a symbolic tracking branch or tag, this
+   # will be used to update the 'ref' when refreshing the pipeline.
+   track: master
+
+   # Optionally specify the ref format used for tracking.
+   # The default is 'sha1' for the raw commit hash.
+   # If you specify 'git-describe', the commit hash will be prefixed
+   # with the closest tag.
+   ref-format: sha1
+
+   # Specify the commit ref, this must be specified in order to
+   # check out sources and build, but can be automatically updated
+   # if the 'track' attribute was specified.
+   ref: d63cbb6fdc0bbdadc4a1b92284826a6d63a7ebcd
+
+   # Optionally specify whether submodules should be checked-out.
+   # This is done recursively, as with `git clone --recurse-submodules`.
+   # If not set, this will default to 'True'
+   checkout-submodules: True
+
+   # If your repository has submodules, explicitly specifying the
+   # url from which they are to be fetched allows you to easily
+   # rebuild the same sources from a different location. This is
+   # especially handy when used with project defined aliases which
+   # can be redefined at a later time.
+   # You may also explicitly specify whether to check out this
+   # submodule. If 'checkout' is set, it will control whether to
+   # check out that submodule and recurse into it. It defaults to the
+   # value of 'checkout-submodules'.
+   submodules:
+     plugins/bar:
+       url: upstream:bar.git
+       checkout: True
+     plugins/bar/quux:
+       checkout: False
+     plugins/baz:
+       url: upstream:baz.git
+       checkout: False
+
+   # Enable tag tracking.
+   #
+   # This causes the `tags` metadata to be populated automatically
+   # as a result of tracking the git source.
+   #
+   # By default this is 'False'.
+   #
+   track-tags: True
+
+   # If the list of tags below is set, then a lightweight dummy
+   # git repository will be staged along with the content at
+   # build time.
+   #
+   # This is useful for a growing number of modules which use
+   # `git describe` at build time in order to determine the version
+   # which will be encoded into the built software.
+   #
+   # The 'tags' below is considered as a part of the git source
+   # reference and will be stored in the 'project.refs' file if
+   # that has been selected as your project's ref-storage.
+   #
+   # Migration notes:
+   #
+   #   If you are upgrading from BuildStream 1.2, which used to
+   #   stage the entire repository by default, you will notice that
+   #   some modules which use `git describe` are broken, and will
+   #   need to enable this feature in order to fix them.
+   #
+   #   If you need to enable this feature without changing the
+   #   specific commit that you are building, then we recommend
+   #   the following migration steps for any git sources where
+   #   `git describe` is required:
+   #
+   #     o Enable the `track-tags` feature
+   #     o Set the `track` parameter to the desired commit sha which
+   #       the current `ref` points to
+   #     o Run `bst source track` for these elements; this will populate
+   #       the `tags` portion of the refs without changing the refs
+   #     o Afterwards, restore the `track` parameter to the branches which
+   #       you have previously been tracking.
+   #
+   tags:
+   - tag: lightweight-example
+     commit: 04ad0dc656cb7cc6feb781aa13bdbf1d67d0af78
+     annotated: false
+   - tag: annotated-example
+     commit: 10abe77fe8d77385d86f225b503d9185f4ef7f3a
+     annotated: true
+
+See `built-in functionality documentation
+<https://docs.buildstream.build/master/buildstream.source.html#core-source-builtins>`_ for
+details on common configuration options for sources.
+
+
+**Configurable Warnings:**
+
+This plugin provides the following
+`configurable warnings <https://docs.buildstream.build/master/format_project.html#configurable-warnings>`_:
+
+- ``git:inconsistent-submodule`` - A submodule present in the git repository's .gitmodules was never
+  added with `git submodule add`.
+
+- ``git:unlisted-submodule`` - A submodule is present in the git repository but was not specified in
+  the source configuration and was not disabled for checkout.
+
+- ``git:invalid-submodule`` - A submodule is specified in the source configuration but does not exist
+  in the repository.
+
+This plugin also utilises the following configurable
+`core warnings <https://docs.buildstream.build/master/buildstream.types.html#buildstream.types.CoreWarnings>`_:
+
+- `ref-not-in-track <https://docs.buildstream.build/master/buildstream.types.html#buildstream.types.CoreWarnings.REF_NOT_IN_TRACK>`_ -
+  The provided ref was not found in the provided track in the element's git repository.
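+
+For example, to make one of these warnings fatal, a project can list it
+under ``fatal-warnings`` in its ``project.conf`` (a sketch; merge with any
+existing ``fatal-warnings`` entries your project already declares):
+
+.. code:: yaml
+
+   # project.conf (excerpt)
+   fatal-warnings:
+   - git:unlisted-submodule
+   - ref-not-in-track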
+"""
+
+
+import os
+import re
+import shutil
+from io import StringIO
+from tempfile import TemporaryFile
+
+from configparser import RawConfigParser
+
+from buildstream import Source, SourceError, SourceFetcher
+from buildstream import CoreWarnings, FastEnum
+from buildstream import utils
+from buildstream.utils import DirectoryExistsError
+
+GIT_MODULES = ".gitmodules"
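+# Matches a git-describe style ref which points exactly at a tag (i.e. at
+# a distance of zero commits): 'tag' captures the tag name and 'commit'
+# captures the commit hash.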
+EXACT_TAG_PATTERN = r"(?P<tag>.*)-0-g(?P<commit>.*)"
+
+# Warnings
+WARN_INCONSISTENT_SUBMODULE = "inconsistent-submodule"
+WARN_UNLISTED_SUBMODULE = "unlisted-submodule"
+WARN_INVALID_SUBMODULE = "invalid-submodule"
+
+
+class _RefFormat(FastEnum):
+    SHA1 = "sha1"
+    GIT_DESCRIBE = "git-describe"
+
+
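+# Reduce a ref to the bare commit hash: a git-describe ref such as
+# "1.2.3-4-g<sha1>" becomes "<sha1>", while a plain sha1 ref passes
+# through unchanged.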
+def _strip_tag(rev):
+    return rev.split("-g")[-1]
+
+
+# This class represents a single Git repository. The Git source needs to account for
+# submodules, but we don't want to cache them all under the umbrella of the
+# superproject - so we use this class which caches them independently, according
+# to their URL. Instances keep a reference to their "parent" GitSource,
+# and if applicable, where in the superproject they are found.
+#
+# Args:
+#    source (GitSource): The parent source
+#    path (str): The relative location of the submodule in the superproject;
+#                the empty string for the superproject itself
+#    url (str): Where to clone the repo from
+#    ref (str): Specified 'ref' from the source configuration
+#    primary (bool): Whether this is the primary URL for the source
+#    tags (list): Tag configuration; see GitSource._load_tags
+#
+class GitMirror(SourceFetcher):
+    def __init__(self, source, path, url, ref, *, primary=False, tags=None):
+
+        super().__init__()
+        self.source = source
+        self.path = path
+        self.url = url
+        self.ref = ref
+        self.tags = tags or []
+        self.primary = primary
+        self.mirror = os.path.join(source.get_mirror_directory(), utils.url_directory_name(url))
+
+    # _ensure_repo():
+    #
+    # Ensures that the Git repository exists at the mirror location and is configured
+    # to fetch from the given URL
+    #
+    def _ensure_repo(self):
+        if not os.path.exists(self.mirror):
+            with self.source.tempdir() as tmpdir:
+                self.source.call(
+                    [self.source.host_git, "init", "--bare", tmpdir], fail="Failed to initialise repository",
+                )
+
+                try:
+                    utils.move_atomic(tmpdir, self.mirror)
+                except DirectoryExistsError:
+                    # Another process was quicker to download this repository.
+                    # Let's discard our own
+                    self.source.status("{}: Discarding duplicate repository".format(self.source))
+                except OSError as e:
+                    raise SourceError(
+                        "{}: Failed to move created repository from '{}' to '{}': {}".format(
+                            self.source, tmpdir, self.mirror, e
+                        )
+                    ) from e
+
+    def _fetch(self, url, fetch_all=False):
+        self._ensure_repo()
+
+        # Work out whether we can fetch a specific tag: are we given a ref which
+        # 1. is in git-describe format
+        # 2. refers to an exact tag (is "...-0-g...")
+        # 3. is available on the remote and tags the specified commit?
+        # And lastly: are we on a new-enough Git which allows cloning from our potentially shallow cache?
+        if fetch_all:
+            pass
+        # Fetching from a shallow-cloned repo was first supported in v1.9.0
+        elif not self.ref or (self.source.host_git_version is not None and self.source.host_git_version < (1, 9, 0)):
+            fetch_all = True
+        else:
+            m = re.match(EXACT_TAG_PATTERN, self.ref)
+            if m is None:
+                fetch_all = True
+            else:
+                tag = m.group("tag")
+                commit = m.group("commit")
+
+                if not self.remote_has_tag(url, tag, commit):
+                    self.source.status(
+                        "{}: {} is not advertised on {}. Fetching all Git refs".format(self.source, self.ref, url)
+                    )
+                    fetch_all = True
+                else:
+                    exit_code = self.source.call(
+                        [
+                            self.source.host_git,
+                            "fetch",
+                            "--depth=1",
+                            url,
+                            "+refs/tags/{tag}:refs/tags/{tag}".format(tag=tag),
+                        ],
+                        cwd=self.mirror,
+                    )
+                    if exit_code != 0:
+                        self.source.status(
+                            "{}: Failed to fetch tag '{}' from {}. Fetching all Git refs".format(self.source, tag, url)
+                        )
+                        fetch_all = True
+
+        if fetch_all:
+            self.source.call(
+                [
+                    self.source.host_git,
+                    "fetch",
+                    "--prune",
+                    url,
+                    "+refs/heads/*:refs/heads/*",
+                    "+refs/tags/*:refs/tags/*",
+                ],
+                fail="Failed to fetch from remote git repository: {}".format(url),
+                fail_temporarily=True,
+                cwd=self.mirror,
+            )
+
+    def fetch(self, alias_override=None):  # pylint: disable=arguments-differ
+        resolved_url = self.source.translate_url(self.url, alias_override=alias_override, primary=self.primary)
+
+        with self.source.timed_activity("Fetching from {}".format(resolved_url), silent_nested=True):
+            if not self.has_ref():
+                self._fetch(resolved_url)
+            self.assert_ref()
+
+    def has_ref(self):
+        if not self.ref:
+            return False
+
+        # If the mirror doesn't exist, we also don't have the ref
+        if not os.path.exists(self.mirror):
+            return False
+
+        # Check if the ref is really there
+        rc = self.source.call([self.source.host_git, "cat-file", "-t", self.ref], cwd=self.mirror)
+        return rc == 0
+
+    def assert_ref(self):
+        if not self.has_ref():
+            raise SourceError(
+                "{}: expected ref '{}' was not found in git repository: '{}'".format(self.source, self.ref, self.url)
+            )
+
+    # remote_has_tag():
+    #
+    # Args:
+    #     url (str)
+    #     tag (str)
+    #     commit (str)
+    #
+    # Returns:
+    #     (bool): Whether the remote at `url` has the tag `tag` attached to `commit`
+    #
+    def remote_has_tag(self, url, tag, commit):
+        _, ls_remote = self.source.check_output(
+            [self.source.host_git, "ls-remote", url],
+            cwd=self.mirror,
+            fail="Failed to list advertised remote refs from git repository {}".format(url),
+        )
+
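+        # A lightweight tag advertises the commit directly at
+        # refs/tags/<tag>, while an annotated tag advertises it via the
+        # peeled form "refs/tags/<tag>^{}"; accept either.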
+        line = "{commit}\trefs/tags/{tag}".format(commit=commit, tag=tag)
+        return line in ls_remote or line + "^{}" in ls_remote
+
+    # to_commit():
+    #
+    # Args:
+    #     rev (str): A Git "commit-ish" rev
+    #
+    # Returns:
+    #     (str): The full revision ID of the commit
+    #
+    def to_commit(self, rev):
+        _, output = self.source.check_output(
+            [self.source.host_git, "rev-list", "-n", "1", rev],
+            fail="Unable to find revision {}".format(rev),
+            cwd=self.mirror,
+        )
+
+        return output.strip()
+
+    # describe():
+    #
+    # Args:
+    #     rev (str): A Git "commit-ish" rev
+    #
+    # Returns:
+    #     (str): The full revision ID of the commit given by rev, prepended
+    #            with tag information as given by git-describe (where available)
+    #
+    def describe(self, rev):
+        _, output = self.source.check_output(
+            [self.source.host_git, "describe", "--tags", "--abbrev=40", "--long", "--always", rev,],
+            fail="Unable to find revision {}".format(rev),
+            cwd=self.mirror,
+        )
+
+        return output.strip()
+
+    # reachable_tags():
+    #
+    # Args:
+    #     rev (str): A Git "commit-ish" rev
+    #
+    # Returns:
+    #     (list): A list of tags in the ancestry of rev. Each entry is a triple of the form
+    #             (tag name (str), commit ref (str), annotated (bool)) describing a tag,
+    #             its tagged commit and whether it's annotated
+    #
+    def reachable_tags(self, rev):
+        tags = set()
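+        # By default git-describe only considers annotated tags; '--tags'
+        # also includes lightweight tags, and '--first-parent' restricts the
+        # walk to first-parent history. Together these combinations surface
+        # the nearest reachable tag of each kind.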
+        for options in [
+            [],
+            ["--first-parent"],
+            ["--tags"],
+            ["--tags", "--first-parent"],
+        ]:
+            exit_code, output = self.source.check_output(
+                [self.source.host_git, "describe", "--abbrev=0", rev, *options,], cwd=self.mirror,
+            )
+            if exit_code == 0:
+                tag = output.strip()
+                _, commit_ref = self.source.check_output(
+                    [self.source.host_git, "rev-parse", tag + "^{commit}"],
+                    fail="Unable to resolve tag '{}'".format(tag),
+                    cwd=self.mirror,
+                )
+                exit_code = self.source.call([self.source.host_git, "cat-file", "tag", tag], cwd=self.mirror,)
+                annotated = exit_code == 0
+
+                tags.add((tag, commit_ref.strip(), annotated))
+
+        return list(tags)
+
+    def stage(self, directory):
+        fullpath = os.path.join(directory, self.path)
+
+        # Using --shared here avoids copying the objects into the checkout, in any
+        # case we're just checking out a specific commit and then removing the .git/
+        # directory.
+        self.source.call(
+            [self.source.host_git, "clone", "--no-checkout", "--shared", self.mirror, fullpath,],
+            fail="Failed to create git mirror {} in directory: {}".format(self.mirror, fullpath),
+            fail_temporarily=True,
+        )
+
+        self.source.call(
+            [self.source.host_git, "checkout", "--force", self.ref],
+            fail="Failed to checkout git ref {}".format(self.ref),
+            cwd=fullpath,
+        )
+
+        # Remove .git dir
+        shutil.rmtree(os.path.join(fullpath, ".git"))
+
+        self._rebuild_git(fullpath)
+
+    def init_workspace(self, directory):
+        fullpath = os.path.join(directory, self.path)
+        url = self.source.translate_url(self.url)
+
+        self.source.call(
+            [self.source.host_git, "clone", "--no-checkout", self.mirror, fullpath,],
+            fail="Failed to clone git mirror {} in directory: {}".format(self.mirror, fullpath),
+            fail_temporarily=True,
+        )
+
+        self.source.call(
+            [self.source.host_git, "remote", "set-url", "origin", url],
+            fail='Failed to add remote origin "{}"'.format(url),
+            cwd=fullpath,
+        )
+
+        self.source.call(
+            [self.source.host_git, "checkout", "--force", self.ref],
+            fail="Failed to checkout git ref {}".format(self.ref),
+            cwd=fullpath,
+        )
+
+    # get_submodule_mirrors():
+    #
+    # Returns:
+    #     An iterator through new instances of this class, one of each submodule
+    #     in the repo
+    #
+    def get_submodule_mirrors(self):
+        for path, url in self.submodule_list():
+            ref = self.submodule_ref(path)
+            if ref is not None:
+                mirror = self.__class__(self.source, os.path.join(self.path, path), url, ref)
+                yield mirror
+
+    # List the submodules (path/url tuples) present at the given ref of this repo
+    def submodule_list(self):
+        modules = "{}:{}".format(self.ref, GIT_MODULES)
+        exit_code, output = self.source.check_output([self.source.host_git, "show", modules], cwd=self.mirror)
+
+        # If git show reports error code 128 here, we take it to mean there is
+        # no .gitmodules file to display for the given revision.
+        if exit_code == 128:
+            return
+        elif exit_code != 0:
+            raise SourceError("{plugin}: Failed to show gitmodules at ref {ref}".format(plugin=self, ref=self.ref))
+
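+        # Strip indentation so that RawConfigParser does not treat the
+        # indented 'path' and 'url' keys as line continuations.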
+        content = "\n".join([l.strip() for l in output.splitlines()])
+
+        io = StringIO(content)
+        parser = RawConfigParser()
+        parser.read_file(io)
+
+        for section in parser.sections():
+            # validate section name against the 'submodule "foo"' pattern
+            if re.match(r'submodule "(.*)"', section):
+                path = parser.get(section, "path")
+                url = parser.get(section, "url")
+
+                yield (path, url)
+
+    # Fetch the ref which this mirror requires its submodule to have,
+    # at the given ref of this mirror.
+    def submodule_ref(self, submodule, ref=None):
+        if not ref:
+            ref = self.ref
+
+        # list objects in the parent repo tree to find the commit
+        # object that corresponds to the submodule
+        _, output = self.source.check_output(
+            [self.source.host_git, "ls-tree", ref, submodule],
+            fail="ls-tree failed for commit {} and submodule: {}".format(ref, submodule),
+            cwd=self.mirror,
+        )
+
+        # read the commit hash from the output
+        fields = output.split()
+        if len(fields) >= 2 and fields[1] == "commit":
+            submodule_commit = fields[2]
+
+            # fail if the commit hash is invalid
+            if len(submodule_commit) != 40:
+                raise SourceError(
+                    "{}: Error reading commit information for submodule '{}'".format(self.source, submodule)
+                )
+
+            return submodule_commit
+
+        else:
+            detail = (
+                "The submodule '{}' is defined either in the BuildStream source\n".format(submodule)
+                + "definition, or in a .gitmodules file. But the submodule was never added to the\n"
+                + "underlying git repository with `git submodule add`."
+            )
+
+            self.source.warn(
+                "{}: Ignoring inconsistent submodule '{}'".format(self.source, submodule),
+                detail=detail,
+                warning_token=WARN_INCONSISTENT_SUBMODULE,
+            )
+
+            return None
+
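+    # _rebuild_git():
+    #
+    # Stage a minimal, shallow .git directory at fullpath, containing just
+    # enough commit and tag objects for `git describe` to work at build
+    # time according to the configured 'tags'.
+    #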
+    def _rebuild_git(self, fullpath):
+        if not self.tags:
+            return
+
+        with self.source.tempdir() as tmpdir:
+            included = set()
+            shallow = set()
+            for _, commit_ref, _ in self.tags:
+
+                if commit_ref == self.ref:
+                    # rev-list does not work in case of same rev
+                    shallow.add(self.ref)
+                else:
+                    _, out = self.source.check_output(
+                        [
+                            self.source.host_git,
+                            "rev-list",
+                            "--ancestry-path",
+                            "--boundary",
+                            "{}..{}".format(commit_ref, self.ref),
+                        ],
+                        fail="Failed to get git history {}..{} in directory: {}".format(
+                            commit_ref, self.ref, fullpath
+                        ),
+                        fail_temporarily=True,
+                        cwd=self.mirror,
+                    )
+                    self.source.warn("refs {}..{}: {}".format(commit_ref, self.ref, out.splitlines()))
+                    for line in out.splitlines():
+                        rev = line.lstrip("-")
+                        if line[0] == "-":
+                            shallow.add(rev)
+                        else:
+                            included.add(rev)
+
+            shallow -= included
+            included |= shallow
+
+            self.source.call(
+                [self.source.host_git, "init"],
+                fail="Cannot initialize git repository: {}".format(fullpath),
+                cwd=fullpath,
+            )
+
+            for rev in included:
+                with TemporaryFile(dir=tmpdir) as commit_file:
+                    self.source.call(
+                        [self.source.host_git, "cat-file", "commit", rev],
+                        stdout=commit_file,
+                        fail="Failed to get commit {}".format(rev),
+                        cwd=self.mirror,
+                    )
+                    commit_file.seek(0, 0)
+                    self.source.call(
+                        [self.source.host_git, "hash-object", "-w", "-t", "commit", "--stdin",],
+                        stdin=commit_file,
+                        fail="Failed to add commit object {}".format(rev),
+                        cwd=fullpath,
+                    )
+
+            with open(os.path.join(fullpath, ".git", "shallow"), "w", encoding="utf-8",) as shallow_file:
+                for rev in shallow:
+                    shallow_file.write("{}\n".format(rev))
+
+            for tag, commit_ref, annotated in self.tags:
+                if annotated:
+                    with TemporaryFile(dir=tmpdir) as tag_file:
+                        tag_data = "object {}\ntype commit\ntag {}\n".format(commit_ref, tag)
+                        tag_file.write(tag_data.encode("ascii"))
+                        tag_file.seek(0, 0)
+                        _, tag_ref = self.source.check_output(
+                            [self.source.host_git, "hash-object", "-w", "-t", "tag", "--stdin",],
+                            stdin=tag_file,
+                            fail="Failed to add tag object {}".format(tag),
+                            cwd=fullpath,
+                        )
+
+                    self.source.call(
+                        [self.source.host_git, "tag", tag, tag_ref.strip()],
+                        fail="Failed to tag: {}".format(tag),
+                        cwd=fullpath,
+                    )
+                else:
+                    self.source.call(
+                        [self.source.host_git, "tag", tag, commit_ref],
+                        fail="Failed to tag: {}".format(tag),
+                        cwd=fullpath,
+                    )
+
+            with open(os.path.join(fullpath, ".git", "HEAD"), "w", encoding="utf-8") as head:
+                self.source.call(
+                    [self.source.host_git, "rev-parse", self.ref],
+                    stdout=head,
+                    fail="Failed to parse commit {}".format(self.ref),
+                    cwd=self.mirror,
+                )
+
+
+class GitSource(Source):
+    # pylint: disable=attribute-defined-outside-init
+
+    BST_MIN_VERSION = "2.0"
+
+    def configure(self, node):
+        ref = node.get_str("ref", None)
+
+        config_keys = [
+            "url",
+            "track",
+            "ref",
+            "submodules",
+            "checkout-submodules",
+            "ref-format",
+            "track-tags",
+            "tags",
+        ]
+        node.validate_keys(config_keys + Source.COMMON_CONFIG_KEYS)
+
+        tags_node = node.get_sequence("tags", [])
+        for tag_node in tags_node:
+            tag_node.validate_keys(["tag", "commit", "annotated"])
+
+        tags = self._load_tags(node)
+        self.track_tags = node.get_bool("track-tags", default=False)
+
+        self.original_url = node.get_str("url")
+        self.mirror = GitMirror(self, "", self.original_url, ref, tags=tags, primary=True)
+        self.tracking = node.get_str("track", None)
+
+        self.ref_format = node.get_enum("ref-format", _RefFormat, _RefFormat.SHA1)
+
+        # At this point we now know if the source has a ref and/or a track.
+        # If it is missing both then we will be unable to track or build.
+        if self.mirror.ref is None and self.tracking is None:
+            raise SourceError(
+                "{}: Git sources require a ref and/or track".format(self), reason="missing-track-and-ref",
+            )
+
+        self.checkout_submodules = node.get_bool("checkout-submodules", default=True)
+
+        # Parse a dict of submodule overrides, stored in the submodule_overrides
+        # and submodule_checkout_overrides dictionaries.
+        self.submodule_overrides = {}
+        self.submodule_checkout_overrides = {}
+        modules = node.get_mapping("submodules", {})
+        for path in modules.keys():
+            submodule = modules.get_mapping(path)
+            url = submodule.get_str("url", None)
+
+            # Make sure to mark all URLs that are specified in the configuration
+            if url:
+                self.mark_download_url(url, primary=False)
+
+            self.submodule_overrides[path] = url
+            if "checkout" in submodule:
+                checkout = submodule.get_bool("checkout")
+                self.submodule_checkout_overrides[path] = checkout
+
+        self.mark_download_url(self.original_url)
+
+    def preflight(self):
+        # Check if git is installed, get the binary at the same time
+        self.host_git = utils.get_host_tool("git")
+
+        rc, version_str = self.check_output([self.host_git, "--version"])
+        # e.g. on Git for Windows we get "git version 2.21.0.windows.1".
+        # e.g. on Mac via Homebrew we get "git version 2.19.0".
+        if rc == 0:
+            self.host_git_version = tuple(int(x) for x in version_str.split(" ")[2].split(".")[:3])
+        else:
+            self.host_git_version = None
+
+    def get_unique_key(self):
+        ref = self.mirror.ref
+        if ref is not None:
+            # Strip any (arbitrary) tag information, leaving just the commit ID
+            ref = _strip_tag(ref)
+
+        # Here we want to encode the local name of the repository and
+        # the ref, if the user changes the alias to fetch the same sources
+        # from another location, it should not affect the cache key.
+        key = [self.original_url, ref]
+        if self.mirror.tags:
+            tags = {tag: (commit, annotated) for tag, commit, annotated in self.mirror.tags}
+            key.append({"tags": tags})
+
+        # Only modify the cache key with checkout_submodules if it's something
+        # other than the default behaviour.
+        if self.checkout_submodules is False:
+            key.append({"checkout_submodules": self.checkout_submodules})
+
+        # We want the cache key to change if the source was
+        # configured differently, and submodules count.
+        if self.submodule_overrides:
+            key.append(self.submodule_overrides)
+
+        if self.submodule_checkout_overrides:
+            key.append({"submodule_checkout_overrides": self.submodule_checkout_overrides})
+
+        return key
+
+    def is_resolved(self):
+        return self.mirror.ref is not None
+
+    def is_cached(self):
+        return self._have_all_refs()
+
+    def load_ref(self, node):
+        self.mirror.ref = node.get_str("ref", None)
+        self.mirror.tags = self._load_tags(node)
+
+    def get_ref(self):
+        if self.mirror.ref is None:
+            return None
+        return self.mirror.ref, self.mirror.tags
+
+    def set_ref(self, ref, node):
+        if not ref:
+            self.mirror.ref = None
+            if "ref" in node:
+                del node["ref"]
+            self.mirror.tags = []
+            if "tags" in node:
+                del node["tags"]
+        else:
+            actual_ref, tags = ref
+            node["ref"] = self.mirror.ref = actual_ref
+            self.mirror.tags = tags
+            if tags:
+                node["tags"] = []
+                for tag, commit_ref, annotated in tags:
+                    data = {
+                        "tag": tag,
+                        "commit": commit_ref,
+                        "annotated": annotated,
+                    }
+                    node["tags"].append(data)
+            else:
+                if "tags" in node:
+                    del node["tags"]
+
+    def track(self):  # pylint: disable=arguments-differ
+
+        # If self.tracking is not specified it's not an error, just silently return
+        if not self.tracking:
+            # Is there a better way to check if a ref is given?
+            if self.mirror.ref is None:
+                detail = "Without a tracking branch, the ref cannot be updated. Please provide a ref or a track."
+                raise SourceError(
+                    "{}: No track or ref".format(self), detail=detail, reason="track-attempt-no-track",
+                )
+            return None
+
+        # Resolve the URL for the message
+        resolved_url = self.translate_url(self.mirror.url)
+        with self.timed_activity(
+            "Tracking {} from {}".format(self.tracking, resolved_url), silent_nested=True,
+        ):
+            self.mirror._fetch(resolved_url, fetch_all=True)
+
+            ref = self.mirror.to_commit(self.tracking)
+            tags = self.mirror.reachable_tags(ref) if self.track_tags else []
+
+            if self.ref_format == _RefFormat.GIT_DESCRIBE:
+                ref = self.mirror.describe(ref)
+
+            return ref, tags
+
+    def init_workspace(self, directory):
+        with self.timed_activity('Setting up workspace "{}"'.format(directory), silent_nested=True):
+            self.mirror.init_workspace(directory)
+            for mirror in self._recurse_submodules(configure=True):
+                mirror.init_workspace(directory)
+
+    def stage(self, directory):
+        # Stage the main repo in the specified directory
+        #
+        with self.timed_activity("Staging {}".format(self.mirror.url), silent_nested=True):
+            self.mirror.stage(directory)
+            for mirror in self._recurse_submodules(configure=True):
+                mirror.stage(directory)
+
+    def get_source_fetchers(self):
+        self.mirror.mark_download_url(self.mirror.url)
+        yield self.mirror
+        # _recurse_submodules only iterates those which are known at the current
+        # cached state - but fetch is called on each result as we go, so this will
+        # yield all configured submodules
+        for submodule in self._recurse_submodules(configure=True):
+            submodule.mark_download_url(submodule.url)
+            yield submodule
+
+    def validate_cache(self):
+        discovered_submodules = {}
+        unlisted_submodules = []
+        invalid_submodules = []
+
+        for submodule in self._recurse_submodules(configure=False):
+            discovered_submodules[submodule.path] = submodule.url
+            if self._ignoring_submodule(submodule.path):
+                continue
+
+            if submodule.path not in self.submodule_overrides:
+                unlisted_submodules.append((submodule.path, submodule.url))
+
+        # Warn about submodules which are explicitly configured but do not exist
+        for path, url in self.submodule_overrides.items():
+            if path not in discovered_submodules:
+                invalid_submodules.append((path, url))
+
+        if invalid_submodules:
+            detail = []
+            for path, url in invalid_submodules:
+                detail.append("  Submodule URL '{}' at path '{}'".format(url, path))
+
+            self.warn(
+                "{}: Invalid submodules specified".format(self),
+                warning_token=WARN_INVALID_SUBMODULE,
+                detail="The following submodules are specified in the source "
+                "description but do not exist according to the repository\n\n" + "\n".join(detail),
+            )
+
+        # Warn about submodules which exist but have not been explicitly configured
+        if unlisted_submodules:
+            detail = []
+            for path, url in unlisted_submodules:
+                detail.append("  Submodule URL '{}' at path '{}'".format(url, path))
+
+            self.warn(
+                "{}: Unlisted submodules exist".format(self),
+                warning_token=WARN_UNLISTED_SUBMODULE,
+                detail="The following submodules exist but are not specified "
+                + "in the source description\n\n"
+                + "\n".join(detail),
+            )
+
+        # Assert that the ref exists in the track tag/branch, if track has been specified.
+        # Also don't do this check if an exact tag ref is given, as we probably didn't fetch
+        # any branch refs
+        ref_in_track = False
+        if not re.match(EXACT_TAG_PATTERN, self.mirror.ref) and self.tracking:
+            _, branch = self.check_output(
+                [self.host_git, "branch", "--list", self.tracking, "--contains", self.mirror.ref,],
+                cwd=self.mirror.mirror,
+            )
+            if branch:
+                ref_in_track = True
+            else:
+                _, tag = self.check_output(
+                    [self.host_git, "tag", "--list", self.tracking, "--contains", self.mirror.ref,],
+                    cwd=self.mirror.mirror,
+                )
+                if tag:
+                    ref_in_track = True
+
+            if not ref_in_track:
+                detail = (
+                    "The ref provided for the element does not exist locally "
+                    + "in the provided track branch / tag '{}'.\n".format(self.tracking)
+                    + "You may wish to track the element to update the ref from '{}' ".format(self.tracking)
+                    + "with `bst source track`,\n"
+                    + "or examine the upstream at '{}' for the specific ref.".format(self.mirror.url)
+                )
+
+                self.warn(
+                    "{}: expected ref '{}' was not found in given track '{}' for staged repository: '{}'\n".format(
+                        self, self.mirror.ref, self.tracking, self.mirror.url
+                    ),
+                    detail=detail,
+                    warning_token=CoreWarnings.REF_NOT_IN_TRACK,
+                )
+
+    ###########################################################
+    #                     Local Functions                     #
+    ###########################################################
+
+    def _have_all_refs(self):
+        return self.mirror.has_ref() and all(
+            submodule.has_ref() for submodule in self._recurse_submodules(configure=True)
+        )
+
+    # _configure_submodules():
+    #
+    # Args:
+    #     submodules: An iterator of GitMirror (or similar) objects for submodules
+    #
+    # Returns:
+    #     An iterator through `submodules` but filtered of any ignored submodules
+    #     and modified to use any custom URLs configured in the source
+    #
+    def _configure_submodules(self, submodules):
+        for submodule in submodules:
+            if self._ignoring_submodule(submodule.path):
+                continue
+            # Allow configuration to override the upstream location of the submodules.
+            submodule.url = self.submodule_overrides.get(submodule.path, submodule.url)
+            yield submodule
+
+    # _recurse_submodules():
+    #
+    # Recursively iterates through GitMirrors for submodules of the main repo. Only
+    # submodules that are cached are recursed into - but this is decided at
+    # iteration time, so you can fetch in a for loop over this function to fetch
+    # all submodules.
+    #
+    # Args:
+    #     configure (bool): Whether to apply the 'submodule' config while recursing
+    #                       (URL changing and 'checkout' overrides)
+    #
+    def _recurse_submodules(self, configure):
+        def recurse(mirror):
+            submodules = mirror.get_submodule_mirrors()
+            if configure:
+                submodules = self._configure_submodules(submodules)
+
+            for submodule in submodules:
+                yield submodule
+                if submodule.has_ref():
+                    yield from recurse(submodule)
+
+        yield from recurse(self.mirror)
+
+    def _load_tags(self, node):
+        tags = []
+        tags_node = node.get_sequence("tags", [])
+        for tag_node in tags_node:
+            tag = tag_node.get_str("tag")
+            commit_ref = tag_node.get_str("commit")
+            annotated = tag_node.get_bool("annotated")
+            tags.append((tag, commit_ref, annotated))
+        return tags
+
+    # _ignoring_submodule():
+    #
+    # Args:
+    #     path (str): The path of a submodule in the superproject
+    #
+    # Returns:
+    #     (bool): Whether to not clone/checkout this submodule
+    #
+    def _ignoring_submodule(self, path):
+        return not self.submodule_checkout_overrides.get(path, self.checkout_submodules)
+
+
+# Plugin entry point
+def setup():
+    return GitSource


[buildstream-plugins] 38/49: .github/CODEOWNERS: Adding CODEOWNERS file

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 09adcfcfbc9520e3334c1a9b97d0d937f9d90417
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Fri Mar 25 14:13:08 2022 +0900

    .github/CODEOWNERS: Adding CODEOWNERS file
---
 .github/CODEOWNERS | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS
new file mode 100644
index 0000000..dfc7c85
--- /dev/null
+++ b/.github/CODEOWNERS
@@ -0,0 +1,6 @@
+# Each line is a file pattern followed by one or more owners.
+
+# These owners will be the default owners for everything in
+# the repo, unless a later match takes precedence.
+#
+*       @gtristan @juergbi @BenjaminSchubert @cs-shadow @abderrahim


[buildstream-plugins] 36/49: doc: Adding docs build.

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit c74456f7cb9389414378f73371f5773b660ceaa8
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Thu Mar 24 14:26:22 2022 +0900

    doc: Adding docs build.
    
    Now `tox -e docs` properly generates the documentation.
---
 doc/Makefile                  |  97 ++++++++++++
 doc/source/conf.py            | 345 ++++++++++++++++++++++++++++++++++++++++++
 doc/source/index.rst          |  31 ++++
 doc/source/plugin.rsttemplate |   4 +
 doc/sphinx-build3             |  15 ++
 5 files changed, 492 insertions(+)

diff --git a/doc/Makefile b/doc/Makefile
new file mode 100644
index 0000000..eae1340
--- /dev/null
+++ b/doc/Makefile
@@ -0,0 +1,97 @@
+# Makefile for Sphinx documentation
+#
+
+# You can set these variables from the command line.
+SPHINXOPTS    = -W
+SPHINXBUILD   = sphinx-build
+PAPER         =
+BUILDDIR      = build
+
+# Internal variables.
+PAPEROPT_a4     = -D latex_paper_size=a4
+PAPEROPT_letter = -D latex_paper_size=letter
+ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
+# the i18n builder cannot share the environment and doctrees with the others
+I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
+
+# Fix for when python is mapped to python2 not python3
+# This is an issue in the sphinx-build script provided in the default install
+# because it uses the generic python env, so we need to have a copy of this script
+# but with an explicit call to python3.
+#
+# Why python3? We are using some features of sphinx that are only implemented
+# currently in python3
+#
+PYV=$(shell python -c "import sys;t='{v[0]}'.format(v=list(sys.version_info[:2]));sys.stdout.write(t)")
+
+ifeq ($(PYV), 2)
+	SPHINXBUILD = ./sphinx-build3
+endif
+
+.PHONY: all clean templates html devhelp
+
+# Canned recipe for generating plugin api skeletons
+#   $1 = the plugin directory
+#   $2 = the output docs directory
+#
+# Explanation:
+#
+#   Sphinx does not have any option for skipping documentation,
+#   we don't want to document plugin code because nobody uses that
+#   but we do want to use module-level docstrings in plugins in
+#   order to explain how plugins work.
+#
+#   For this purpose, we replace sphinx-apidoc with a simple
+#   makefile rule which generates a template slightly differently
+#   from how sphinx does it, allowing us to get what we want
+#   from plugin documentation.
+#
+define plugin-doc-skeleton
+    @for file in $$(find ${1}/${2} -name "*.py" ! -name "_*"); do \
+        base=$$(basename $$file);                                   \
+        module=${2}.$${base%.py};                                        \
+        modname=$${base%.py};                                       \
+        echo -n "Generating source/${2}/$${modname}.rst... ";       \
+        sed -e "s|@@MODULENAME@@|$${modname}|g"                     \
+            -e "s|@@MODULE@@|$${module}|g"                          \
+            source/plugin.rsttemplate >                             \
+            source/${2}/$${modname}.rst.tmp &&                      \
+            mv source/${2}/$${modname}.rst.tmp source/${2}/$${modname}.rst || exit 1; \
+        echo "Done."; \
+    done
+endef
+
+# We set PYTHONPATH here because source/conf.py sys.modules hacks don't seem to help sphinx-build import the plugins
+all: html devhelp
+
+clean: templates-clean
+	rm -rf build
+
+# Generate rst templates for the docs using a mix of sphinx-apidoc and
+# our 'plugin-doc-skeleton' routine for plugin pages.
+templates:
+	mkdir -p source/elements
+	mkdir -p source/sources
+	$(call plugin-doc-skeleton,$(CURDIR)/../src/buildstream_plugins,elements)
+	$(call plugin-doc-skeleton,$(CURDIR)/../src/buildstream_plugins,sources)
+
+templates-clean:
+	rm -rf source/elements
+	rm -rf source/sources
+
+# Targets which generate docs with sphinx build
+#
+#
+html devhelp: templates
+	@echo "Building $@..."
+	PYTHONPATH=$(CURDIR)/../src/buildstream_plugins \
+	    $(SPHINXBUILD) -b $@ $(ALLSPHINXOPTS) "$(BUILDDIR)/$@" \
+	    $(wildcard source/*.rst) \
+	    $(wildcard source/elements/*.rst) \
+	    $(wildcard source/sources/*.rst)
+	@echo
+	@echo "Build of $@ finished, output: $(CURDIR)/$(BUILDDIR)/$@"
+
+testy:
+	@echo "Using $(SPHINXBUILD)"
+	@echo "Py is $(PYV)"
diff --git a/doc/source/conf.py b/doc/source/conf.py
new file mode 100644
index 0000000..fb238b9
--- /dev/null
+++ b/doc/source/conf.py
@@ -0,0 +1,345 @@
+#!/usr/bin/env python3
+# -*- coding: utf-8 -*-
+#
+# BuildStream documentation build configuration file, created by
+# sphinx-quickstart on Mon Nov  7 21:03:37 2016.
+#
+# This file is execfile()d with the current directory set to its
+# containing dir.
+#
+# Note that not all possible configuration values are present in this
+# autogenerated file.
+#
+# All configuration values have a default; values that are commented out
+# serve to show the default.
+
+# If extensions (or modules to document with autodoc) are in another directory,
+# add these directories to sys.path here. If the directory is relative to the
+# documentation root, use os.path.abspath to make it absolute, like shown here.
+#
+import os
+import sys
+sys.path.insert(0, os.path.abspath('..'))
+
+# -- General configuration ------------------------------------------------
+
+# If your documentation needs a minimal Sphinx version, state it here.
+#
+# needs_sphinx = '1.0'
+
+# Add any Sphinx extension module names here, as strings. They can be
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# ones.
+extensions = [
+    'sphinx.ext.autodoc',
+    'sphinx.ext.napoleon',
+    'sphinx.ext.extlinks',
+]
+
+# Add any paths that contain templates here, relative to this directory.
+templates_path = ['.templates']
+
+# The suffix(es) of source filenames.
+# You can specify multiple suffix as a list of string:
+#
+# source_suffix = ['.rst', '.md']
+source_suffix = '.rst'
+
+# The encoding of source files.
+#
+# source_encoding = 'utf-8-sig'
+
+# The master toctree document.
+master_doc = 'index'
+
+# General information about the project.
+project = 'BuildStream Plugins'
+copyright = '2022, BuildStream Developers'
+author = 'BuildStream Developers'
+
+# The version info for the project you're documenting, acts as replacement for
+# |version| and |release|, also used in various other places throughout the
+# built documents.
+#
+# The short X.Y version.
+version = '1.91'
+# The full version, including alpha/beta/rc tags.
+release = '1.91.0'
+
+# The language for content autogenerated by Sphinx. Refer to documentation
+# for a list of supported languages.
+#
+# This is also used if you do content translation via gettext catalogs.
+# Usually you set "language" from the command line for these cases.
+language = None
+
+# There are two options for replacing |today|: either, you set today to some
+# non-false value, then it is used:
+#
+# today = ''
+#
+# Else, today_fmt is used as the format for a strftime call.
+#
+# today_fmt = '%B %d, %Y'
+
+# List of patterns, relative to source directory, that match files and
+# directories to ignore when looking for source files.
+# This patterns also effect to html_static_path and html_extra_path
+exclude_patterns = []
+
+# The reST default role (used for this markup: `text`) to use for all
+# documents.
+#
+# default_role = None
+
+# If true, '()' will be appended to :func: etc. cross-reference text.
+#
+# add_function_parentheses = True
+
+# If true, the current module name will be prepended to all description
+# unit titles (such as .. function::).
+#
+add_module_names = False
+
+# If true, sectionauthor and moduleauthor directives will be shown in the
+# output. They are ignored by default.
+#
+# show_authors = False
+
+# The name of the Pygments (syntax highlighting) style to use.
+pygments_style = 'sphinx'
+
+# A list of ignored prefixes for module index sorting.
+modindex_common_prefix = [ 'buildstream.' ]
+
+# If true, keep warnings as "system message" paragraphs in the built documents.
+# keep_warnings = False
+
+# If true, `todo` and `todoList` produce output, else they produce nothing.
+todo_include_todos = False
+
+
+# -- Options for HTML output ----------------------------------------------
+
+# The theme to use for HTML and HTML Help pages.  See the documentation for
+# a list of builtin themes.
+#
+html_theme = 'sphinx_rtd_theme'
+
+# Theme options are theme-specific and customize the look and feel of a theme
+# further.  For a list of options available for each theme, see the
+# documentation.
+#
+# html_theme_options = {}
+
+# Add any paths that contain custom themes here, relative to this directory.
+# html_theme_path = []
+
+# The name for this set of Sphinx documents.
+# "<project> v<release> documentation" by default.
+#
+# html_title = 'BuildStream v0.1'
+
+# A shorter title for the navigation bar.  Default is the same as html_title.
+#
+# html_short_title = None
+
+# The name of an image file (relative to this directory) to place at the top
+# of the sidebar.
+#
+# html_logo = None
+
+# The name of an image file (relative to this directory) to use as a favicon of
+# the docs.  This file should be a Windows icon file (.ico) being 16x16 or 32x32
+# pixels large.
+#
+# html_favicon = None
+
+# Add any paths that contain custom static files (such as style sheets) here,
+# relative to this directory. They are copied after the builtin static files,
+# so a file named "default.css" will overwrite the builtin "default.css".
+html_static_path = []
+
+# Add any extra paths that contain custom files (such as robots.txt or
+# .htaccess) here, relative to this directory. These files are copied
+# directly to the root of the documentation.
+#
+# html_extra_path = []
+
+# If not None, a 'Last updated on:' timestamp is inserted at every page
+# bottom, using the given strftime format.
+# The empty string is equivalent to '%b %d, %Y'.
+#
+# html_last_updated_fmt = None
+
+# If true, SmartyPants will be used to convert quotes and dashes to
+# typographically correct entities.
+#
+# html_use_smartypants = True
+
+# Custom sidebar templates, maps document names to template names.
+#
+# html_sidebars = {}
+
+# Additional templates that should be rendered to pages, maps page names to
+# template names.
+#
+# html_additional_pages = {}
+
+# If false, no module index is generated.
+#
+# html_domain_indices = True
+
+# If false, no index is generated.
+#
+# html_use_index = True
+
+# If true, the index is split into individual pages for each letter.
+#
+# html_split_index = False
+
+# If true, links to the reST sources are added to the pages.
+#
+# html_show_sourcelink = True
+
+# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
+#
+# html_show_sphinx = True
+
+# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
+#
+# html_show_copyright = True
+
+# If true, an OpenSearch description file will be output, and all pages will
+# contain a <link> tag referring to it.  The value of this option must be the
+# base URL from which the finished HTML is served.
+#
+# html_use_opensearch = ''
+
+# This is the file name suffix for HTML files (e.g. ".xhtml").
+# html_file_suffix = None
+
+# Language to be used for generating the HTML full-text search index.
+# Sphinx supports the following languages:
+#   'da', 'de', 'en', 'es', 'fi', 'fr', 'h', 'it', 'ja'
+#   'nl', 'no', 'pt', 'ro', 'r', 'sv', 'tr', 'zh'
+#
+# html_search_language = 'en'
+
+# A dictionary with options for the search language support, empty by default.
+# 'ja' uses this config value.
+# 'zh' user can custom change `jieba` dictionary path.
+#
+# html_search_options = {'type': 'default'}
+
+# The name of a javascript file (relative to the configuration directory) that
+# implements a search results scorer. If empty, the default will be used.
+#
+# html_search_scorer = 'scorer.js'
+
+# Output file base name for HTML help builder.
+htmlhelp_basename = 'BuildStreamdoc'
+
+# -- Options for LaTeX output ---------------------------------------------
+
+latex_elements = {
+     # The paper size ('letterpaper' or 'a4paper').
+     #
+     # 'papersize': 'letterpaper',
+
+     # The font size ('10pt', '11pt' or '12pt').
+     #
+     # 'pointsize': '10pt',
+
+     # Additional stuff for the LaTeX preamble.
+     #
+     # 'preamble': '',
+
+     # Latex figure (float) alignment
+     #
+     # 'figure_align': 'htbp',
+}
+
+# Grouping the document tree into LaTeX files. List of tuples
+# (source start file, target name, title,
+#  author, documentclass [howto, manual, or own class]).
+latex_documents = [
+    (master_doc, 'BuildStream.tex', 'BuildStream Plugins Documentation',
+     'BuildStream Developers', 'manual'),
+]
+
+# The name of an image file (relative to this directory) to place at the top of
+# the title page.
+#
+# latex_logo = None
+
+# For "manual" documents, if this is true, then toplevel headings are parts,
+# not chapters.
+#
+# latex_use_parts = False
+
+# If true, show page references after internal links.
+#
+# latex_show_pagerefs = False
+
+# If true, show URL addresses after external links.
+#
+# latex_show_urls = False
+
+# Documents to append as an appendix to all manuals.
+#
+# latex_appendices = []
+
+# If false, will not define \strong, \code, \titleref, \crossref ... but only
+# \sphinxstrong, ..., \sphinxtitleref, ... To help avoid clash with user added
+# packages.
+#
+# latex_keep_old_macro_names = True
+
+# If false, no module index is generated.
+#
+# latex_domain_indices = True
+
+
+# -- Options for manual page output ---------------------------------------
+
+# One entry per manual page. List of tuples
+# (source start file, name, description, authors, manual section).
+man_pages = [
+    (master_doc, 'buildstream', 'BuildStream Documentation',
+     [author], 1)
+]
+
+# If true, show URL addresses after external links.
+#
+# man_show_urls = False
+
+
+# -- Options for Texinfo output -------------------------------------------
+
+# Grouping the document tree into Texinfo files. List of tuples
+# (source start file, target name, title, author,
+#  dir menu entry, description, category)
+texinfo_documents = [
+    (master_doc, 'BuildStream', 'BuildStream Plugins Documentation',
+     author, 'BuildStream', 'A collection of BuildStream plugins.',
+     'Miscellaneous'),
+]
+
+# Documents to append as an appendix to all manuals.
+#
+# texinfo_appendices = []
+
+# If false, no module index is generated.
+#
+# texinfo_domain_indices = True
+
+# How to display URL addresses: 'footnote', 'no', or 'inline'.
+#
+# texinfo_show_urls = 'footnote'
+
+# If true, do not generate a @detailmenu in the "Top" node's menu.
+#
+# texinfo_no_detailmenu = False
+
+autodoc_member_order = 'bysource'
diff --git a/doc/source/index.rst b/doc/source/index.rst
new file mode 100644
index 0000000..3d17b69
--- /dev/null
+++ b/doc/source/index.rst
@@ -0,0 +1,31 @@
+.. toctree::
+   :maxdepth: 2
+
+BuildStream Plugins Documentation
+=================================
+This is a collection of plugins to use with BuildStream.
+
+To use these plugins in your project, follow the
+`plugin loading documentation <https://docs.buildstream.build/master/format_project.html#loading-plugins>`_.
+
+.. toctree::
+   :maxdepth: 1
+   :caption: Element Plugins
+
+   elements/autotools
+   elements/cmake
+   elements/make
+   elements/meson
+   elements/pip
+   elements/setuptools
+
+.. toctree::
+   :maxdepth: 1
+   :caption: Source Plugins
+
+   sources/bzr
+   sources/cargo
+   sources/docker
+   sources/git
+   sources/patch
+   sources/pip
diff --git a/doc/source/plugin.rsttemplate b/doc/source/plugin.rsttemplate
new file mode 100644
index 0000000..11e090f
--- /dev/null
+++ b/doc/source/plugin.rsttemplate
@@ -0,0 +1,4 @@
+@@MODULENAME@@ plugin
+============================================
+
+.. automodule:: @@MODULE@@
diff --git a/doc/sphinx-build3 b/doc/sphinx-build3
new file mode 100755
index 0000000..590fd52
--- /dev/null
+++ b/doc/sphinx-build3
@@ -0,0 +1,15 @@
+#!/usr/bin/python3
+# -*- coding: utf-8 -*-
+"""
+Same as /usr/bin/sphinx-build but with a different
+interpreter.
+"""
+
+import sys
+
+if __name__ == '__main__':
+    from sphinx import main, make_main
+    if sys.argv[1:2] == ['-M']:
+        sys.exit(make_main(sys.argv))
+    else:
+        sys.exit(main(sys.argv))
\ No newline at end of file


[buildstream-plugins] 24/49: tests: Adding initial tests

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 8bd6f6bcec763b795260eac09e6d8ddaadcbaf4e
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sun Mar 20 16:53:50 2022 +0900

    tests: Adding initial tests
    
    Adding the Repo scaffolding and hooks into the private BuildStream provided
    general test cases for sources.
---
 tests/__init__.py                |   0
 tests/conftest.py                |  32 +++++++++++
 tests/testutils/__init__.py      |   0
 tests/testutils/repo/__init__.py |   2 +
 tests/testutils/repo/bzrrepo.py  |  54 +++++++++++++++++++
 tests/testutils/repo/gitrepo.py  | 114 +++++++++++++++++++++++++++++++++++++++
 tests/testutils/site.py          |  45 ++++++++++++++++
 7 files changed, 247 insertions(+)

diff --git a/tests/__init__.py b/tests/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/tests/conftest.py b/tests/conftest.py
new file mode 100644
index 0000000..833c9bb
--- /dev/null
+++ b/tests/conftest.py
@@ -0,0 +1,32 @@
+import pytest
+
+from buildstream._testing import sourcetests_collection_hook
+from buildstream._testing import register_repo_kind
+
+from .testutils.repo import Bzr, Git
+
+
+#################################################
+#            Implement pytest option            #
+#################################################
+def pytest_addoption(parser):
+    parser.addoption(
+        "--integration", action="store_true", default=False, help="Run integration tests",
+    )
+
+
+def pytest_runtest_setup(item):
+    # Without --integration: skip tests marked with 'integration'
+    if not item.config.getvalue("integration"):
+        if item.get_closest_marker("integration"):
+            pytest.skip("skipping integration test")
+
+
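+# Register our repo kinds so that the generic source tests provided by
+# buildstream._testing are run against the bzr and git plugins from
+# "buildstream_plugins".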
+register_repo_kind("bzr", Bzr, "buildstream_plugins")
+register_repo_kind("git", Git, "buildstream_plugins")
+
+
+# This hook enables pytest to collect the templated source tests from
+# buildstream._testing
+def pytest_sessionstart(session):
+    sourcetests_collection_hook(session)
diff --git a/tests/testutils/__init__.py b/tests/testutils/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/tests/testutils/repo/__init__.py b/tests/testutils/repo/__init__.py
new file mode 100644
index 0000000..8046c89
--- /dev/null
+++ b/tests/testutils/repo/__init__.py
@@ -0,0 +1,2 @@
+from .bzrrepo import Bzr
+from .gitrepo import Git
diff --git a/tests/testutils/repo/bzrrepo.py b/tests/testutils/repo/bzrrepo.py
new file mode 100644
index 0000000..324244c
--- /dev/null
+++ b/tests/testutils/repo/bzrrepo.py
@@ -0,0 +1,54 @@
+import os
+import subprocess
+import pytest
+
+from buildstream._testing import Repo
+
+from ..site import BZR, BZR_ENV, HAVE_BZR
+
+
+class Bzr(Repo):
+    def __init__(self, directory, subdir="repo"):
+        if not HAVE_BZR:
+            pytest.skip("bzr is not available")
+        super().__init__(directory, subdir)
+        self.bzr = BZR
+
+        self.env = os.environ.copy()
+        self.env.update(BZR_ENV)
+
+    def create(self, directory):
+        # Work around race condition in bzr's creation of ~/.bazaar in
+        # ensure_config_dir_exists() when running tests in parallel.
+        bazaar_config_dir = os.path.expanduser("~/.bazaar")
+        os.makedirs(bazaar_config_dir, exist_ok=True)
+
+        branch_dir = os.path.join(self.repo, "trunk")
+
+        subprocess.call([self.bzr, "init-repo", self.repo], env=self.env)
+        subprocess.call([self.bzr, "init", branch_dir], env=self.env)
+        self.copy_directory(directory, branch_dir)
+        subprocess.call([self.bzr, "add", "."], env=self.env, cwd=branch_dir)
+        subprocess.call(
+            [self.bzr, "commit", '--message="Initial commit"'], env=self.env, cwd=branch_dir,
+        )
+
+        return self.latest_commit()
+
+    def source_config(self, ref=None):
+        config = {
+            "kind": "bzr",
+            "url": "file://" + self.repo,
+            "track": "trunk",
+        }
+        if ref is not None:
+            config["ref"] = ref
+
+        return config
+
+    def latest_commit(self):
+        return subprocess.check_output(
+            [self.bzr, "version-info", "--custom", "--template={revno}", os.path.join(self.repo, "trunk"),],
+            env=self.env,
+            universal_newlines=True,
+        ).strip()
diff --git a/tests/testutils/repo/gitrepo.py b/tests/testutils/repo/gitrepo.py
new file mode 100644
index 0000000..9cb3c9a
--- /dev/null
+++ b/tests/testutils/repo/gitrepo.py
@@ -0,0 +1,114 @@
+import os
+import shutil
+import subprocess
+
+import pytest
+
+from buildstream._testing import Repo
+
+from ..site import GIT, GIT_ENV, HAVE_GIT
+
+
+class Git(Repo):
+    def __init__(self, directory, subdir="repo"):
+        if not HAVE_GIT:
+            pytest.skip("git is not available")
+
+        self.submodules = {}
+
+        super().__init__(directory, subdir)
+
+        self.env = os.environ.copy()
+        self.env.update(GIT_ENV)
+
+    def _run_git(self, *args, **kwargs):
+        argv = [GIT]
+        argv.extend(args)
+        if "env" not in kwargs:
+            kwargs["env"] = dict(self.env, PWD=self.repo)
+        kwargs.setdefault("cwd", self.repo)
+        kwargs.setdefault("check", True)
+        return subprocess.run(argv, **kwargs)  # pylint: disable=subprocess-run-check
+
+    def create(self, directory):
+        self.copy_directory(directory, self.repo)
+        self._run_git("init", ".")
+        self._run_git("add", ".")
+        self._run_git("commit", "-m", "Initial commit")
+        return self.latest_commit()
+
+    def add_tag(self, tag):
+        self._run_git("tag", tag)
+
+    def add_annotated_tag(self, tag, message):
+        self._run_git("tag", "-a", tag, "-m", message)
+
+    def add_commit(self):
+        self._run_git("commit", "--allow-empty", "-m", "Additional commit")
+        return self.latest_commit()
+
+    def add_file(self, filename):
+        shutil.copy(filename, self.repo)
+        self._run_git("add", os.path.basename(filename))
+        self._run_git("commit", "-m", "Added {}".format(os.path.basename(filename)))
+        return self.latest_commit()
+
+    def modify_file(self, new_file, path):
+        shutil.copy(new_file, os.path.join(self.repo, path))
+        self._run_git("commit", path, "-m", "Modified {}".format(os.path.basename(path)))
+        return self.latest_commit()
+
+    def add_submodule(self, subdir, url=None, checkout=None):
+        submodule = {}
+        if checkout is not None:
+            submodule["checkout"] = checkout
+        if url is not None:
+            submodule["url"] = url
+        self.submodules[subdir] = submodule
+        self._run_git("submodule", "add", url, subdir)
+        self._run_git("commit", "-m", "Added the submodule")
+        return self.latest_commit()
+
+    # This can also be used to remove a file or a submodule
+    def remove_path(self, path):
+        self._run_git("rm", path)
+        self._run_git("commit", "-m", "Removing {}".format(path))
+        return self.latest_commit()
+
+    def source_config(self, ref=None):
+        return self.source_config_extra(ref)
+
+    def source_config_extra(self, ref=None, checkout_submodules=None):
+        config = {
+            "kind": "git",
+            "url": "file://" + self.repo,
+            "track": "master",
+        }
+        if ref is not None:
+            config["ref"] = ref
+        if checkout_submodules is not None:
+            config["checkout-submodules"] = checkout_submodules
+
+        if self.submodules:
+            config["submodules"] = dict(self.submodules)
+
+        return config
+
+    def latest_commit(self):
+        return self._run_git("rev-parse", "HEAD", stdout=subprocess.PIPE, universal_newlines=True,).stdout.strip()
+
+    def branch(self, branch_name):
+        self._run_git("checkout", "-b", branch_name)
+
+    def delete_tag(self, tag_name):
+        self._run_git("tag", "-d", tag_name)
+
+    def checkout(self, commit):
+        self._run_git("checkout", commit)
+
+    def merge(self, commit):
+        self._run_git("merge", "-m", "Merge", commit)
+        return self.latest_commit()
+
+    def rev_parse(self, rev):
+        return self._run_git("rev-parse", rev, stdout=subprocess.PIPE, universal_newlines=True,).stdout.strip()
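
The submodule bookkeeping above exists so that source_config_extra() can
emit matching submodule configuration. A sketch of how a test might
exercise it, with hypothetical paths:

    main = Git("/tmp/main")
    sub = Git("/tmp/sub")
    sub.create("/path/to/subproject")
    main.create("/path/to/project")
    ref = main.add_submodule("deps/sub", url="file://" + sub.repo)
    config = main.source_config_extra(ref=ref, checkout_submodules=True)
    # config["submodules"]["deps/sub"]["url"] now records the submodule url
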
diff --git a/tests/testutils/site.py b/tests/testutils/site.py
new file mode 100644
index 0000000..c60df1e
--- /dev/null
+++ b/tests/testutils/site.py
@@ -0,0 +1,45 @@
+import subprocess
+
+from typing import Optional
+
+from buildstream import utils, ProgramNotFoundError
+
+GIT: Optional[str]
+BZR: Optional[str]
+
+try:
+    GIT = utils.get_host_tool("git")
+    HAVE_GIT = True
+
+    out = str(subprocess.check_output([GIT, "--version"]), "utf-8")
+    # e.g. on Git for Windows we get "git version 2.21.0.windows.1".
+    # e.g. on Mac via Homebrew we get "git version 2.19.0".
+    version = tuple(int(x) for x in out.split(" ")[2].split(".")[:3])
+    HAVE_OLD_GIT = version < (1, 8, 5)
+
+    GIT_ENV = {
+        "GIT_AUTHOR_DATE": "1320966000 +0200",
+        "GIT_AUTHOR_NAME": "tomjon",
+        "GIT_AUTHOR_EMAIL": "tom@jon.com",
+        "GIT_COMMITTER_DATE": "1320966000 +0200",
+        "GIT_COMMITTER_NAME": "tomjon",
+        "GIT_COMMITTER_EMAIL": "tom@jon.com",
+    }
+except ProgramNotFoundError:
+    GIT = None
+    HAVE_GIT = False
+    HAVE_OLD_GIT = False
+    GIT_ENV = {}
+
+try:
+    BZR = utils.get_host_tool("bzr")
+    HAVE_BZR = True
+    # Breezy 3.0 supports `BRZ_EMAIL` but not `BZR_EMAIL`
+    BZR_ENV = {
+        "BZR_EMAIL": "Testy McTesterson <te...@example.com>",
+        "BRZ_EMAIL": "Testy McTesterson <te...@example.com>",
+    }
+except ProgramNotFoundError:
+    BZR = None
+    HAVE_BZR = False
+    BZR_ENV = {}
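
These module-level flags let individual tests skip cleanly when a host
tool is missing or too old. A typical usage sketch (hypothetical test in
a sibling module):

    import pytest
    from ..site import HAVE_GIT, HAVE_OLD_GIT

    @pytest.mark.skipif(not HAVE_GIT, reason="git is not available")
    @pytest.mark.skipif(HAVE_OLD_GIT, reason="git >= 1.8.5 is required")
    def test_something_with_git(tmpdir):
        ...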


[buildstream-plugins] 25/49: tests/sources/pip.py: Adding pip source test

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit ad335d2d5394d2fed34fc4ed3a0e9b8468928e89
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sun Mar 20 17:41:45 2022 +0900

    tests/sources/pip.py: Adding pip source test
---
 tests/sources/__init__.py                     |  0
 tests/sources/pip.py                          | 71 +++++++++++++++++++++++++++
 tests/sources/pip/first-source-pip/target.bst |  6 +++
 tests/sources/pip/no-packages/file            |  1 +
 tests/sources/pip/no-packages/target.bst      |  6 +++
 tests/sources/pip/no-ref/file                 |  1 +
 tests/sources/pip/no-ref/target.bst           |  8 +++
 7 files changed, 93 insertions(+)

diff --git a/tests/sources/__init__.py b/tests/sources/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/tests/sources/pip.py b/tests/sources/pip.py
new file mode 100644
index 0000000..bf30b20
--- /dev/null
+++ b/tests/sources/pip.py
@@ -0,0 +1,71 @@
+# Pylint doesn't play well with fixtures and dependency injection from pytest
+# pylint: disable=redefined-outer-name
+
+import os
+import pytest
+
+from buildstream import _yaml
+from buildstream.exceptions import ErrorDomain
+from buildstream._testing import cli  # pylint: disable=unused-import
+from buildstream_plugins.sources.pip import _match_package_name
+
+DATA_DIR = os.path.join(os.path.dirname(os.path.realpath(__file__)), "pip",)
+
+
+def generate_project(project_dir):
+    project_file = os.path.join(project_dir, "project.conf")
+    _yaml.roundtrip_dump(
+        {
+            "name": "foo",
+            "min-version": "2.0",
+            "plugins": [{"origin": "pip", "package-name": "buildstream-plugins", "sources": ["pip"],}],
+        },
+        project_file,
+    )
+
+
+# Test that without a ref, the element state is reported as "no reference".
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "no-ref"))
+def test_no_ref(cli, datafiles):
+    project = str(datafiles)
+    generate_project(project)
+    assert cli.get_element_state(project, "target.bst") == "no reference"
+
+
+# Test that pip is not allowed to be the first source
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "first-source-pip"))
+def test_first_source(cli, datafiles):
+    project = str(datafiles)
+    generate_project(project)
+    result = cli.run(project=project, args=["show", "target.bst"])
+    result.assert_main_error(ErrorDomain.ELEMENT, None)
+
+
+# Test that an error is raised when neither packages nor requirements files
+# have been specified
+@pytest.mark.datafiles(os.path.join(DATA_DIR, "no-packages"))
+def test_no_packages(cli, datafiles):
+    project = str(datafiles)
+    generate_project(project)
+    result = cli.run(project=project, args=["show", "target.bst"])
+    result.assert_main_error(ErrorDomain.SOURCE, None)
+
+
+# Test that the pip source parses tarball names correctly for the ref
+@pytest.mark.parametrize(
+    "tarball, expected_name, expected_version",
+    [
+        ("dotted.package-0.9.8.tar.gz", "dotted.package", "0.9.8"),
+        ("hyphenated-package-2.6.0.tar.gz", "hyphenated-package", "2.6.0"),
+        ("underscore_pkg-3.1.0.tar.gz", "underscore_pkg", "3.1.0"),
+        ("numbers2and5-1.0.1.tar.gz", "numbers2and5", "1.0.1"),
+        ("multiple.dots.package-5.6.7.tar.gz", "multiple.dots.package", "5.6.7",),
+        ("multiple-hyphens-package-1.2.3.tar.gz", "multiple-hyphens-package", "1.2.3",),
+        ("multiple_underscore_pkg-3.4.5.tar.gz", "multiple_underscore_pkg", "3.4.5",),
+        ("shortversion-1.0.tar.gz", "shortversion", "1.0"),
+        ("longversion-1.2.3.4.tar.gz", "longversion", "1.2.3.4"),
+    ],
+)
+def test_match_package_name(tarball, expected_name, expected_version):
+    name, version = _match_package_name(tarball)
+    assert (expected_name, expected_version) == (name, version)
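
The parametrized cases above pin down how the ref is derived from an sdist
tarball name. One way such parsing could be implemented (an illustrative
sketch only, not necessarily the plugin's actual code) is a greedy
name/version split:

    import re

    # Illustrative only: "<name>-<version>.tar.gz", where the version is
    # the trailing run of dot-separated digits
    _SKETCH_RE = re.compile(r"^(.+)-(\d+(?:\.\d+)*)\.tar\.gz$")

    def match_package_name_sketch(tarball):
        match = _SKETCH_RE.match(tarball)
        if match is None:
            raise ValueError("not a recognised sdist name: " + tarball)
        return match.group(1), match.group(2)
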
diff --git a/tests/sources/pip/first-source-pip/target.bst b/tests/sources/pip/first-source-pip/target.bst
new file mode 100644
index 0000000..e5f20ab
--- /dev/null
+++ b/tests/sources/pip/first-source-pip/target.bst
@@ -0,0 +1,6 @@
+kind: import
+description: pip should not be allowed to be the first source
+sources:
+- kind: pip
+  packages:
+  - flake8
diff --git a/tests/sources/pip/no-packages/file b/tests/sources/pip/no-packages/file
new file mode 100644
index 0000000..980a0d5
--- /dev/null
+++ b/tests/sources/pip/no-packages/file
@@ -0,0 +1 @@
+Hello World!
diff --git a/tests/sources/pip/no-packages/target.bst b/tests/sources/pip/no-packages/target.bst
new file mode 100644
index 0000000..0d8b948
--- /dev/null
+++ b/tests/sources/pip/no-packages/target.bst
@@ -0,0 +1,6 @@
+kind: import
+description: The kind of this element is irrelevant.
+sources:
+- kind: local
+  path: file
+- kind: pip
diff --git a/tests/sources/pip/no-ref/file b/tests/sources/pip/no-ref/file
new file mode 100644
index 0000000..980a0d5
--- /dev/null
+++ b/tests/sources/pip/no-ref/file
@@ -0,0 +1 @@
+Hello World!
diff --git a/tests/sources/pip/no-ref/target.bst b/tests/sources/pip/no-ref/target.bst
new file mode 100644
index 0000000..ec450b7
--- /dev/null
+++ b/tests/sources/pip/no-ref/target.bst
@@ -0,0 +1,8 @@
+kind: import
+description: The kind of this element is irrelevant.
+sources:
+- kind: local
+  path: file
+- kind: pip
+  packages:
+  - flake8


[buildstream-plugins] 45/49: tests/elements/{cmake,meson}: Use git source from buildstream-plugins

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit a17e02a35a2043d0b5f3746e285db90ef554ac7c
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Tue Apr 5 18:21:39 2022 +0900

    tests/elements/{cmake,meson}: Use git source from buildstream-plugins
    
    We should not use the git source from buildstream, as that will be removed.
---
 tests/elements/cmake/project.conf | 2 ++
 tests/elements/meson/project.conf | 2 ++
 2 files changed, 4 insertions(+)

diff --git a/tests/elements/cmake/project.conf b/tests/elements/cmake/project.conf
index bd8d972..7e30988 100644
--- a/tests/elements/cmake/project.conf
+++ b/tests/elements/cmake/project.conf
@@ -7,6 +7,8 @@ element-path: elements
 plugins:
 - origin: pip
   package-name: buildstream-plugins
+  sources:
+  - git
   elements:
   - cmake
 - origin: junction
diff --git a/tests/elements/meson/project.conf b/tests/elements/meson/project.conf
index 9ca5151..456b39f 100644
--- a/tests/elements/meson/project.conf
+++ b/tests/elements/meson/project.conf
@@ -7,6 +7,8 @@ element-path: elements
 plugins:
 - origin: pip
   package-name: buildstream-plugins
+  sources:
+  - git
   elements:
   - meson
   - setuptools


[buildstream-plugins] 48/49: tests/sources/bzr: Use locally defined bzr source

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/buildstream-plugins.git

commit 8f80351efd073fb6e143e615bf83e8a4f67339a6
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Tue Apr 5 18:37:41 2022 +0900

    tests/sources/bzr: Use locally defined bzr source
    
    Instead of the buildstream core bzr source, which will be removed.
---
 tests/sources/bzr/project.conf | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/tests/sources/bzr/project.conf b/tests/sources/bzr/project.conf
index 08a9d60..e1701f0 100644
--- a/tests/sources/bzr/project.conf
+++ b/tests/sources/bzr/project.conf
@@ -1,3 +1,9 @@
 # Basic Project
 name: foo
 min-version: 2.0
+
+plugins:
+- origin: pip
+  package-name: buildstream-plugins
+  sources:
+  - bzr
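
With this declaration in place, elements in the test project resolve
"kind: bzr" to the plugin shipped in buildstream-plugins rather than the
core source that is slated for removal. A hypothetical element definition,
generated the same way the tests build theirs:

    from buildstream import _yaml

    # Hypothetical target.bst using the locally defined bzr source
    element = {
        "kind": "import",
        "sources": [{"kind": "bzr", "url": "file:///tmp/repo", "track": "trunk"}],
    }
    _yaml.roundtrip_dump(element, "elements/target.bst")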