Posted to commits@buildstream.apache.org by tv...@apache.org on 2021/01/17 09:41:12 UTC

[buildstream] branch tristan/refactor-remote-config created (now 11bbba0)

This is an automated email from the ASF dual-hosted git repository.

tvb pushed a change to branch tristan/refactor-remote-config
in repository https://gitbox.apache.org/repos/asf/buildstream.git.


      at 11bbba0  _remotespec.py: Moving RemoteSpec and RemoteExecutionSpec to their own file

This branch includes the following new commits:

     new f957614  node.pyi: Adding some missing type annotations
     new 248e71a  element.py: Remove local __remote_execution_specs instance variable
     new 11bbba0  _remotespec.py: Moving RemoteSpec and RemoteExecutionSpec to their own file

The 3 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[buildstream] 02/03: element.py: Remove local __remote_execution_specs instance variable

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch tristan/refactor-remote-config
in repository https://gitbox.apache.org/repos/asf/buildstream.git

commit 248e71a455a605ac1170b6d068efa0baddd15381
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Jan 16 19:17:32 2021 +0900

    element.py: Remove local __remote_execution_specs instance variable
    
    Caching this variable locally only adds redirection and confusion
    about where these specs come from, making the code more difficult
    to understand.
---
 src/buildstream/element.py | 18 ++----------------
 1 file changed, 2 insertions(+), 16 deletions(-)

diff --git a/src/buildstream/element.py b/src/buildstream/element.py
index 945ca5c..dbb5aeb 100644
--- a/src/buildstream/element.py
+++ b/src/buildstream/element.py
@@ -2706,14 +2706,6 @@ class Element(Plugin):
 
         return self.__tainted
 
-    # __use_remote_execution():
-    #
-    # Returns True if remote execution is configured, defaults to user defined
-    # project-specific overrides, project config, or global user config.
-    #
-    def __use_remote_execution(self):
-        return bool(self.__remote_execution_specs)
-
     # __collect_overlaps():
     #
     # A context manager for collecting overlap warnings and errors.
@@ -2755,7 +2747,7 @@ class Element(Plugin):
         else:
             output_node_properties = None
 
-        if directory is not None and allow_remote and self.__use_remote_execution():
+        if directory is not None and allow_remote and project.remote_execution_specs:
 
             self.info("Using a remote sandbox for artifact {} with directory '{}'".format(self.name, directory))
 
@@ -2769,7 +2761,7 @@ class Element(Plugin):
                 stdout=stdout,
                 stderr=stderr,
                 config=config,
-                specs=self.__remote_execution_specs,
+                specs=project.remote_execution_specs,
                 output_files_required=output_files_required,
                 output_node_properties=output_node_properties,
             )
@@ -2841,12 +2833,6 @@ class Element(Plugin):
 
         self._configure(self.__config)
 
-        # Extract remote execution URL
-        if load_element.first_pass:
-            self.__remote_execution_specs = None
-        else:
-            self.__remote_execution_specs = project.remote_execution_specs
-
         # Extract Sandbox config
         sandbox_config = self.__extract_sandbox_config(project, load_element)
         self.__variables.expand(sandbox_config)
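
To make the shape of this change concrete, here is a minimal standalone
sketch of the pattern the commit applies; the class skeletons are
hypothetical and only remote_execution_specs mirrors the real member:

    class Project:
        def __init__(self, remote_execution_specs=None):
            # The single authoritative home of the remote execution specs
            self.remote_execution_specs = remote_execution_specs

    class Element:
        def __init__(self, project):
            self._project = project
            # Before: self.__remote_execution_specs = project.remote_execution_specs

        def use_remote_execution(self):
            # After: consult the project directly instead of a local copy
            return bool(self._project.remote_execution_specs)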


[buildstream] 03/03: _remotespec.py: Moving RemoteSpec and RemoteExecutionSpec to their own file

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch tristan/refactor-remote-config
in repository https://gitbox.apache.org/repos/asf/buildstream.git

commit 11bbba0a2d2fea5d3cf4a8e218c5c40f8921f29e
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Jan 16 16:36:59 2021 +0900

    _remotespec.py: Moving RemoteSpec and RemoteExecutionSpec to their own file
    
    This is in preparation for moving all configuration parsing and
    data structures out of the cache handling code and into the
    domain of Context/Project.
    
    The RemoteSpec/RemoteExecutionSpec objects are also changed to be
    proper, strongly typed objects instead of named tuples.
    
    RemoteSpec now also handles the nitty gritty details of lazily
    loading the credentials from disk (now featuring proper
    provenance-inclusive error reporting), loading the grpc credentials
    object, and opening connections. This allows us to remove a handful
    of duplicated code paths.
    
    Summary of changes:
    
      * _remotespec.py: Added RemoteType, RemoteSpec and RemoteExecutionSpec here
    
      * _project.py: Load RemoteExecutionSpec in the new way
    
      * _context.py: Load RemoteExecutionSpec in the new way
    
      * sandbox/_sandboxremote.py: Adapt to new moved RemoteExecutionSpec,
        remove a bunch of duplicated code here and rely on the spec
        for loading credentials and opening channels to end points
    
      * _remote.py:
        - Removed RemoteType and RemoteSpec from here
        - Reordered and documented the file according to coding style
        - Removed redundant instance members which are already available
          on the spec (subclasses adapted to obtain these members in their
          now singular location on the spec)
    
      * _assetcache.py:
        - Adapting to the Remote API changes
        - Reordering and documenting the file as per coding style
        - Removed helper function _initialize_remotes(), which only
          made the code harder to follow
    
      * _cas/casremote.py: Adapting to Remote API changes
    
      * _artifactcache.py: Adapting to Remote API changes
    
      * _sourcecache.py: Adapting to Remote API changes
    
      * _elementsourcescache.py: Adapting to Remote API changes
    
      * tests/artifactcache/config.py: Updated test cases
---
 src/buildstream/_artifactcache.py         |   4 +-
 src/buildstream/_assetcache.py            |  98 +++++----
 src/buildstream/_cas/casremote.py         |  12 +-
 src/buildstream/_context.py               |  10 +-
 src/buildstream/_elementsourcescache.py   |   4 +-
 src/buildstream/_project.py               |  34 +--
 src/buildstream/_remote.py                | 205 +++---------------
 src/buildstream/_remotespec.py            | 331 ++++++++++++++++++++++++++++++
 src/buildstream/_sourcecache.py           |   4 +-
 src/buildstream/sandbox/_sandboxremote.py | 179 ++--------------
 tests/artifactcache/config.py             |  33 +--
 11 files changed, 480 insertions(+), 434 deletions(-)
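
As a quick orientation before the diffs, a hedged usage sketch against
the new _remotespec.py added below; the URL is a placeholder and the
module is private API:

    from buildstream._remotespec import RemoteSpec, RemoteType

    # A cache remote serving both index and storage roles; credentials,
    # if any were configured, are read lazily from disk on first use.
    spec = RemoteSpec(RemoteType.ALL, "http://cache.example.com:11001", push=True)

    channel = spec.open_channel()  # insecure channel for http, TLS for https
    channel.close()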

diff --git a/src/buildstream/_artifactcache.py b/src/buildstream/_artifactcache.py
index ded8767..d73c7f0 100644
--- a/src/buildstream/_artifactcache.py
+++ b/src/buildstream/_artifactcache.py
@@ -123,8 +123,8 @@ class ArtifactCache(AssetCache):
         project = element._get_project()
         display_key = element._get_display_key()
 
-        index_remotes = [r for r in self._index_remotes[project] if r.push]
-        storage_remotes = [r for r in self._storage_remotes[project] if r.push]
+        index_remotes = [r for r in self._index_remotes[project] if r.spec.push]
+        storage_remotes = [r for r in self._storage_remotes[project] if r.spec.push]
 
         artifact_proto = artifact._get_proto()
         artifact_digest = self.cas.add_object(buffer=artifact_proto.SerializeToString())
diff --git a/src/buildstream/_assetcache.py b/src/buildstream/_assetcache.py
index fef597a..803243e 100644
--- a/src/buildstream/_assetcache.py
+++ b/src/buildstream/_assetcache.py
@@ -26,7 +26,8 @@ from . import utils
 from . import _yaml
 from ._cas import CASRemote
 from ._exceptions import AssetCacheError, LoadError, RemoteError
-from ._remote import BaseRemote, RemoteSpec, RemoteType
+from ._remotespec import RemoteSpec, RemoteType
+from ._remote import BaseRemote
 from ._protos.build.bazel.remote.asset.v1 import remote_asset_pb2, remote_asset_pb2_grpc
 from ._protos.google.rpc import code_pb2
 
@@ -63,8 +64,8 @@ class AssetRemote(BaseRemote):
     #
     def _check(self):
         request = remote_asset_pb2.FetchBlobRequest()
-        if self.instance_name:
-            request.instance_name = self.instance_name
+        if self.spec.instance_name:
+            request.instance_name = self.spec.instance_name
 
         try:
             self.fetch_service.FetchBlob(request)
@@ -82,8 +83,8 @@ class AssetRemote(BaseRemote):
 
         if self.spec.push:
             request = remote_asset_pb2.PushBlobRequest()
-            if self.instance_name:
-                request.instance_name = self.instance_name
+            if self.spec.instance_name:
+                request.instance_name = self.spec.instance_name
 
             try:
                 self.push_service.PushBlob(request)
@@ -120,8 +121,8 @@ class AssetRemote(BaseRemote):
     #
     def fetch_blob(self, uris, *, qualifiers=None):
         request = remote_asset_pb2.FetchBlobRequest()
-        if self.instance_name:
-            request.instance_name = self.instance_name
+        if self.spec.instance_name:
+            request.instance_name = self.spec.instance_name
         request.uris.extend(uris)
         if qualifiers:
             request.qualifiers.extend(qualifiers)
@@ -161,8 +162,8 @@ class AssetRemote(BaseRemote):
     #
     def fetch_directory(self, uris, *, qualifiers=None):
         request = remote_asset_pb2.FetchDirectoryRequest()
-        if self.instance_name:
-            request.instance_name = self.instance_name
+        if self.spec.instance_name:
+            request.instance_name = self.spec.instance_name
         request.uris.extend(uris)
         if qualifiers:
             request.qualifiers.extend(qualifiers)
@@ -202,8 +203,8 @@ class AssetRemote(BaseRemote):
     #
     def push_blob(self, uris, blob_digest, *, qualifiers=None, references_blobs=None, references_directories=None):
         request = remote_asset_pb2.PushBlobRequest()
-        if self.instance_name:
-            request.instance_name = self.instance_name
+        if self.spec.instance_name:
+            request.instance_name = self.spec.instance_name
         request.uris.extend(uris)
         request.blob_digest.CopyFrom(blob_digest)
         if qualifiers:
@@ -239,8 +240,8 @@ class AssetRemote(BaseRemote):
         self, uris, directory_digest, *, qualifiers=None, references_blobs=None, references_directories=None
     ):
         request = remote_asset_pb2.PushDirectoryRequest()
-        if self.instance_name:
-            request.instance_name = self.instance_name
+        if self.spec.instance_name:
+            request.instance_name = self.spec.instance_name
         request.uris.extend(uris)
         request.root_directory_digest.CopyFrom(directory_digest)
         if qualifiers:
@@ -332,32 +333,10 @@ class AssetCache:
                 )
 
         for spec_node in artifacts:
-            cache_specs.append(RemoteSpec.new_from_config_node(spec_node))
+            cache_specs.append(RemoteSpec.new_from_node(spec_node))
 
         return cache_specs
 
-    # _configured_remote_cache_specs():
-    #
-    # Return the list of configured remotes for a given project, in priority
-    # order. This takes into account the user and project configuration.
-    #
-    # Args:
-    #     context (Context): The BuildStream context
-    #     project (Project): The BuildStream project
-    #
-    # Returns:
-    #   A list of RemoteSpec instances describing the remote caches.
-    #
-    @classmethod
-    def _configured_remote_cache_specs(cls, context, project):
-        project_overrides = context.get_overrides(project.name)
-        project_extra_specs = cls.specs_from_config_node(project_overrides)
-
-        project_specs = getattr(project, cls.spec_name)
-        context_specs = getattr(context, cls.spec_name)
-
-        return list(utils._deduplicate(project_extra_specs + project_specs + context_specs))
-
     # setup_remotes():
     #
     # Sets up which remotes to use
@@ -381,7 +360,7 @@ class AssetCache:
         # the user config in some cases (for example `bst artifact push --remote=...`).
         has_remote_caches = False
         if remote_url:
-            self._set_remotes([RemoteSpec(remote_url, push=True)])
+            self._set_remotes([RemoteSpec(RemoteType.ALL, remote_url, push=True)])
             has_remote_caches = True
         if use_config:
             for project in self.context.get_projects():
@@ -389,8 +368,13 @@ class AssetCache:
                 if caches:  # caches is a list of RemoteSpec instances
                     self._set_remotes(caches, project=project)
                     has_remote_caches = True
+
+        def remote_failed(remote, error):
+            self.context.messenger.warn("Failed to initialize remote {}: {}".format(remote.url, error))
+
         if has_remote_caches:
-            self._initialize_remotes()
+            with self.context.messenger.timed_activity("Initializing remote caches", silent_nested=True):
+                self.initialize_remotes(on_failure=remote_failed)
 
     # Notify remotes that forking is disabled
     def notify_fork_disabled(self):
@@ -485,6 +469,28 @@ class AssetCache:
     #               Local Private Methods          #
     ################################################
 
+    # _configured_remote_cache_specs():
+    #
+    # Return the list of configured remotes for a given project, in priority
+    # order. This takes into account the user and project configuration.
+    #
+    # Args:
+    #     context (Context): The BuildStream context
+    #     project (Project): The BuildStream project
+    #
+    # Returns:
+    #   A list of RemoteSpec instances describing the remote caches.
+    #
+    @classmethod
+    def _configured_remote_cache_specs(cls, context, project):
+        project_overrides = context.get_overrides(project.name)
+        project_extra_specs = cls.specs_from_config_node(project_overrides)
+
+        project_specs = getattr(project, cls.spec_name)
+        context_specs = getattr(context, cls.spec_name)
+
+        return list(utils._deduplicate(project_extra_specs + project_specs + context_specs))
+
     # _create_remote_instances():
     #
     # Create the global set of Remote instances, including
@@ -563,10 +569,10 @@ class AssetCache:
         # we create two objects here
         index = None
         storage = None
-        if remote_spec.type in [RemoteType.INDEX, RemoteType.ALL]:
+        if remote_spec.remote_type in [RemoteType.INDEX, RemoteType.ALL]:
             index = AssetRemote(remote_spec)  # pylint: disable=not-callable
             index.check()
-        if remote_spec.type in [RemoteType.STORAGE, RemoteType.ALL]:
+        if remote_spec.remote_type in [RemoteType.STORAGE, RemoteType.ALL]:
             storage = CASRemote(remote_spec, self.cas)
             storage.check()
 
@@ -588,18 +594,6 @@ class AssetCache:
         else:
             self.project_remote_specs[project] = remote_specs
 
-    # _initialize_remotes()
-    #
-    # An internal wrapper which calls the abstract method and
-    # reports takes care of messaging
-    #
-    def _initialize_remotes(self):
-        def remote_failed(remote, error):
-            self.context.messenger.warn("Failed to initialize remote {}: {}".format(remote.url, error))
-
-        with self.context.messenger.timed_activity("Initializing remote caches", silent_nested=True):
-            self.initialize_remotes(on_failure=remote_failed)
-
     # _list_refs_mtimes()
     #
     # List refs in a directory, given a base path. Also returns the
diff --git a/src/buildstream/_cas/casremote.py b/src/buildstream/_cas/casremote.py
index 656c08a..3799f95 100644
--- a/src/buildstream/_cas/casremote.py
+++ b/src/buildstream/_cas/casremote.py
@@ -61,12 +61,12 @@ class CASRemote(BaseRemote):
         cas_endpoint.url = self.spec.url
         if self.spec.instance_name:
             cas_endpoint.instance_name = self.spec.instance_name
-        if self.server_cert:
-            cas_endpoint.server_cert = self.server_cert
-        if self.client_key:
-            cas_endpoint.client_key = self.client_key
-        if self.client_cert:
-            cas_endpoint.client_cert = self.client_cert
+        if self.spec.server_cert:
+            cas_endpoint.server_cert = self.spec.server_cert
+        if self.spec.client_key:
+            cas_endpoint.client_key = self.spec.client_key
+        if self.spec.client_cert:
+            cas_endpoint.client_cert = self.spec.client_cert
         try:
             response = local_cas.GetInstanceNameForRemotes(request)
         except grpc.RpcError as e:
diff --git a/src/buildstream/_context.py b/src/buildstream/_context.py
index 34fc6f2..7a81d10 100644
--- a/src/buildstream/_context.py
+++ b/src/buildstream/_context.py
@@ -29,12 +29,12 @@ from ._profile import Topics, PROFILER
 from ._platform import Platform
 from ._artifactcache import ArtifactCache
 from ._elementsourcescache import ElementSourcesCache
+from ._remotespec import RemoteExecutionSpec
 from ._sourcecache import SourceCache
 from ._cas import CASCache, CASLogLevel
 from .types import _CacheBuildTrees, _PipelineSelection, _SchedulerErrorAction
 from ._workspaces import Workspaces, WorkspaceProjectCache
 from .node import Node
-from .sandbox import SandboxRemote
 
 
 # Context()
@@ -339,13 +339,13 @@ class Context:
             self.pull_artifact_files = remote_execution.get_bool("pull-artifact-files", default=True)
             # This stops it being used in the remote service set up
             remote_execution.safe_del("pull-artifact-files")
+
             # Don't pass the remote execution settings if that was the only option
-            if remote_execution.keys() == []:
-                del defaults["remote-execution"]
+            if remote_execution.keys():
+                self.remote_execution_specs = RemoteExecutionSpec.new_from_node(remote_execution)
         else:
             self.pull_artifact_files = True
-
-        self.remote_execution_specs = SandboxRemote.specs_from_config_node(defaults)
+            self.remote_execution_specs = None
 
         # Load pull build trees configuration
         self.pull_buildtrees = cache.get_bool("pull-buildtrees")
diff --git a/src/buildstream/_elementsourcescache.py b/src/buildstream/_elementsourcescache.py
index 194f3fd..873806e 100644
--- a/src/buildstream/_elementsourcescache.py
+++ b/src/buildstream/_elementsourcescache.py
@@ -160,8 +160,8 @@ class ElementSourcesCache(AssetCache):
 
         uri = REMOTE_ASSET_SOURCE_URN_TEMPLATE.format(ref)
 
-        index_remotes = [r for r in self._index_remotes[project] if r.push]
-        storage_remotes = [r for r in self._storage_remotes[project] if r.push]
+        index_remotes = [r for r in self._index_remotes[project] if r.spec.push]
+        storage_remotes = [r for r in self._storage_remotes[project] if r.spec.push]
 
         source_proto = self.load_proto(sources)
         source_digest = self.cas.add_object(buffer=source_proto.SerializeToString())
diff --git a/src/buildstream/_project.py b/src/buildstream/_project.py
index 0bd3984..8e9c9ed 100644
--- a/src/buildstream/_project.py
+++ b/src/buildstream/_project.py
@@ -36,7 +36,6 @@ from ._options import OptionPool
 from ._artifactcache import ArtifactCache
 from ._sourcecache import SourceCache
 from .node import ScalarNode, SequenceNode, _assert_symbol_name
-from .sandbox import SandboxRemote
 from ._pluginfactory import ElementFactory, SourceFactory, load_plugin_origin
 from .types import CoreWarnings
 from ._projectrefs import ProjectRefs, ProjectRefStorage
@@ -44,11 +43,13 @@ from ._loader import Loader, LoadContext
 from .element import Element
 from ._includes import Includes
 from ._workspaces import WORKSPACE_PROJECT_FILE
+from ._remotespec import RemoteSpec, RemoteExecutionSpec
+
 
 if TYPE_CHECKING:
     from .node import ProvenanceInformation, MappingNode
     from ._context import Context
-    from ._remote import RemoteSpec
+
 
 # Project Configuration file
 _PROJECT_CONF_FILE = "project.conf"
@@ -138,9 +139,9 @@ class Project:
         self.base_env_nocache: List[str] = []  # The base nocache mask (list) for the environment
 
         # Remote specs for communicating with remote services
-        self.artifact_cache_specs: List["RemoteSpec"] = []  # Artifact caches
-        self.source_cache_specs: List["RemoteSpec"] = []  # Source caches
-        self.remote_execution_specs: List["RemoteSpec"] = []  # Remote execution services
+        self.artifact_cache_specs: List[RemoteSpec] = []  # Artifact caches
+        self.source_cache_specs: List[RemoteSpec] = []  # Source caches
+        self.remote_execution_specs: Optional[RemoteExecutionSpec] = None  # Remote execution services
 
         self.element_factory: Optional[ElementFactory] = None  # ElementFactory for loading elements
         self.source_factory: Optional[SourceFactory] = None  # SourceFactory for loading sources
@@ -881,13 +882,22 @@ class Project:
         self.source_cache_specs = SourceCache.specs_from_config_node(config, self.directory)
 
         # Load remote-execution configuration for this project
-        project_specs = SandboxRemote.specs_from_config_node(config, self.directory)
-        override_specs = SandboxRemote.specs_from_config_node(self._context.get_overrides(self.name), self.directory)
-
-        if override_specs is not None:
-            self.remote_execution_specs = override_specs
-        elif project_specs is not None:
-            self.remote_execution_specs = project_specs
+        project_re_specs = None
+        project_re_node = config.get_mapping("remote-execution", default=None)
+        if project_re_node:
+            project_re_specs = RemoteExecutionSpec.new_from_node(project_re_node, self.directory)
+
+        override_re_specs = None
+        override_node = self._context.get_overrides(self.name)
+        if override_node:
+            override_re_node = override_node.get_mapping("remote-execution", default=None)
+            if override_re_node:
+                override_re_specs = RemoteExecutionSpec.new_from_node(override_re_node, self.directory)
+
+        if override_re_specs is not None:
+            self.remote_execution_specs = override_re_specs
+        elif project_re_specs is not None:
+            self.remote_execution_specs = project_re_specs
         else:
             self.remote_execution_specs = self._context.remote_execution_specs
 
diff --git a/src/buildstream/_remote.py b/src/buildstream/_remote.py
index 6d52ff5..0d47921 100644
--- a/src/buildstream/_remote.py
+++ b/src/buildstream/_remote.py
@@ -15,109 +15,11 @@
 #  License along with this library. If not, see <http://www.gnu.org/licenses/>.
 #
 
-import os
 import threading
-from collections import namedtuple
-from urllib.parse import urlparse
 
 import grpc
 
-from ._exceptions import LoadError, ImplError, RemoteError
-from .exceptions import LoadErrorReason
-from .types import FastEnum
-
-
-# RemoteType():
-#
-# Defines the different types of remote.
-#
-class RemoteType(FastEnum):
-    INDEX = "index"
-    STORAGE = "storage"
-    ALL = "all"
-
-    def __str__(self):
-        return self.name.lower().replace("_", "-")
-
-
-# RemoteSpec():
-#
-# Defines the basic structure of a remote specification.
-#
-class RemoteSpec(namedtuple("RemoteSpec", "url push server_cert client_key client_cert instance_name type")):
-
-    # new_from_config_node
-    #
-    # Creates a RemoteSpec() from a YAML loaded node.
-    #
-    # Args:
-    #    spec_node (MappingNode): The configuration node describing the spec.
-    #    basedir (str): The base directory from which to find certificates.
-    #
-    # Returns:
-    #    (RemoteSpec) - The described RemoteSpec instance.
-    #
-    # Raises:
-    #    LoadError: If the node is malformed.
-    #
-    @classmethod
-    def new_from_config_node(cls, spec_node, basedir=None):
-        spec_node.validate_keys(["url", "push", "server-cert", "client-key", "client-cert", "instance-name", "type"])
-
-        url = spec_node.get_str("url")
-        if not url:
-            provenance = spec_node.get_node("url").get_provenance()
-            raise LoadError("{}: empty artifact cache URL".format(provenance), LoadErrorReason.INVALID_DATA)
-
-        push = spec_node.get_bool("push", default=False)
-        instance_name = spec_node.get_str("instance-name", default=None)
-
-        def parse_cert(key):
-            cert = spec_node.get_str(key, default=None)
-            if cert:
-                cert = os.path.expanduser(cert)
-
-                if basedir:
-                    cert = os.path.join(basedir, cert)
-
-            return cert
-
-        cert_keys = ("server-cert", "client-key", "client-cert")
-        server_cert, client_key, client_cert = tuple(parse_cert(key) for key in cert_keys)
-
-        if client_key and not client_cert:
-            provenance = spec_node.get_node("client-key").get_provenance()
-            raise LoadError(
-                "{}: 'client-key' was specified without 'client-cert'".format(provenance), LoadErrorReason.INVALID_DATA
-            )
-
-        if client_cert and not client_key:
-            provenance = spec_node.get_node("client-cert").get_provenance()
-            raise LoadError(
-                "{}: 'client-cert' was specified without 'client-key'".format(provenance), LoadErrorReason.INVALID_DATA
-            )
-
-        type_ = spec_node.get_enum("type", RemoteType, default=RemoteType.ALL)
-
-        return cls(url, push, server_cert, client_key, client_cert, instance_name, type_)
-
-
-# FIXME: This can be made much nicer in python 3.7 through the use of
-# defaults - or hell, by replacing it all with a typing.NamedTuple
-#
-# Note that defaults are specified from the right, and ommitted values
-# are considered mandatory.
-#
-# Disable type-checking since "Callable[...] has no attributes __defaults__"
-RemoteSpec.__new__.__defaults__ = (  # type: ignore
-    # mandatory          # url            - The url of the remote
-    # mandatory          # push           - Whether the remote should be used for pushing
-    None,  # server_cert    - The server certificate
-    None,  # client_key     - The (private) client key
-    None,  # client_cert    - The (public) client certificate
-    None,  # instance_name  - The (grpc) instance name of the remote
-    RemoteType.ALL,  # type           - The type of the remote (index, storage, both)
-)
+from ._exceptions import ImplError, RemoteError
 
 
 # BaseRemote():
@@ -135,19 +37,26 @@ class BaseRemote:
 
     def __init__(self, spec):
         self.spec = spec
+        self.channel = None
         self._initialized = False
+        self._lock = threading.Lock()
 
-        self.channel = None
+    ####################################################
+    #                 Dunder methods                   #
+    ####################################################
+    def __enter__(self):
+        return self
 
-        self.server_cert = None
-        self.client_key = None
-        self.client_cert = None
+    def __exit__(self, _exc_type, _exc_value, traceback):
+        self.close()
+        return False
 
-        self.instance_name = spec.instance_name
-        self.push = spec.push
-        self.url = spec.url
+    def __str__(self):
+        return self.spec.url
 
-        self._lock = threading.Lock()
+    ####################################################
+    #                   Remote API                     #
+    ####################################################
 
     # init():
     #
@@ -159,40 +68,10 @@ class BaseRemote:
             if self._initialized:
                 return
 
-            # Set up the communcation channel
-            url = urlparse(self.spec.url)
-            if url.scheme == "http":
-                port = url.port or 80
-                self.channel = grpc.insecure_channel("{}:{}".format(url.hostname, port))
-            elif url.scheme == "https":
-                port = url.port or 443
-                try:
-                    server_cert, client_key, client_cert = _read_files(
-                        self.spec.server_cert, self.spec.client_key, self.spec.client_cert
-                    )
-                except FileNotFoundError as e:
-                    raise RemoteError("Could not read certificates: {}".format(e)) from e
-                self.server_cert = server_cert
-                self.client_key = client_key
-                self.client_cert = client_cert
-                credentials = grpc.ssl_channel_credentials(
-                    root_certificates=self.server_cert, private_key=self.client_key, certificate_chain=self.client_cert
-                )
-                self.channel = grpc.secure_channel("{}:{}".format(url.hostname, port), credentials)
-            else:
-                raise RemoteError("Unsupported URL: {}".format(self.spec.url))
-
+            self.channel = self.spec.open_channel()
             self._configure_protocols()
-
             self._initialized = True
 
-    def __enter__(self):
-        return self
-
-    def __exit__(self, _exc_type, _exc_value, traceback):
-        self.close()
-        return False
-
     def close(self):
         if self.channel:
             self.channel.close()
@@ -200,18 +79,6 @@ class BaseRemote:
 
         self._initialized = False
 
-    # _configure_protocols():
-    #
-    # An abstract method to configure remote-specific protocols. This
-    # is *not* done as super().init() because we want to be able to
-    # set self._initialized *after* initialization completes in the
-    # parent class.
-    #
-    # This method should *never* be called outside of init().
-    #
-    def _configure_protocols(self):
-        raise ImplError("An implementation of a Remote must configure its protocols.")
-
     # check():
     #
     # Check if the remote is functional and has all the required
@@ -234,6 +101,10 @@ class BaseRemote:
         finally:
             self.close()
 
+    ####################################################
+    #                Abstract methods                  #
+    ####################################################
+
     # _check():
     #
     # Check if this remote provides everything required for the
@@ -246,26 +117,14 @@ class BaseRemote:
     def _check(self):
         pass
 
-    def __str__(self):
-        return self.url
-
-
-# _read_files():
-#
-# A helper method to read a bunch of files, ignoring any input
-# arguments that are None.
-#
-# Args:
-#    files (Iterable[str|None]): A list of files to read. Nones are passed back.
-#
-# Returns:
-#    Generator[str|None, None, None] - Strings read from those files.
-#
-def _read_files(*files):
-    def read_file(f):
-        if f:
-            with open(f, "rb") as data:
-                return data.read()
-        return None
-
-    return (read_file(f) for f in files)
+    # _configure_protocols():
+    #
+    # An abstract method to configure remote-specific protocols. This
+    # is *not* done as super().init() because we want to be able to
+    # set self._initialized *after* initialization completes in the
+    # parent class.
+    #
+    # This method should *never* be called outside of init().
+    #
+    def _configure_protocols(self):
+        raise ImplError("An implementation of a Remote must configure its protocols.")
diff --git a/src/buildstream/_remotespec.py b/src/buildstream/_remotespec.py
new file mode 100644
index 0000000..4279962
--- /dev/null
+++ b/src/buildstream/_remotespec.py
@@ -0,0 +1,331 @@
+#
+#  Copyright (C) 2019 Bloomberg Finance LP
+#  Copyright (C) 2020 Codethink Limited
+#
+#  This program is free software; you can redistribute it and/or
+#  modify it under the terms of the GNU Lesser General Public
+#  License as published by the Free Software Foundation; either
+#  version 2 of the License, or (at your option) any later version.
+#
+#  This library is distributed in the hope that it will be useful,
+#  but WITHOUT ANY WARRANTY; without even the implied warranty of
+#  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.	 See the GNU
+#  Lesser General Public License for more details.
+#
+#  You should have received a copy of the GNU Lesser General Public
+#  License along with this library. If not, see <http://www.gnu.org/licenses/>.
+#
+
+import os
+from typing import Optional, cast
+from urllib.parse import urlparse
+import grpc
+from grpc import ChannelCredentials, Channel
+
+from ._exceptions import LoadError, RemoteError
+from .exceptions import LoadErrorReason
+from .types import FastEnum
+from .node import MappingNode
+
+
+# RemoteType():
+#
+# Defines the different types of remote.
+#
+class RemoteType(FastEnum):
+    INDEX = "index"
+    STORAGE = "storage"
+    ENDPOINT = "endpoint"
+    ALL = "all"
+
+    def __str__(self) -> str:
+        if self.name:
+            return self.name.lower().replace("_", "-")
+        return ""
+
+
+# RemoteSpec():
+#
+# This data structure holds all of the details required to
+# connect to and communicate with the various grpc remote
+# services, like the artifact cache, source cache and remote
+# execution service.
+#
+class RemoteSpec:
+    def __init__(
+        self,
+        remote_type: str,
+        url: str,
+        *,
+        push: bool = False,
+        server_cert: str = None,
+        client_key: str = None,
+        client_cert: str = None,
+        instance_name: Optional[str] = None,
+        spec_node: Optional[MappingNode] = None,
+    ) -> None:
+
+        #
+        # Public members
+        #
+
+        # The remote type
+        self.remote_type: str = remote_type
+
+        # Whether we are allowed to push (for asset caches only)
+        self.push: bool = push
+
+        # The url of the remote, this may contain a port number
+        self.url: str = url
+
+        # The name of the grpc service to talk to at this remote url
+        self.instance_name: Optional[str] = instance_name
+
+        # The credentials
+        self.server_cert_file: Optional[str] = server_cert
+        self.client_key_file: Optional[str] = client_key
+        self.client_cert_file: Optional[str] = client_cert
+
+        #
+        # Private members
+        #
+
+        # The provenance node for error reporting
+        self._spec_node: Optional[MappingNode] = spec_node
+
+        # The credentials loaded from disk, and whether they were loaded
+        self._server_cert: Optional[bytes] = None
+        self._client_key: Optional[bytes] = None
+        self._client_cert: Optional[bytes] = None
+        self._cred_files_loaded: bool = False
+
+        # The grpc credentials object
+        self._credentials: Optional[ChannelCredentials] = None
+
+    #
+    # Implement dunder methods to support hashing and
+    # comparisons.
+    #
+    def __eq__(self, other: object) -> bool:
+        return hash(self) == hash(other)
+
+    def __hash__(self) -> int:
+        return hash(
+            (
+                self.remote_type,
+                self.push,
+                self.url,
+                self.instance_name,
+                self.server_cert_file,
+                self.client_key_file,
+                self.client_cert_file,
+            )
+        )
+
+    # server_cert()
+    #
+    @property
+    def server_cert(self) -> Optional[bytes]:
+        self._load_credential_files()
+        return self._server_cert
+
+    # client_key()
+    #
+    @property
+    def client_key(self) -> Optional[bytes]:
+        self._load_credential_files()
+        return self._client_key
+
+    # client_cert()
+    #
+    @property
+    def client_cert(self) -> Optional[bytes]:
+        self._load_credential_files()
+        return self._client_cert
+
+    # credentials()
+    #
+    @property
+    def credentials(self) -> ChannelCredentials:
+        if not self._credentials:
+            self._credentials = grpc.ssl_channel_credentials(
+                root_certificates=self.server_cert, private_key=self.client_key, certificate_chain=self.client_cert,
+            )
+        return self._credentials
+
+    # open_channel()
+    #
+    # Opens a gRPC channel based on this spec.
+    #
+    def open_channel(self) -> Channel:
+        url = urlparse(self.url)
+
+        # Assert port number for RE endpoints
+        #
+        if self.remote_type == RemoteType.ENDPOINT and not url.port:
+            message = (
+                "Remote execution endpoints must specify the port number, for example: http://buildservice:50051."
+            )
+            if self._spec_node:
+                message = "{}: {}".format(self._spec_node.get_provenance(), message)
+            raise RemoteError(message)
+
+        if url.scheme == "http":
+            channel = grpc.insecure_channel("{}:{}".format(url.hostname, url.port or 80))
+        elif url.scheme == "https":
+            channel = grpc.secure_channel("{}:{}".format(url.hostname, url.port or 443), self.credentials)
+        else:
+            message = "Only 'http' and 'https' protocols are supported, but '{}' was supplied.".format(url.scheme)
+            if self._spec_node:
+                message = "{}: {}".format(self._spec_node.get_provenance(), message)
+            raise RemoteError(message)
+
+        return channel
+
+    # new_from_node():
+    #
+    # Creates a RemoteSpec() from a YAML loaded node.
+    #
+    # Args:
+    #    spec_node: The configuration node describing the spec.
+    #    basedir: The base directory from which to find certificates.
+    #    remote_execution: Whether this spec is used for remote execution (some keys are invalid)
+    #
+    # Returns:
+    #    The described RemoteSpec instance.
+    #
+    # Raises:
+    #    LoadError: If the node is malformed.
+    #
+    @classmethod
+    def new_from_node(
+        cls, spec_node: MappingNode, basedir: Optional[str] = None, *, remote_execution: bool = False
+    ) -> "RemoteSpec":
+        valid_keys = ["url", "server-cert", "client-key", "client-cert", "instance-name"]
+
+        if remote_execution:
+            remote_type = RemoteType.ENDPOINT
+            push = False
+        else:
+            remote_type = cast(str, spec_node.get_enum("type", RemoteType, default=RemoteType.ALL))
+            push = spec_node.get_bool("push", default=False)
+            valid_keys += ["push", "type"]
+
+        spec_node.validate_keys(valid_keys)
+
+        # FIXME: This explicit error message should not be necessary, instead
+        #        we should be able to inform Node.get_str() that an empty string
+        #        is not acceptable, and have Node do the work of raising this error.
+        #
+        url = spec_node.get_str("url")
+        if not url:
+            provenance = spec_node.get_node("url").get_provenance()
+            raise LoadError("{}: empty artifact cache URL".format(provenance), LoadErrorReason.INVALID_DATA)
+
+        instance_name = spec_node.get_str("instance-name", default=None)
+
+        def parse_cert(key):
+            cert = spec_node.get_str(key, default=None)
+            if cert:
+                cert = os.path.expanduser(cert)
+
+                if basedir:
+                    cert = os.path.join(basedir, cert)
+
+            return cert
+
+        cert_keys = ("server-cert", "client-key", "client-cert")
+        server_cert, client_key, client_cert = tuple(parse_cert(key) for key in cert_keys)
+
+        if client_key and not client_cert:
+            provenance = spec_node.get_node("client-key").get_provenance()
+            raise LoadError(
+                "{}: 'client-key' was specified without 'client-cert'".format(provenance), LoadErrorReason.INVALID_DATA
+            )
+
+        if client_cert and not client_key:
+            provenance = spec_node.get_node("client-cert").get_provenance()
+            raise LoadError(
+                "{}: 'client-cert' was specified without 'client-key'".format(provenance), LoadErrorReason.INVALID_DATA
+            )
+
+        return cls(
+            remote_type,
+            url,
+            push=push,
+            server_cert=server_cert,
+            client_key=client_key,
+            client_cert=client_cert,
+            instance_name=instance_name,
+        )
+
+    # _load_credential_files():
+    #
+    # A helper method to load the credentials files, ignoring any input
+    # arguments that are None.
+    #
+    def _load_credential_files(self) -> None:
+        def maybe_read_file(filename: Optional[str]) -> Optional[bytes]:
+            if filename:
+                try:
+                    with open(filename, "rb") as f:
+                        return f.read()
+                except IOError as e:
+                    message = "Failed to load credentials file: {}".format(filename)
+                    if self._spec_node:
+                        message = "{}: {}".format(self._spec_node.get_provenance(), message)
+                    raise RemoteError(message, detail=str(e), reason="load-remote-creds-failed") from e
+            return None
+
+        if not self._cred_files_loaded:
+            self._server_cert = maybe_read_file(self.server_cert_file)
+            self._client_key = maybe_read_file(self.client_key_file)
+            self._client_cert = maybe_read_file(self.client_cert_file)
+            self._cred_files_loaded = True
+
+
+# RemoteExecutionSpec():
+#
+# This data structure holds all of the details required to
+# connect to a remote execution cluster, it is essentially
+# comprised of 3 RemoteSpec objects which are used to
+# communicate with various components of an RE build cluster.
+#
+class RemoteExecutionSpec:
+    def __init__(self, exec_spec: RemoteSpec, storage_spec: RemoteSpec, action_spec: Optional[RemoteSpec]) -> None:
+        self.exec_spec: RemoteSpec = exec_spec
+        self.storage_spec: RemoteSpec = storage_spec
+        self.action_spec: Optional[RemoteSpec] = action_spec
+
+    # new_from_node():
+    #
+    # Creates a RemoteExecutionSpec() from a YAML loaded node.
+    #
+    # Args:
+    #    node: The node to parse
+    #    basedir: The base directory from which to find certificates.
+    #
+    # Returns:
+    #    The described RemoteSpec instance.
+    #
+    # Raises:
+    #    LoadError: If the node is malformed.
+    #
+    @classmethod
+    def new_from_node(cls, node: MappingNode, basedir: Optional[str] = None) -> "RemoteExecutionSpec":
+        node.validate_keys(["execution-service", "storage-service", "action-cache-service"])
+
+        exec_node = node.get_mapping("execution-service")
+        storage_node = node.get_mapping("storage-service")
+        action_node = node.get_mapping("action-cache-service", default=None)
+
+        exec_spec = RemoteSpec.new_from_node(exec_node, basedir, remote_execution=True)
+        storage_spec = RemoteSpec.new_from_node(storage_node, basedir, remote_execution=True)
+
+        action_spec: Optional[RemoteSpec]
+        if action_node:
+            action_spec = RemoteSpec.new_from_node(action_node, basedir, remote_execution=True)
+        else:
+            action_spec = None
+
+        return cls(exec_spec, storage_spec, action_spec)
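
For completeness, a sketch of composing the new RemoteExecutionSpec
directly rather than via new_from_node(); the URLs and certificate path
are placeholders:

    from buildstream._remotespec import RemoteSpec, RemoteType, RemoteExecutionSpec

    exec_spec = RemoteSpec(RemoteType.ENDPOINT, "http://execution.example.com:50051")
    storage_spec = RemoteSpec(
        RemoteType.ENDPOINT,
        "https://storage.example.com:11002",
        server_cert="/path/to/server.crt",  # read lazily on first access
    )

    # The action-cache-service is optional, so None is a valid action spec.
    re_spec = RemoteExecutionSpec(exec_spec, storage_spec, None)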
diff --git a/src/buildstream/_sourcecache.py b/src/buildstream/_sourcecache.py
index 37d990b..475a166 100644
--- a/src/buildstream/_sourcecache.py
+++ b/src/buildstream/_sourcecache.py
@@ -177,8 +177,8 @@ class SourceCache(AssetCache):
 
         # find configured push remotes for this source
         if self._has_push_remotes:
-            index_remotes = [r for r in self._index_remotes[project] if r.push]
-            storage_remotes = [r for r in self._storage_remotes[project] if r.push]
+            index_remotes = [r for r in self._index_remotes[project] if r.spec.push]
+            storage_remotes = [r for r in self._storage_remotes[project] if r.spec.push]
 
         pushed_storage = False
         pushed_index = False
diff --git a/src/buildstream/sandbox/_sandboxremote.py b/src/buildstream/sandbox/_sandboxremote.py
index 1174b27..6ffe56b 100644
--- a/src/buildstream/sandbox/_sandboxremote.py
+++ b/src/buildstream/sandbox/_sandboxremote.py
@@ -18,28 +18,18 @@
 #  Authors:
 #        Jim MacArthur <ji...@codethink.co.uk>
 
-import os
 import shutil
-from collections import namedtuple
-from urllib.parse import urlparse
 from functools import partial
 
 import grpc
 
-from ..node import Node
 from ._sandboxreapi import SandboxREAPI
 from .. import _signals
 from .._protos.build.bazel.remote.execution.v2 import remote_execution_pb2, remote_execution_pb2_grpc
 from .._protos.google.rpc import code_pb2
 from .._exceptions import BstError, SandboxError
-from .. import _yaml
 from .._protos.google.longrunning import operations_pb2, operations_pb2_grpc
 from .._cas import CASRemote
-from .._remote import RemoteSpec
-
-
-class RemoteExecutionSpec(namedtuple("RemoteExecutionSpec", "exec_service storage_service action_service")):
-    pass
 
 
 # SandboxRemote()
@@ -53,129 +43,15 @@ class SandboxRemote(SandboxREAPI):
 
         self._output_files_required = kwargs.get("output_files_required", True)
 
-        config = kwargs["specs"]  # This should be a RemoteExecutionSpec
-        if config is None:
+        specs = kwargs["specs"]  # This should be a RemoteExecutionSpec
+        if specs is None:
             return
 
-        self.storage_url = config.storage_service["url"]
-        self.exec_url = config.exec_service["url"]
-
-        exec_certs = {}
-        for key in ["client-cert", "client-key", "server-cert"]:
-            if key in config.exec_service:
-                with open(config.exec_service[key], "rb") as f:
-                    exec_certs[key] = f.read()
-
-        self.exec_credentials = grpc.ssl_channel_credentials(
-            root_certificates=exec_certs.get("server-cert"),
-            private_key=exec_certs.get("client-key"),
-            certificate_chain=exec_certs.get("client-cert"),
-        )
-
-        action_certs = {}
-        for key in ["client-cert", "client-key", "server-cert"]:
-            if key in config.action_service:
-                with open(config.action_service[key], "rb") as f:
-                    action_certs[key] = f.read()
-
-        if config.action_service:
-            self.action_url = config.action_service["url"]
-            self.action_instance = config.action_service.get("instance-name", None)
-            self.action_credentials = grpc.ssl_channel_credentials(
-                root_certificates=action_certs.get("server-cert"),
-                private_key=action_certs.get("client-key"),
-                certificate_chain=action_certs.get("client-cert"),
-            )
-        else:
-            self.action_url = None
-            self.action_instance = None
-            self.action_credentials = None
-
-        self.exec_instance = config.exec_service.get("instance-name", None)
-        self.storage_instance = config.storage_service.get("instance-name", None)
-
-        self.storage_remote_spec = RemoteSpec(
-            self.storage_url,
-            push=True,
-            server_cert=config.storage_service.get("server-cert"),
-            client_key=config.storage_service.get("client-key"),
-            client_cert=config.storage_service.get("client-cert"),
-            instance_name=self.storage_instance,
-        )
+        self.storage_spec = specs.storage_spec
+        self.exec_spec = specs.exec_spec
+        self.action_spec = specs.action_spec
         self.operation_name = None
 
-    @staticmethod
-    def specs_from_config_node(config_node, basedir=None):
-        def require_node(config, keyname):
-            val = config.get_mapping(keyname, default=None)
-            if val is None:
-                provenance = remote_config.get_provenance()
-                raise _yaml.LoadError(
-                    "{}: '{}' was not present in the remote "
-                    "execution configuration (remote-execution). ".format(str(provenance), keyname),
-                    _yaml.LoadErrorReason.INVALID_DATA,
-                )
-            return val
-
-        remote_config = config_node.get_mapping("remote-execution", default=None)
-        if remote_config is None:
-            return None
-
-        service_keys = ["execution-service", "storage-service", "action-cache-service"]
-
-        remote_config.validate_keys(["url", *service_keys])
-
-        exec_config = require_node(remote_config, "execution-service")
-        storage_config = require_node(remote_config, "storage-service")
-        action_config = remote_config.get_mapping("action-cache-service", default={})
-
-        tls_keys = ["client-key", "client-cert", "server-cert"]
-
-        exec_config.validate_keys(["url", "instance-name", *tls_keys])
-        storage_config.validate_keys(["url", "instance-name", *tls_keys])
-        if action_config:
-            action_config.validate_keys(["url", "instance-name", *tls_keys])
-
-        # Maintain some backwards compatibility with older configs, in which
-        # 'url' was the only valid key for remote-execution:
-        if "url" in remote_config:
-            if "execution-service" not in remote_config:
-                exec_config = Node.from_dict({"url": remote_config["url"]})
-            else:
-                provenance = remote_config.get_node("url").get_provenance()
-                raise _yaml.LoadError(
-                    "{}: 'url' and 'execution-service' keys were found in the remote "
-                    "execution configuration (remote-execution). "
-                    "You can only specify one of these.".format(str(provenance)),
-                    _yaml.LoadErrorReason.INVALID_DATA,
-                )
-
-        service_configs = [exec_config, storage_config, action_config]
-
-        def resolve_path(path):
-            if basedir and path:
-                return os.path.join(basedir, path)
-            else:
-                return path
-
-        for config_key, config in zip(service_keys, service_configs):
-            # Either both or none of the TLS client key/cert pair must be specified:
-            if ("client-key" in config) != ("client-cert" in config):
-                provenance = remote_config.get_node(config_key).get_provenance()
-                raise _yaml.LoadError(
-                    "{}: TLS client key/cert pair is incomplete. "
-                    "You must specify both 'client-key' and 'client-cert' "
-                    "for authenticated HTTPS connections.".format(str(provenance)),
-                    _yaml.LoadErrorReason.INVALID_DATA,
-                )
-
-            for tls_key in tls_keys:
-                if tls_key in config:
-                    config[tls_key] = resolve_path(config.get_str(tls_key))
-
-        # TODO: we should probably not be stripping node info and rather load files the safe way
-        return RemoteExecutionSpec(*[conf.strip_node_info() for conf in service_configs])
-
     def run_remote_command(self, channel, action_digest):
         # Sends an execution request to the remote execution server.
         #
@@ -184,7 +60,7 @@ class SandboxRemote(SandboxREAPI):
         # Try to create a communication channel to the BuildGrid server.
         stub = remote_execution_pb2_grpc.ExecutionStub(channel)
         request = remote_execution_pb2.ExecuteRequest(
-            instance_name=self.exec_instance, action_digest=action_digest, skip_cache_lookup=False
+            instance_name=self.exec_spec.instance_name, action_digest=action_digest, skip_cache_lookup=False
         )
 
         def __run_remote_command(stub, execute_request=None, running_operation=None):
@@ -217,7 +93,7 @@ class SandboxRemote(SandboxREAPI):
                 ):
                     raise SandboxError(
                         "Failed contacting remote execution server at {}."
-                        "{}: {}".format(self.exec_url, status_code.name, e.details())
+                        "{}: {}".format(self.exec_spec.url, status_code.name, e.details())
                     )
 
                 if running_operation and status_code == grpc.StatusCode.UNIMPLEMENTED:
@@ -284,7 +160,7 @@ class SandboxRemote(SandboxREAPI):
                     # artifact servers.
                     blobs_to_fetch = artifactcache.find_missing_blobs(project, local_missing_blobs)
 
-                with CASRemote(self.storage_remote_spec, cascache) as casremote:
+                with CASRemote(self.storage_spec, cascache) as casremote:
                     cascache.fetch_blobs(casremote, blobs_to_fetch)
 
     def _execute_action(self, action, flags):
@@ -301,12 +177,12 @@ class SandboxRemote(SandboxREAPI):
         action_result = self._check_action_cache(action_digest)
 
         if not action_result:
-            with CASRemote(self.storage_remote_spec, cascache) as casremote:
+            with CASRemote(self.storage_spec, cascache) as casremote:
                 try:
                     casremote.init()
                 except grpc.RpcError as e:
                     raise SandboxError(
-                        "Failed to contact remote execution CAS endpoint at {}: {}".format(self.storage_url, e)
+                        "Failed to contact remote execution CAS endpoint at {}: {}".format(self.storage_spec.url, e)
                     ) from e
 
                 with self._get_context().messenger.timed_activity(
@@ -338,30 +214,14 @@ class SandboxRemote(SandboxREAPI):
                     except grpc.RpcError as e:
                         raise SandboxError("Failed to push source directory to remote: {}".format(e)) from e
 
-            # Next, try to create a communication channel to the BuildGrid server.
-            url = urlparse(self.exec_url)
-            if not url.port:
-                raise SandboxError(
-                    "You must supply a protocol and port number in the execution-service url, "
-                    "for example: http://buildservice:50051."
-                )
-            if url.scheme == "http":
-                channel = grpc.insecure_channel("{}:{}".format(url.hostname, url.port))
-            elif url.scheme == "https":
-                channel = grpc.secure_channel("{}:{}".format(url.hostname, url.port), self.exec_credentials)
-            else:
-                raise SandboxError(
-                    "Remote execution currently only supports the 'http' protocol "
-                    "and '{}' was supplied.".format(url.scheme)
-                )
-
             # Now request to execute the action
+            channel = self.exec_spec.open_channel()
             with channel:
                 operation = self.run_remote_command(channel, action_digest)
                 action_result = self._extract_action_result(operation)
 
         # Fetch outputs
-        with CASRemote(self.storage_remote_spec, cascache) as casremote:
+        with CASRemote(self.storage_spec, cascache) as casremote:
             for output_directory in action_result.output_directories:
                 tree_digest = output_directory.tree_digest
                 if tree_digest is None or not tree_digest.hash:
@@ -394,22 +254,13 @@ class SandboxRemote(SandboxREAPI):
         #
         # Should return either the action response or None if not found, raise
         # Sandboxerror if other grpc error was raised
-        if not self.action_url:
+        if not self.action_spec:
             return None
-        url = urlparse(self.action_url)
-        if not url.port:
-            raise SandboxError(
-                "You must supply a protocol and port number in the action-cache-service url, "
-                "for example: http://buildservice:50051."
-            )
-        if url.scheme == "http":
-            channel = grpc.insecure_channel("{}:{}".format(url.hostname, url.port))
-        elif url.scheme == "https":
-            channel = grpc.secure_channel("{}:{}".format(url.hostname, url.port), self.action_credentials)
 
+        channel = self.action_spec.open_channel()
         with channel:
             request = remote_execution_pb2.GetActionResultRequest(
-                instance_name=self.action_instance, action_digest=action_digest
+                instance_name=self.action_spec.instance_name, action_digest=action_digest
             )
             stub = remote_execution_pb2_grpc.ActionCacheStub(channel)
             try:
diff --git a/tests/artifactcache/config.py b/tests/artifactcache/config.py
index 414a6fe..89d3753 100644
--- a/tests/artifactcache/config.py
+++ b/tests/artifactcache/config.py
@@ -6,7 +6,7 @@ import os
 
 import pytest
 
-from buildstream._remote import RemoteSpec, RemoteType
+from buildstream._remotespec import RemoteSpec, RemoteType
 from buildstream._artifactcache import ArtifactCache
 from buildstream._project import Project
 from buildstream.utils import _deduplicate
@@ -19,14 +19,14 @@ from tests.testutils import dummy_context
 
 
 DATA_DIR = os.path.dirname(os.path.realpath(__file__))
-cache1 = RemoteSpec(url="https://example.com/cache1", push=True)
-cache2 = RemoteSpec(url="https://example.com/cache2", push=False)
-cache3 = RemoteSpec(url="https://example.com/cache3", push=False)
-cache4 = RemoteSpec(url="https://example.com/cache4", push=False)
-cache5 = RemoteSpec(url="https://example.com/cache5", push=False)
-cache6 = RemoteSpec(url="https://example.com/cache6", push=True, type=RemoteType.ALL)
-cache7 = RemoteSpec(url="https://index.example.com/cache1", push=True, type=RemoteType.INDEX)
-cache8 = RemoteSpec(url="https://storage.example.com/cache1", push=True, type=RemoteType.STORAGE)
+cache1 = RemoteSpec(RemoteType.ALL, url="https://example.com/cache1", push=True)
+cache2 = RemoteSpec(RemoteType.ALL, url="https://example.com/cache2", push=False)
+cache3 = RemoteSpec(RemoteType.ALL, url="https://example.com/cache3", push=False)
+cache4 = RemoteSpec(RemoteType.ALL, url="https://example.com/cache4", push=False)
+cache5 = RemoteSpec(RemoteType.ALL, url="https://example.com/cache5", push=False)
+cache6 = RemoteSpec(RemoteType.ALL, url="https://example.com/cache6", push=True)
+cache7 = RemoteSpec(RemoteType.INDEX, url="https://index.example.com/cache1", push=True)
+cache8 = RemoteSpec(RemoteType.STORAGE, url="https://storage.example.com/cache1", push=True)
 
 
 # Generate cache configuration fragments for the user config and project config files.
@@ -45,11 +45,11 @@ def configure_remote_caches(override_caches, project_caches=None, user_caches=No
         user_config["artifacts"] = {
             "url": user_caches[0].url,
             "push": user_caches[0].push,
-            "type": type_strings[user_caches[0].type],
+            "type": type_strings[user_caches[0].remote_type],
         }
     elif len(user_caches) > 1:
         user_config["artifacts"] = [
-            {"url": cache.url, "push": cache.push, "type": type_strings[cache.type]} for cache in user_caches
+            {"url": cache.url, "push": cache.push, "type": type_strings[cache.remote_type]} for cache in user_caches
         ]
 
     if len(override_caches) == 1:
@@ -58,7 +58,7 @@ def configure_remote_caches(override_caches, project_caches=None, user_caches=No
                 "artifacts": {
                     "url": override_caches[0].url,
                     "push": override_caches[0].push,
-                    "type": type_strings[override_caches[0].type],
+                    "type": type_strings[override_caches[0].remote_type],
                 }
             }
         }
@@ -66,7 +66,7 @@ def configure_remote_caches(override_caches, project_caches=None, user_caches=No
         user_config["projects"] = {
             "test": {
                 "artifacts": [
-                    {"url": cache.url, "push": cache.push, "type": type_strings[cache.type]}
+                    {"url": cache.url, "push": cache.push, "type": type_strings[cache.remote_type]}
                     for cache in override_caches
                 ]
             }
@@ -80,7 +80,7 @@ def configure_remote_caches(override_caches, project_caches=None, user_caches=No
                     "artifacts": {
                         "url": project_caches[0].url,
                         "push": project_caches[0].push,
-                        "type": type_strings[project_caches[0].type],
+                        "type": type_strings[project_caches[0].remote_type],
                     }
                 }
             )
@@ -88,7 +88,7 @@ def configure_remote_caches(override_caches, project_caches=None, user_caches=No
             project_config.update(
                 {
                     "artifacts": [
-                        {"url": cache.url, "push": cache.push, "type": type_strings[cache.type]}
+                        {"url": cache.url, "push": cache.push, "type": type_strings[cache.remote_type]}
                         for cache in project_caches
                     ]
                 }
@@ -256,7 +256,8 @@ def test_paths_for_artifact_config_are_expanded(tmpdir, monkeypatch, artifacts_c
     # Build expected artifact config
     artifacts_config = [
         RemoteSpec(
-            url=config["url"],
+            RemoteType.ALL,
+            config["url"],
             push=False,
             server_cert=os.path.expanduser(config["server-cert"]),
             client_cert=os.path.expanduser(config["client-cert"]),

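Taken together, the updated call sites show the shape of the refactored
RemoteSpec API: the remote type moves from an optional type= keyword
(defaulting to RemoteType.ALL) to a mandatory first positional argument,
and the attribute is renamed from .type to .remote_type. A hedged sketch
of a constructor consistent with these tests follows; the actual
definition lives in _remotespec.py, and any field not exercised above is
a guess:

    # Sketch only -- parameter order and defaults are inferred from the
    # test call sites, not taken from the real _remotespec.py.
    class RemoteSpec:
        def __init__(self, remote_type, url, *, push=False,
                     server_cert=None, client_cert=None, client_key=None):
            self.remote_type = remote_type
            self.url = url
            self.push = push
            self.server_cert = server_cert
            self.client_cert = client_cert
            self.client_key = client_key

    # Matches the constructions above, e.g.:
    #   cache7 = RemoteSpec(RemoteType.INDEX,
    #                       url="https://index.example.com/cache1", push=True)
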

[buildstream] 01/03: node.pyi: Adding some missing type annotations

Posted by tv...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

tvb pushed a commit to branch tristan/refactor-remote-config
in repository https://gitbox.apache.org/repos/asf/buildstream.git

commit f95761445ab71a666844f1edc550a65b8094199f
Author: Tristan van Berkom <tr...@codethink.co.uk>
AuthorDate: Sat Jan 16 22:59:36 2021 +0900

    node.pyi: Adding some missing type annotations
---
 src/buildstream/node.pyi | 13 ++++++++++++-
 1 file changed, 12 insertions(+), 1 deletion(-)

diff --git a/src/buildstream/node.pyi b/src/buildstream/node.pyi
index 059bec2..34dc363 100644
--- a/src/buildstream/node.pyi
+++ b/src/buildstream/node.pyi
@@ -1,4 +1,4 @@
-from typing import overload, Generic, List, Mapping, Optional, Sequence, TypeVar, Dict, Any
+from typing import overload, Generic, List, Mapping, Optional, Sequence, TypeVar, Type, Dict, Any
 
 from ._project import Project
 
@@ -16,6 +16,7 @@ class MappingNode(Node, Generic[TNode]):
     def __init__(self, file_index: int, line: int, column: int, value: Mapping[str, TValidNodeValue]) -> None: ...
     def clone(self) -> MappingNode[TNode]: ...
     def validate_keys(self, valid_keys: List[str]): ...
+    def get_bool(self, key: str, default: bool) -> bool: ...
     @overload
     def get_str_list(self, key: str) -> List[str]: ...
     @overload
@@ -35,11 +36,21 @@ class MappingNode(Node, Generic[TNode]):
     @overload
     def get_int(self, key: str, default: Optional[int]) -> Optional[int]: ...
     @overload
+    def get_enum(self, key: str, constraint: object) -> object: ...
+    @overload
+    def get_enum(self, key: str, constraint: object, default: Optional[object]) -> Optional[object]: ...
+    @overload
     def get_mapping(self, key: str) -> "MappingNode": ...
     @overload
     def get_mapping(self, key: str, default: "MappingNode") -> "MappingNode": ...
     @overload
     def get_mapping(self, key: str, default: Optional["MappingNode"]) -> Optional["MappingNode"]: ...
+    @overload
+    def get_node(self, key: str) -> Node: ...
+    @overload
+    def get_node(self, key: str, allowed_types: Optional[List[Type[Node]]]) -> Node: ...
+    @overload
+    def get_node(self, key: str, allowed_types: Optional[List[Type[Node]]], allow_none: bool) -> Optional[Node]: ...
 
 class ScalarNode(Node):
     def as_str(self) -> str: ...
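
For context, these overloads mirror methods that already exist on
MappingNode at runtime; the stub additions only make them visible to
mypy. A small illustrative example of the call shapes the new get_enum
and get_bool annotations are meant to cover (the surrounding function is
hypothetical):

    from buildstream.node import MappingNode
    from buildstream._remotespec import RemoteType

    def read_remote(node: MappingNode) -> None:
        # Two-argument form: the key must be present.
        required = node.get_enum("type", RemoteType)
        # Three-argument form: a (possibly None) default is allowed.
        optional = node.get_enum("type", RemoteType, None)
        # The new get_bool annotation follows the same key/default pattern.
        push = node.get_bool("push", False)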