diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 1c95e3d..f972cd9 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -1,40 +1,40 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
-    rev: v4.1.0
+    rev: v4.3.0
    hooks:
      - id: trailing-whitespace
      - id: check-json
      - id: check-yaml

-  - repo: https://gitlab.com/pycqa/flake8
-    rev: 4.0.1
+  - repo: https://github.com/pycqa/flake8
+    rev: 5.0.4
    hooks:
      - id: flake8
-        additional_dependencies: [flake8-bugbear==22.3.23]
+        additional_dependencies: [flake8-bugbear==22.9.23]

  - repo: https://github.com/codespell-project/codespell
-    rev: v2.1.0
+    rev: v2.2.2
    hooks:
      - id: codespell
        name: Check source code spelling
        stages: [commit]

  - repo: local
    hooks:
      - id: mypy
        name: mypy
        entry: mypy
        args: [swh]
        pass_filenames: false
        language: system
        types: [python]

  - repo: https://github.com/PyCQA/isort
    rev: 5.10.1
    hooks:
      - id: isort

  - repo: https://github.com/python/black
-    rev: 22.3.0
+    rev: 22.10.0
    hooks:
      - id: black
diff --git a/PKG-INFO b/PKG-INFO
index 8f03d16..5f2365e 100644
--- a/PKG-INFO
+++ b/PKG-INFO
@@ -1,57 +1,57 @@
Metadata-Version: 2.1
Name: swh.loader.svn
-Version: 1.3.5
+Version: 1.3.6
Summary: Software Heritage Loader SVN
Home-page: https://forge.softwareheritage.org/diffusion/DLDSVN
Author: Software Heritage developers
Author-email: swh-devel@inria.fr
Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest
Project-URL: Funding, https://www.softwareheritage.org/donate
Project-URL: Source, https://forge.softwareheritage.org/source/swh-loader-svn
Project-URL: Documentation, https://docs.softwareheritage.org/devel/swh-loader-svn/
Classifier: Programming Language :: Python :: 3
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3)
Classifier: Operating System :: OS Independent
Classifier: Development Status :: 5 - Production/Stable
Requires-Python: >=3.7
Description-Content-Type: text/markdown
Provides-Extra: testing
License-File: LICENSE
License-File: AUTHORS

swh-loader-svn
==============

The Software Heritage SVN Loader is a tool and a library to walk a remote svn
repository and inject into the SWH dataset all contained files that weren't
known before.

The main entry points are:
- :class:`swh.loader.svn.loader.SvnLoader` for the main svn loader, which
  ingests content out of a remote svn repository
- :class:`swh.loader.svn.loader.SvnLoaderFromDumpArchive`, which mounts a
  repository out of a svn dump prior to ingesting it
- :class:`swh.loader.svn.loader.SvnLoaderFromRemoteDump`, which mounts a
  repository with svnrdump prior to ingesting its content

# CLI run

With the configuration in /tmp/loader_svn.yml:

```
storage:
  cls: remote
  args:
    url: http://localhost:5002/
```

Run:

```
swh loader --config-file /tmp/loader_svn.yml \
    run svn
```
diff --git a/docs/swh-loader-svn.txt b/docs/swh-loader-svn.txt
index 2f56aaa..2240044 100644
--- a/docs/swh-loader-svn.txt
+++ b/docs/swh-loader-svn.txt
@@ -1,195 +1,195 @@
swh-loader-svn
==============

The goal is to load a svn repository's lifetime logs into swh-storage.

This must be able to deal with:
- an unknown svn repository (resulting in a new origin)
- a known svn repository (starting up from the last known svn revision and
  updating from that moment on)

For a fully detailed speed comparison between versions, please refer to
https://forge.softwareheritage.org/diffusion/DLDSVN/browse/master/docs/comparison-git-svn-swh-svn.org.

# v1

## Description

This is a first basic implementation, a proof-of-concept of sorts.
It is based on checking out the svn repository on disk at each revision and
walking the tree at that revision to compute the swh hashes and store them in
swh-storage.

Conclusion: It is possible but it is slow.

We used git-svn to check whether the hash computations matched, and they did
not. The swh hash computations are correct though; git-svn simply does not
make the same assumptions, so the hashes mismatch.

git-svn:
- does not check out empty folders
- adds metadata at the end of the svn commit message (by default; this can be
  avoided, but then no update, in the swh sense, is possible afterwards)
- integrates the svn repository's uuid in the git revision for the commit
  author (author@)

swh-loader-svn:
- checks out empty folders (which are then used in the swh hashes)
- adds metadata the git way (leveraging git's extra-header slot), so that we
  can deal with svn repository updates

## Pseudo

```
Checkout/Update/Export on disk the first known revision, or 1 if the
repository is unknown
When revision is not 1
  Check whether the history is altered (revision hashes won't match)
  If it is altered, log an error message and stop
  Otherwise continue
Iterate over logs from revision 1 to revision head_revision
  The revision is now rev
  checkout/update/export the revision at rev
  walk the tree directory for that revision and compute hashes
  compute the revision hash
  send the blobs for storage in swh
  send the directories for storage in swh
  send the revision for storage in swh
done
Send the occurrence pointing to the last revision seen
```

## Notes

SVN checkout/update instructions are faster than export since they leverage
svn diffs. But:
- they do keyword expansion (bad for diffs with external tools, hence bad for
  swh)
- we need to ignore the .svn folder since it is present (this needed some code
  adaptation to ignore folders based on a pattern, which is slow as well)

The SVN export instruction is slower than the two previous ones since it does
not use diffs. But:
- there is an option to ignore keyword expansion (good)
- no folders need to be omitted during hash computation from disk (good)

All in all, there is a trade-off to make here. Still, everything was tested
(with much code adapted in the lower-level api) and both approaches are slow.

# v2

## Description

The v2 is more about:
- adding options to match git-svn's hash computations
- trying to improve the performance

So, options are added to:
- remove empty folders when encountered (to ignore them during hash
  computations)
- add an extra commit line to the svn commit message
- (de)activate the svn loader's update routine
- (de)activate the sending of contents/directories/revisions/occurrences/releases
  to swh-storage
- (de)activate the extra-header metadata in the revision hash (thus
  deactivating the svn update options altogether)

As this is meant as the genuine implementation, we adapted the revision
message to also use the repository's uuid in the author's email.

Optimizations are done as well:
-- instead of walking the disk from the top leve at each revision (slow
+- instead of walking the disk from the top level at each revision (slow
  for huge repositories like svn.apache.org), compute, from the svn log's
  changed paths between the previous revision and the current one, the
  lowest common path. Then, walk only that path to compute the updated
  hashes. Then update the in-memory hashes from that path up to the top
  level (less i/o, less RAM used).
- in the loader-core, lifting the existing swh-storage api to filter only the
  missing entities on the client side (there are already filters on the server
  side, but filtering client-side uses less RAM; especially for blobs, since
  we extract the data from disk and store it in RAM, this is now done only for
  unknown blobs, and still before updating the disk with a new revision's
  content)
- in the loader-core, caches are added as well

Now the computations, with the right options, match git-svn's. Still, the
performance compared to git-svn is bad. Taking a closer look at git-svn, it
uses a remote-access approach, that is, it talks directly to the svn server
and computes the hashes at the same time. That is the basis for the v3
implementation.

## Pseudo

Relative to v1, the logic does not change, only the inner implementation.

# v3

## Description

This one is about performance only. It leverages another low-level library
(subvertpy) to permit the same remote-access approach as git-svn. The idea is
to replay the logs and diffs on disk and compute the hashes closely in time
(not as closely as possible though, cf. the Note below).

## Pseudo

```
Do we know the repository (with the swh-svn-update option on)?
Yes
  extract the last swh-known revision from swh-storage
  set start-rev to last-swh-known-revision
  Export on disk the svn at start-rev
  Compute revision hashes (from the top-level tree's hashes + the commit log
  for that revision)
  Does the revision hash match the one in swh-storage?
  (<=> Is the history altered?)
  No
    log an error message and stop
  Yes
    keep the current in-memory hashes (for the following update steps, if any)
No
  set start-rev to 1

Set head-revision to the svn repository's latest head revision
When start-rev is the same as head-revision, we are done.
Otherwise continue

Iterate over the stream of svn logs from start-rev to head-rev
  The current revision is rev
  replay the diffs from the previous rev (rev - 1) to rev and compute hashes
  along the way
  compute the revision hash
  send the blobs for storage in swh
  send the directories for storage in swh
  send the revision for storage in swh
done
Send the occurrence pointing to the last revision seen
```

## Note

There could be room for improvement in the actual implementation here. We
apply the diff on files first and then open the file to compute its hashes
afterwards. If we applied the diff and computed the hashes directly, we would
gain one round-trip. Depending on the files/directories ratio, this could be
significant.
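For illustration, here is a minimal sketch of that remote-access replay loop.
It only relies on subvertpy calls that also appear in swh/loader/svn/replay.py
later in this diff (RemoteAccess, get_latest_revnum, replay); the repository
URL is hypothetical and make_editor() is a hypothetical stand-in for
constructing the Editor class defined there:

```
# Minimal sketch of the v3 remote-access loop, assuming subvertpy is
# installed. make_editor() is hypothetical: it stands for building the
# Editor defined in swh/loader/svn/replay.py, whose callbacks apply
# changes on disk and compute the swh hashes.
from subvertpy.ra import Auth, RemoteAccess, get_username_provider

svn_url = "file:///srv/svn/example"  # hypothetical repository URL
conn = RemoteAccess(svn_url.encode("utf-8"), auth=Auth([get_username_provider()]))

editor = make_editor()  # receives add_file/apply_textdelta/... callbacks
start_rev = 1  # or the last revision already known to swh-storage

for rev in range(start_rev, conn.get_latest_revnum() + 1):
    # replay the delta between rev - 1 and rev; subvertpy drives the
    # editor callbacks, which update the on-disk tree and the in-memory
    # swh hashes along the way
    conn.replay(rev, rev + 1, editor)
    # the revision hash can now be computed and the new contents,
    # directories and revision sent to swh-storage
```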
This approach also has the following benefits:
- no keyword expansion
- no need to ignore the .svn folder (since it does not exist)
diff --git a/requirements-swh.txt b/requirements-swh.txt
index 0bbf186..d9b6f87 100644
--- a/requirements-swh.txt
+++ b/requirements-swh.txt
@@ -1,4 +1,4 @@
swh.storage >= 0.11.3
-swh.model >= 4.3.0
+swh.model >= 6.6.0
swh.scheduler >= 0.0.39
-swh.loader.core >= 3.0.0
+swh.loader.core >= 5.0.0
diff --git a/swh.loader.svn.egg-info/PKG-INFO b/swh.loader.svn.egg-info/PKG-INFO
index 8f03d16..5f2365e 100644
--- a/swh.loader.svn.egg-info/PKG-INFO
+++ b/swh.loader.svn.egg-info/PKG-INFO
@@ -1,57 +1,57 @@
Metadata-Version: 2.1
Name: swh.loader.svn
-Version: 1.3.5
+Version: 1.3.6
Summary: Software Heritage Loader SVN
Home-page: https://forge.softwareheritage.org/diffusion/DLDSVN
Author: Software Heritage developers
Author-email: swh-devel@inria.fr
Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest
Project-URL: Funding, https://www.softwareheritage.org/donate
Project-URL: Source, https://forge.softwareheritage.org/source/swh-loader-svn
Project-URL: Documentation, https://docs.softwareheritage.org/devel/swh-loader-svn/
Classifier: Programming Language :: Python :: 3
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3)
Classifier: Operating System :: OS Independent
Classifier: Development Status :: 5 - Production/Stable
Requires-Python: >=3.7
Description-Content-Type: text/markdown
Provides-Extra: testing
License-File: LICENSE
License-File: AUTHORS

swh-loader-svn
==============

The Software Heritage SVN Loader is a tool and a library to walk a remote svn
repository and inject into the SWH dataset all contained files that weren't
known before.

The main entry points are:
- :class:`swh.loader.svn.loader.SvnLoader` for the main svn loader, which
  ingests content out of a remote svn repository
- :class:`swh.loader.svn.loader.SvnLoaderFromDumpArchive`, which mounts a
  repository out of a svn dump prior to ingesting it
- :class:`swh.loader.svn.loader.SvnLoaderFromRemoteDump`, which mounts a
  repository with svnrdump prior to ingesting its content

# CLI run

With the configuration in /tmp/loader_svn.yml:

```
storage:
  cls: remote
  args:
    url: http://localhost:5002/
```

Run:

```
swh loader --config-file /tmp/loader_svn.yml \
    run svn
```
diff --git a/swh.loader.svn.egg-info/requires.txt b/swh.loader.svn.egg-info/requires.txt
index 289e712..86c1d53 100644
--- a/swh.loader.svn.egg-info/requires.txt
+++ b/swh.loader.svn.egg-info/requires.txt
@@ -1,17 +1,17 @@
click
iso8601
subvertpy>=0.9.4
tenacity>=6.2
typing-extensions
swh.storage>=0.11.3
-swh.model>=4.3.0
+swh.model>=6.6.0
swh.scheduler>=0.0.39
-swh.loader.core>=3.0.0
+swh.loader.core>=5.0.0

[testing]
pytest
pytest-mock
pytest-postgresql
swh.core[http]>=0.0.61
types-click
types-python-dateutil
diff --git a/swh/loader/svn/replay.py b/swh/loader/svn/replay.py
index cf52f6d..96987ee 100644
--- a/swh/loader/svn/replay.py
+++ b/swh/loader/svn/replay.py
@@ -1,1035 +1,1021 @@
# Copyright (C) 2016-2022 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information

"""Remote Access client to svn server.
""" from __future__ import annotations import codecs from collections import defaultdict from dataclasses import dataclass, field from distutils.dir_util import copy_tree from itertools import chain import logging import os import shutil import tempfile from typing import ( TYPE_CHECKING, Any, BinaryIO, Callable, Dict, List, Optional, Set, Tuple, Union, cast, ) import click from subvertpy import SubversionException, delta, properties from subvertpy.ra import Auth, RemoteAccess, get_username_provider from swh.model import from_disk, hashutil from swh.model.from_disk import DiskBackedContent from swh.model.model import Content, Directory, SkippedContent if TYPE_CHECKING: from swh.loader.svn.svn import SvnRepo from swh.loader.svn.utils import ( is_recursive_external, parse_external_definition, svn_urljoin, ) _eol_style = {"native": b"\n", "CRLF": b"\r\n", "LF": b"\n", "CR": b"\r"} logger = logging.getLogger(__name__) def _normalize_line_endings(lines: bytes, eol_style: str = "native") -> bytes: r"""Normalize line endings to unix (\\n), windows (\\r\\n) or mac (\\r). Args: lines: The lines to normalize eol_style: The line ending format as defined for svn:eol-style property. Acceptable values are 'native', 'CRLF', 'LF' and 'CR' Returns: Lines with endings normalized """ if eol_style in _eol_style: lines = lines.replace(_eol_style["CRLF"], _eol_style["LF"]).replace( _eol_style["CR"], _eol_style["LF"] ) if _eol_style[eol_style] != _eol_style["LF"]: lines = lines.replace(_eol_style["LF"], _eol_style[eol_style]) return lines def apply_txdelta_handler( sbuf: bytes, target_stream: BinaryIO ) -> Callable[[Any, bytes, BinaryIO], None]: """Return a function that can be called repeatedly with txdelta windows. When done, closes the target_stream. Adapted from subvertpy.delta.apply_txdelta_handler to close the stream when done. Args: sbuf: Source buffer target_stream: Target stream to write to. Returns: Function to be called to apply txdelta windows """ def apply_window( window: Any, sbuf: bytes = sbuf, target_stream: BinaryIO = target_stream ): if window is None: target_stream.close() return # Last call patch = delta.apply_txdelta_window(sbuf, window) target_stream.write(patch) return apply_window def read_svn_link(data: bytes) -> Tuple[bytes, bytes]: """Read the svn link's content. Args: data: svn link's raw content Returns: The tuple of (filetype, destination path) """ split_byte = b" " first_line = data.split(b"\n")[0] filetype, *src = first_line.split(split_byte) target = split_byte.join(src) return filetype, target def is_file_an_svnlink_p(fullpath: bytes) -> Tuple[bool, bytes]: """Determine if a filepath is an svnlink or something else. Args: fullpath: Full path to the potential symlink to check Returns: Tuple containing a boolean value to determine if it's indeed a symlink (as per svn) and the link target. """ if os.path.islink(fullpath): return False, b"" with open(fullpath, "rb") as f: filetype, src = read_svn_link(f.read()) return filetype == b"link", src def _ra_codecs_error_handler(e: UnicodeError) -> Tuple[Union[str, bytes], int]: """Subvertpy may fail to decode to utf-8 the user svn properties. As they are not used by the loader, return an empty string instead of the decoded content. Args: e: exception raised during the svn properties decoding. """ return "", cast(UnicodeDecodeError, e).end DEFAULT_FLAG = 0 EXEC_FLAG = 1 NOEXEC_FLAG = 2 SVN_PROPERTY_EOL = "svn:eol-style" @dataclass class FileState: """Persists some file states (eg. 
end of lines style) across revisions while replaying them.""" eol_style: Optional[str] = None """EOL state check mess""" svn_special_path_non_link_data: Optional[bytes] = None """keep track of non link file content with svn:special property set""" # default value: 0, 1: set the flag, 2: remove the exec flag executable: int = DEFAULT_FLAG """keep track if file is executable when setting svn:executable property""" link: bool = False """keep track if file is a svn link when setting svn:special property""" class FileEditor: """File Editor in charge of updating file on disk and memory objects.""" __slots__ = [ "directory", "path", "fullpath", "executable", "link", "state", "svnrepo", "editor", ] def __init__( self, directory: from_disk.Directory, rootpath: bytes, path: bytes, state: FileState, svnrepo: SvnRepo, ): self.directory = directory self.path = path self.fullpath = os.path.join(rootpath, path) self.state = state self.svnrepo = svnrepo self.editor = svnrepo.swhreplay.editor - self.editor.modified_paths.add(path) - def change_prop(self, key: str, value: str) -> None: if key == properties.PROP_EXECUTABLE: if value is None: # bit flip off self.state.executable = NOEXEC_FLAG else: self.state.executable = EXEC_FLAG elif key == properties.PROP_SPECIAL: # Possibly a symbolic link. We cannot check further at # that moment though, patch(s) not being applied yet self.state.link = value is not None elif key == SVN_PROPERTY_EOL: # backup end of line style for file self.state.eol_style = value def __make_symlink(self, src: bytes) -> None: """Convert the svnlink to a symlink on disk. This function expects self.fullpath to be a svn link. Args: src: Path to the link's source Return: tuple: The svnlink's data tuple: - type (should be only 'link') - """ os.remove(self.fullpath) os.symlink(src=src, dst=self.fullpath) def __make_svnlink(self) -> bytes: """Convert the symlink to a svnlink on disk. Return: The symlink's svnlink data (``b'type '``) """ # we replace the symlink by a svnlink # to be able to patch the file on future commits src = os.readlink(self.fullpath) os.remove(self.fullpath) sbuf = b"link " + src with open(self.fullpath, "wb") as f: f.write(sbuf) return sbuf def apply_textdelta(self, base_checksum) -> Callable[[Any, bytes, BinaryIO], None]: # if the filepath matches an external, do not apply local patch if self.path in self.editor.external_paths: return lambda *args: None if os.path.lexists(self.fullpath): if os.path.islink(self.fullpath): # svn does not deal with symlink so we transform into # real svn symlink for potential patching in later # commits sbuf = self.__make_svnlink() self.state.link = True else: with open(self.fullpath, "rb") as f: sbuf = f.read() else: sbuf = b"" t = open(self.fullpath, "wb") return apply_txdelta_handler(sbuf, target_stream=t) def close(self) -> None: """When done with the file, this is called. So the file exists and is updated, we can: - adapt accordingly its execution flag if any - compute the objects' checksums - replace the svnlink with a real symlink (for disk computation purposes) """ if self.state.link: # can only check now that the link is a real one # since patch has been applied is_link, src = is_file_an_svnlink_p(self.fullpath) if is_link: self.__make_symlink(src) elif not os.path.isdir(self.fullpath): # not a real link ... # when a file with the svn:special property set is not a svn link, # the svn export operation might extract a truncated version of it # if it is a binary file, so ensure to produce the same file as the # export operation. 
with open(self.fullpath, "rb") as f: content = f.read() self.svnrepo.export( os.path.join(self.svnrepo.remote_url.encode(), self.path), to=self.fullpath, peg_rev=self.editor.revnum, ignore_keywords=True, overwrite=True, ) with open(self.fullpath, "rb") as f: exported_data = f.read() if exported_data != content: # keep track of original file content in order to restore # it if the svn:special property gets unset in another revision self.state.svn_special_path_non_link_data = content elif os.path.islink(self.fullpath): # path was a symbolic link in previous revision but got the property # svn:special unset in current one, revert its content to svn link format self.__make_svnlink() elif self.state.svn_special_path_non_link_data is not None: # path was a non link file with the svn:special property previously set # and got truncated on export, restore its original content with open(self.fullpath, "wb") as f: f.write(self.state.svn_special_path_non_link_data) self.state.svn_special_path_non_link_data = None is_link = os.path.islink(self.fullpath) if not is_link: # if a link, do nothing regarding flag if self.state.executable == EXEC_FLAG: os.chmod(self.fullpath, 0o755) elif self.state.executable == NOEXEC_FLAG: os.chmod(self.fullpath, 0o644) # And now compute file's checksums if self.state.eol_style and not is_link: # ensure to normalize line endings as defined by svn:eol-style # property to get the same file checksum as after an export # or checkout operation with subversion with open(self.fullpath, "rb") as f: data = f.read() data = _normalize_line_endings(data, self.state.eol_style) mode = os.lstat(self.fullpath).st_mode self.directory[self.path] = from_disk.Content.from_bytes( mode=mode, data=data ) else: self.directory[self.path] = from_disk.Content.from_file(path=self.fullpath) ExternalDefinition = Tuple[str, Optional[int], bool] @dataclass class DirState: """Persists some directory states (eg. externals) across revisions while replaying them.""" externals: Dict[str, List[ExternalDefinition]] = field(default_factory=dict) """Map a path in the directory to a list of (external_url, revision, relative_url) targeting it""" class DirEditor: """Directory Editor in charge of updating directory hashes computation. This implementation includes empty folder in the hash computation. """ __slots__ = [ "directory", "rootpath", "path", "file_states", "dir_states", "svnrepo", "editor", "externals", ] def __init__( self, directory: from_disk.Directory, rootpath: bytes, path: bytes, file_states: Dict[bytes, FileState], dir_states: Dict[bytes, DirState], svnrepo: SvnRepo, ): self.directory = directory self.rootpath = rootpath self.path = path # build directory on init os.makedirs(rootpath, exist_ok=True) self.file_states = file_states self.dir_states = dir_states self.svnrepo = svnrepo self.editor = svnrepo.swhreplay.editor self.externals: Dict[str, List[ExternalDefinition]] = {} - # repository root dir has empty path - if path: - self.editor.modified_paths.add(path) - def remove_child(self, path: bytes) -> None: """Remove a path from the current objects. The path can be resolved as link, file or directory. This function takes also care of removing the link between the child and the parent. Args: path: to remove from the current objects. 
""" try: entry_removed = self.directory[path] except KeyError: entry_removed = None else: del self.directory[path] fpath = os.path.join(self.rootpath, path) if isinstance(entry_removed, from_disk.Directory): shutil.rmtree(fpath) else: os.remove(fpath) # when deleting a directory ensure to remove any svn property for the # file it contains as they can be added again later in another revision # without the same property set fullpath = os.path.join(self.rootpath, path) for state_path in list(self.file_states): if state_path.startswith(fullpath + b"/"): del self.file_states[state_path] - self.editor.modified_paths.discard(path) - def open_directory(self, path: str, *args) -> DirEditor: """Updating existing directory.""" return DirEditor( self.directory, rootpath=self.rootpath, path=os.fsencode(path), file_states=self.file_states, dir_states=self.dir_states, svnrepo=self.svnrepo, ) def add_directory(self, path: str, *args) -> DirEditor: """Adding a new directory.""" path_bytes = os.fsencode(path) os.makedirs(os.path.join(self.rootpath, path_bytes), exist_ok=True) if path_bytes and path_bytes not in self.directory: self.dir_states[path_bytes] = DirState() self.directory[path_bytes] = from_disk.Directory() return DirEditor( self.directory, self.rootpath, path_bytes, self.file_states, self.dir_states, svnrepo=self.svnrepo, ) def open_file(self, path: str, *args) -> FileEditor: """Updating existing file.""" path_bytes = os.fsencode(path) self.directory[path_bytes] = from_disk.Content() fullpath = os.path.join(self.rootpath, path_bytes) return FileEditor( self.directory, rootpath=self.rootpath, path=path_bytes, state=self.file_states[fullpath], svnrepo=self.svnrepo, ) def add_file(self, path: str, *args) -> FileEditor: """Creating a new file.""" path_bytes = os.fsencode(path) self.directory[path_bytes] = from_disk.Content() fullpath = os.path.join(self.rootpath, path_bytes) self.file_states[fullpath] = FileState() return FileEditor( self.directory, self.rootpath, path_bytes, state=self.file_states[fullpath], svnrepo=self.svnrepo, ) def change_prop(self, key: str, value: str) -> None: """Change property callback on directory.""" if key == properties.PROP_EXTERNALS: logger.debug( "Setting '%s' property with value '%s' on path %s", key, value, self.path, ) self.externals = defaultdict(list) if value is not None: try: # externals are set on that directory path, parse and store them # for later processing in the close method for external in value.split("\n"): external = external.rstrip("\r") # skip empty line or comment if not external or external.startswith("#"): continue ( path, external_url, revision, relative_url, ) = parse_external_definition( external, os.fsdecode(self.path), self.svnrepo.origin_url ) self.externals[path].append( (external_url, revision, relative_url) ) except ValueError: logger.debug( "Failed to parse external: %s\n" "Externals defined on path %s will not be processed", external, self.path, ) # as the official subversion client, do not process externals in case # of parsing error self.externals = {} if not self.externals: # externals might have been unset on that directory path, # remove associated paths from the reconstructed filesystem externals = self.dir_states[self.path].externals for path in externals.keys(): self.remove_external_path(os.fsencode(path)) self.dir_states[self.path].externals = {} def delete_entry(self, path: str, revision: int) -> None: """Remove a path.""" path_bytes = os.fsencode(path) if path_bytes not in self.editor.external_paths: fullpath = 
os.path.join(self.rootpath, path_bytes) self.file_states.pop(fullpath, None) self.remove_child(path_bytes) def close(self): """Function called when we finish processing a repository. SVN external definitions are processed by it. """ prev_externals = self.dir_states[self.path].externals if self.externals: # externals definition list might have changed in the current replayed # revision, we need to determine if some were removed and delete the # associated paths externals = self.externals prev_externals_set = { (path, url, rev) for path in prev_externals.keys() for (url, rev, _) in prev_externals[path] } externals_set = { (path, url, rev) for path in externals.keys() for (url, rev, _) in externals[path] } old_externals = prev_externals_set - externals_set for path, _, _ in old_externals: self.remove_external_path(os.fsencode(path)) else: # some external paths might have been removed in the current replayed # revision by a delete operation on an overlapping versioned path so we # need to restore them externals = prev_externals # For each external, try to export it in reconstructed filesystem for path, externals_def in externals.items(): for i, external in enumerate(externals_def): external_url, revision, relative_url = external self.process_external( path, external_url, revision, relative_url, remove_target_path=i == 0, ) # backup externals in directory state if self.externals: self.dir_states[self.path].externals = self.externals # do operations below only when closing the root directory if self.path == b"": self.svnrepo.has_relative_externals = any( relative_url for (_, relative_url) in self.editor.valid_externals.values() ) self.svnrepo.has_recursive_externals = any( is_recursive_external( self.svnrepo.origin_url, os.fsdecode(path), external_path, external_url, ) for path, dir_state in self.dir_states.items() for external_path in dir_state.externals.keys() for (external_url, _, _) in dir_state.externals[external_path] ) if self.svnrepo.has_recursive_externals: # If the repository has recursive externals, we stop processing # externals and remove those already exported, # We will then ignore externals when exporting the revision to # check for divergence with the reconstructed filesystem. 
for external_path in list(self.editor.external_paths): self.remove_external_path(external_path, force=True) def process_external( self, path: str, external_url: str, revision: Optional[int], relative_url: bool, remove_target_path: bool = True, ) -> None: external = (external_url, revision, relative_url) dest_path = os.fsencode(path) dest_fullpath = os.path.join(self.path, dest_path) prev_externals = self.dir_states[self.path].externals if ( path in prev_externals and external in prev_externals[path] and dest_fullpath in self.directory ): # external already exported, nothing to do return if is_recursive_external( self.svnrepo.origin_url, os.fsdecode(self.path), path, external_url ): # recursive external, skip it return logger.debug( "Exporting external %s%s to path %s", external_url, f"@{revision}" if revision else "", dest_fullpath, ) if external not in self.editor.externals_cache: try: # try to export external in a temporary path, destination path could # be versioned and must be overridden only if the external URL is # still valid temp_dir = os.fsencode( tempfile.mkdtemp(dir=self.editor.externals_cache_dir) ) temp_path = os.path.join(temp_dir, dest_path) os.makedirs(b"/".join(temp_path.split(b"/")[:-1]), exist_ok=True) if external_url not in self.editor.dead_externals: url = external_url.rstrip("/") origin_url = self.svnrepo.origin_url.rstrip("/") if ( url.startswith(origin_url + "/") and not self.svnrepo.has_relative_externals ): url = url.replace(origin_url, self.svnrepo.remote_url) self.svnrepo.export( url, to=temp_path, peg_rev=revision, ignore_keywords=True, ) self.editor.externals_cache[external] = temp_path except SubversionException as se: # external no longer available (404) logger.debug(se) self.editor.dead_externals.add(external_url) else: temp_path = self.editor.externals_cache[external] # subversion export will always create the subdirectories of the external # path regardless the validity of the remote URL dest_path_split = dest_path.split(b"/") current_path = self.path self.add_directory(os.fsdecode(current_path)) for subpath in dest_path_split[:-1]: current_path = os.path.join(current_path, subpath) self.add_directory(os.fsdecode(current_path)) if os.path.exists(temp_path): # external successfully exported if remove_target_path: # remove previous path in from_disk model self.remove_external_path(dest_path, remove_subpaths=False) # mark external as valid self.editor.valid_externals[dest_fullpath] = ( external_url, relative_url, ) # copy exported path to reconstructed filesystem fullpath = os.path.join(self.rootpath, dest_fullpath) # update from_disk model and store external paths self.editor.external_paths[dest_fullpath] += 1 - self.editor.modified_paths.add(dest_fullpath) if os.path.isfile(temp_path): if os.path.islink(fullpath): # remove destination file if it is a link os.remove(fullpath) shutil.copy(os.fsdecode(temp_path), os.fsdecode(fullpath)) self.directory[dest_fullpath] = from_disk.Content.from_file( path=fullpath ) else: self.add_directory(os.fsdecode(dest_fullpath)) # copy_tree needs sub-directories to exist in destination for root, dirs, files in os.walk(temp_path): for dir in dirs: temp_dir_fullpath = os.path.join(root, dir) if os.path.islink(temp_dir_fullpath): # do not create folder if it's a link or copy_tree will fail continue subdir = temp_dir_fullpath.replace(temp_path + b"/", b"") self.add_directory( os.fsdecode(os.path.join(dest_fullpath, subdir)) ) copy_tree( os.fsdecode(temp_path), os.fsdecode(fullpath), preserve_symlinks=True, ) # TODO: replace code 
above by the line below once we use Python >= 3.8 in production # noqa # shutil.copytree(temp_path, fullpath, symlinks=True, dirs_exist_ok=True) # noqa self.directory[dest_fullpath] = from_disk.Directory.from_disk( path=fullpath ) external_paths = set() for root, dirs, files in os.walk(fullpath): external_paths.update( [ os.path.join(root.replace(self.rootpath + b"/", b""), p) for p in chain(dirs, files) ] ) for external_path in external_paths: self.editor.external_paths[external_path] += 1 - self.editor.modified_paths.update(external_paths) - # ensure hash update for the directory with externals set self.directory[self.path].update_hash(force=True) def remove_external_path( self, external_path: bytes, remove_subpaths: bool = True, force: bool = False ) -> None: """Remove a previously exported SVN external path from the reconstructed filesystem. """ fullpath = os.path.join(self.path, external_path) # decrement number of references for external path when we really remove it # (when remove_subpaths is False, we just cleanup the external path before # copying exported paths in it) if fullpath in self.editor.external_paths and remove_subpaths: self.editor.external_paths[fullpath] -= 1 if ( force or fullpath in self.editor.external_paths and self.editor.external_paths[fullpath] == 0 ): self.remove_child(fullpath) self.editor.external_paths.pop(fullpath, None) self.editor.valid_externals.pop(fullpath, None) for path in list(self.editor.external_paths): if path.startswith(fullpath + b"/"): self.editor.external_paths[path] -= 1 if self.editor.external_paths[path] == 0: self.editor.external_paths.pop(path) if remove_subpaths: subpath_split = external_path.split(b"/")[:-1] for i in reversed(range(1, len(subpath_split) + 1)): # delete external sub-directory only if it is not versioned subpath = os.path.join(self.path, b"/".join(subpath_split[0:i])) try: self.svnrepo.client.info( svn_urljoin(self.svnrepo.remote_url, os.fsdecode(subpath)), peg_revision=self.editor.revnum, revision=self.editor.revnum, ) except SubversionException: self.remove_child(subpath) else: break try: # externals can overlap with versioned files so we must restore # them after removing the path above dest_path = os.path.join(self.rootpath, fullpath) self.svnrepo.client.export( svn_urljoin(self.svnrepo.remote_url, os.fsdecode(fullpath)), to=dest_path, peg_rev=self.editor.revnum, ignore_keywords=True, ) if os.path.isfile(dest_path) or os.path.islink(dest_path): self.directory[fullpath] = from_disk.Content.from_file(path=dest_path) else: self.directory[fullpath] = from_disk.Directory.from_disk(path=dest_path) except SubversionException: pass class Editor: """Editor in charge of replaying svn events and computing objects along. This implementation accounts for empty folder during hash computations. 
""" def __init__( self, rootpath: bytes, directory: from_disk.Directory, svnrepo: SvnRepo, temp_dir: str, ): self.rootpath = rootpath self.directory = directory self.file_states: Dict[bytes, FileState] = defaultdict(FileState) self.dir_states: Dict[bytes, DirState] = defaultdict(DirState) self.external_paths: Dict[bytes, int] = defaultdict(int) self.valid_externals: Dict[bytes, Tuple[str, bool]] = {} self.dead_externals: Set[str] = set() self.externals_cache_dir = tempfile.mkdtemp(dir=temp_dir) self.externals_cache: Dict[ExternalDefinition, bytes] = {} self.svnrepo = svnrepo self.revnum = None - # to store the set of paths added or modified when replaying a revision - self.modified_paths: Set[bytes] = set() def set_target_revision(self, revnum) -> None: self.revnum = revnum def abort(self) -> None: pass def close(self) -> None: pass def open_root(self, base_revnum: int) -> DirEditor: - # a new revision is being replayed so clear the modified_paths set - self.modified_paths.clear() return DirEditor( self.directory, rootpath=self.rootpath, path=b"", file_states=self.file_states, dir_states=self.dir_states, svnrepo=self.svnrepo, ) class Replay: """Replay class.""" def __init__( self, conn: RemoteAccess, rootpath: bytes, svnrepo: SvnRepo, temp_dir: str, directory: Optional[from_disk.Directory] = None, ): self.conn = conn self.rootpath = rootpath if directory is None: directory = from_disk.Directory() self.directory = directory self.editor = Editor( rootpath=rootpath, directory=directory, svnrepo=svnrepo, temp_dir=temp_dir ) def replay(self, rev: int) -> from_disk.Directory: """Replay svn actions between rev and rev+1. This method updates in place the self.editor.directory, as well as the filesystem. Returns: The updated root directory """ codecs.register_error("strict", _ra_codecs_error_handler) self.conn.replay(rev, rev + 1, self.editor) codecs.register_error("strict", codecs.strict_errors) return self.editor.directory def compute_objects( self, rev: int ) -> Tuple[List[Content], List[SkippedContent], List[Directory]]: """Compute objects added or modified at revisions rev. Expects the state to be at previous revision's objects. Args: rev: The revision to start the replay from. Returns: The updated objects between rev and rev+1. Beware that this mutates the filesystem at rootpath accordingly. 
""" self.replay(rev) contents: List[Content] = [] skipped_contents: List[SkippedContent] = [] directories: List[Directory] = [] - directories.append(self.editor.directory.to_model()) - for path in self.editor.modified_paths: - obj = self.directory[path].to_model() + for obj_node in self.directory.collect(): + obj = obj_node.to_model() # type: ignore obj_type = obj.object_type if obj_type in (Content.object_type, DiskBackedContent.object_type): contents.append(obj.with_data()) elif obj_type == SkippedContent.object_type: skipped_contents.append(obj) elif obj_type == Directory.object_type: directories.append(obj) + else: + assert False, obj_type return contents, skipped_contents, directories @click.command() @click.option("--local-url", default="/tmp", help="local svn working copy") @click.option( "--svn-url", default="file:///home/storage/svn/repos/pkg-fox", help="svn repository's url.", ) @click.option( "--revision-start", default=1, type=click.INT, help="svn repository's starting revision.", ) @click.option( "--revision-end", default=-1, type=click.INT, help="svn repository's ending revision.", ) @click.option( "--debug/--nodebug", default=True, help="Indicates if the server should run in debug mode.", ) @click.option( "--cleanup/--nocleanup", default=True, help="Indicates whether to cleanup disk when done or not.", ) def main(local_url, svn_url, revision_start, revision_end, debug, cleanup): """Script to present how to use Replay class.""" conn = RemoteAccess(svn_url.encode("utf-8"), auth=Auth([get_username_provider()])) os.makedirs(local_url, exist_ok=True) rootpath = tempfile.mkdtemp( prefix=local_url, suffix="-" + os.path.basename(svn_url) ) rootpath = os.fsencode(rootpath) # Do not go beyond the repository's latest revision revision_end_max = conn.get_latest_revnum() if revision_end == -1: revision_end = revision_end_max revision_end = min(revision_end, revision_end_max) try: replay = Replay(conn, rootpath) for rev in range(revision_start, revision_end + 1): contents, skipped_contents, directories = replay.compute_objects(rev) print( "r%s %s (%s new contents, %s new directories)" % ( rev, hashutil.hash_to_hex(replay.directory.hash), len(contents) + len(skipped_contents), len(directories), ) ) if debug: print("%s" % rootpath.decode("utf-8")) finally: if cleanup: if os.path.exists(rootpath): shutil.rmtree(rootpath) if __name__ == "__main__": main() diff --git a/swh/loader/svn/tests/test_task.py b/swh/loader/svn/tests/test_task.py index efd88e0..ec92c35 100644 --- a/swh/loader/svn/tests/test_task.py +++ b/swh/loader/svn/tests/test_task.py @@ -1,156 +1,78 @@ # Copyright (C) 2019-2022 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU General Public License version 3, or any later version # See top-level LICENSE file for more information import uuid import pytest from swh.scheduler.model import ListedOrigin, Lister -from swh.scheduler.utils import create_origin_task_dict - -@pytest.fixture(autouse=True) -def celery_worker_and_swh_config(swh_scheduler_celery_worker, swh_config): - pass +NAMESPACE = "swh.loader.svn" @pytest.fixture def svn_lister(): return Lister(name="svn-lister", instance_name="example", id=uuid.uuid4()) @pytest.fixture def svn_listed_origin(svn_lister): return ListedOrigin( lister_id=svn_lister.id, url="svn://example.org/repo", visit_type="svn" ) -@pytest.fixture -def task_dict(svn_lister, svn_listed_origin): - return create_origin_task_dict(svn_listed_origin, svn_lister) - - -def test_svn_loader( - 
mocker, - swh_scheduler_celery_app, -): - mock_loader = mocker.patch("swh.loader.svn.loader.SvnLoader.load") - mock_loader.return_value = {"status": "eventful"} - - res = swh_scheduler_celery_app.send_task( - "swh.loader.svn.tasks.LoadSvnRepository", - kwargs=dict( - url="some-technical-url", origin_url="origin-url", visit_date="now" - ), - ) - assert res - res.wait() - assert res.successful() - - assert res.result == {"status": "eventful"} - - +@pytest.mark.parametrize("extra_loader_arguments", [{}, {"visit_date": "now"}]) def test_svn_loader_for_listed_origin( - mocker, - swh_scheduler_celery_app, - task_dict, -): - mock_loader = mocker.patch("swh.loader.svn.loader.SvnLoader.load") - mock_loader.return_value = {"status": "eventful"} - - res = swh_scheduler_celery_app.send_task( - "swh.loader.svn.tasks.LoadSvnRepository", - args=task_dict["arguments"]["args"], - kwargs=task_dict["arguments"]["kwargs"], - ) - assert res - res.wait() - assert res.successful() - - assert res.result == {"status": "eventful"} - - -def test_svn_loader_from_dump( - mocker, - swh_scheduler_celery_app, + loading_task_creation_for_listed_origin_test, + svn_lister, + svn_listed_origin, + extra_loader_arguments, ): - mock_loader = mocker.patch("swh.loader.svn.loader.SvnLoaderFromDumpArchive.load") - mock_loader.return_value = {"status": "eventful"} + svn_listed_origin.extra_loader_arguments = extra_loader_arguments - res = swh_scheduler_celery_app.send_task( - "swh.loader.svn.tasks.MountAndLoadSvnRepository", - kwargs=dict(url="some-url", archive_path="some-path", visit_date="now"), + loading_task_creation_for_listed_origin_test( + loader_class_name=f"{NAMESPACE}.loader.SvnLoader", + task_function_name=f"{NAMESPACE}.tasks.LoadSvnRepository", + lister=svn_lister, + listed_origin=svn_listed_origin, ) - assert res - res.wait() - assert res.successful() - - assert res.result == {"status": "eventful"} +@pytest.mark.parametrize( + "extra_loader_arguments", + [{"archive_path": "some-path"}, {"archive_path": "some-path", "visit_date": "now"}], +) def test_svn_loader_from_dump_for_listed_origin( - mocker, - swh_scheduler_celery_app, + loading_task_creation_for_listed_origin_test, svn_lister, svn_listed_origin, + extra_loader_arguments, ): - mock_loader = mocker.patch("swh.loader.svn.loader.SvnLoaderFromDumpArchive.load") - mock_loader.return_value = {"status": "eventful"} - - svn_listed_origin.extra_loader_arguments = {"archive_path": "some-path"} - - task_dict = create_origin_task_dict(svn_listed_origin, svn_lister) - - res = swh_scheduler_celery_app.send_task( - "swh.loader.svn.tasks.MountAndLoadSvnRepository", - args=task_dict["arguments"]["args"], - kwargs=task_dict["arguments"]["kwargs"], - ) - assert res - res.wait() - assert res.successful() + svn_listed_origin.extra_loader_arguments = extra_loader_arguments - assert res.result == {"status": "eventful"} - - -def test_svn_loader_from_remote_dump( - mocker, - swh_scheduler_celery_app, -): - mock_loader = mocker.patch("swh.loader.svn.loader.SvnLoaderFromRemoteDump.load") - mock_loader.return_value = {"status": "eventful"} - - res = swh_scheduler_celery_app.send_task( - "swh.loader.svn.tasks.DumpMountAndLoadSvnRepository", - kwargs=dict( - url="some-remote-dump-url", origin_url="origin-url", visit_date="now" - ), + loading_task_creation_for_listed_origin_test( + loader_class_name=f"{NAMESPACE}.loader.SvnLoaderFromDumpArchive", + task_function_name=f"{NAMESPACE}.tasks.MountAndLoadSvnRepository", + lister=svn_lister, + listed_origin=svn_listed_origin, ) - assert res - 
res.wait() - assert res.successful() - - assert res.result == {"status": "eventful"} +@pytest.mark.parametrize("extra_loader_arguments", [{}, {"visit_date": "now"}]) def test_svn_loader_from_remote_dump_for_listed_origin( - mocker, - swh_scheduler_celery_app, - task_dict, + loading_task_creation_for_listed_origin_test, + svn_lister, + svn_listed_origin, + extra_loader_arguments, ): - mock_loader = mocker.patch("swh.loader.svn.loader.SvnLoaderFromRemoteDump.load") - mock_loader.return_value = {"status": "eventful"} + svn_listed_origin.extra_loader_arguments = extra_loader_arguments - res = swh_scheduler_celery_app.send_task( - "swh.loader.svn.tasks.DumpMountAndLoadSvnRepository", - args=task_dict["arguments"]["args"], - kwargs=task_dict["arguments"]["kwargs"], + loading_task_creation_for_listed_origin_test( + loader_class_name=f"{NAMESPACE}.loader.SvnLoaderFromRemoteDump", + task_function_name=f"{NAMESPACE}.tasks.DumpMountAndLoadSvnRepository", + lister=svn_lister, + listed_origin=svn_listed_origin, ) - assert res - res.wait() - assert res.successful() - - assert res.result == {"status": "eventful"} diff --git a/swh/loader/svn/tests/test_utils.py b/swh/loader/svn/tests/test_utils.py index ccc3c2a..b85cb73 100644 --- a/swh/loader/svn/tests/test_utils.py +++ b/swh/loader/svn/tests/test_utils.py @@ -1,428 +1,441 @@ # Copyright (C) 2016-2022 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU General Public License version 3, or any later version # See top-level LICENSE file for more information import logging import os +from pathlib import Path import pty import shutil from subprocess import Popen import pytest from swh.loader.svn import utils def test_outputstream(): stdout_r, stdout_w = pty.openpty() echo = Popen(["echo", "-e", "foo\nbar\nbaz"], stdout=stdout_w) os.close(stdout_w) stdout_stream = utils.OutputStream(stdout_r) lines = [] while True: current_lines, readable = stdout_stream.read_lines() lines += current_lines if not readable: break echo.wait() os.close(stdout_r) assert lines == ["foo", "bar", "baz"] def test_init_svn_repo_from_dump(datadir, tmp_path): """Mounting svn repository out of a dump is ok""" dump_name = "penguinsdbtools2018.dump.gz" dump_path = os.path.join(datadir, dump_name) tmp_repo, repo_path = utils.init_svn_repo_from_dump( dump_path, gzip=True, cleanup_dump=False, root_dir=tmp_path ) assert os.path.exists(dump_path), "Dump path should still exists" assert os.path.exists(repo_path), "Repository should exists" +def test_init_svn_repo_from_dump_svnadmin_error(tmp_path): + """svnadmin load error should be reported in exception text""" + dump_path = os.path.join(tmp_path, "foo") + Path(dump_path).touch() + + with pytest.raises( + ValueError, + match="svnadmin: E200003: Premature end of content data in dumpstream", + ): + utils.init_svn_repo_from_dump(dump_path, cleanup_dump=False, root_dir=tmp_path) + + def test_init_svn_repo_from_dump_and_cleanup(datadir, tmp_path): """Mounting svn repository with a dump cleanup after is ok""" dump_name = "penguinsdbtools2018.dump.gz" dump_ori_path = os.path.join(datadir, dump_name) dump_path = os.path.join(tmp_path, dump_name) shutil.copyfile(dump_ori_path, dump_path) assert os.path.exists(dump_path) assert os.path.exists(dump_ori_path) tmp_repo, repo_path = utils.init_svn_repo_from_dump( dump_path, gzip=True, root_dir=tmp_path ) assert not os.path.exists(dump_path), "Dump path should no longer exists" assert os.path.exists(repo_path), "Repository should exists" 
assert os.path.exists(dump_ori_path), "Original dump path should still exists" def test_init_svn_repo_from_dump_and_cleanup_already_done( datadir, tmp_path, mocker, caplog ): """Mounting svn repository out of a dump is ok""" caplog.set_level(logging.INFO, "swh.loader.svn.utils") dump_name = "penguinsdbtools2018.dump.gz" dump_ori_path = os.path.join(datadir, dump_name) mock_remove = mocker.patch("os.remove") mock_remove.side_effect = FileNotFoundError dump_path = os.path.join(tmp_path, dump_name) shutil.copyfile(dump_ori_path, dump_path) assert os.path.exists(dump_path) assert os.path.exists(dump_ori_path) tmp_repo, repo_path = utils.init_svn_repo_from_dump( dump_path, gzip=True, root_dir=tmp_path ) assert os.path.exists(repo_path), "Repository should exists" assert os.path.exists(dump_ori_path), "Original dump path should still exists" assert len(caplog.record_tuples) == 1 assert "Failure to remove" in caplog.record_tuples[0][2] assert mock_remove.called def test_init_svn_repo_from_archive_dump(datadir, tmp_path): """Mounting svn repository out of an archive dump is ok""" dump_name = "penguinsdbtools2018.dump.gz" dump_path = os.path.join(datadir, dump_name) tmp_repo, repo_path = utils.init_svn_repo_from_archive_dump( dump_path, cleanup_dump=False, root_dir=tmp_path ) assert os.path.exists(dump_path), "Dump path should still exists" assert os.path.exists(repo_path), "Repository should exists" def test_init_svn_repo_from_archive_dump_and_cleanup(datadir, tmp_path): """Mounting svn repository out of a dump is ok""" dump_name = "penguinsdbtools2018.dump.gz" dump_ori_path = os.path.join(datadir, dump_name) dump_path = os.path.join(tmp_path, dump_name) shutil.copyfile(dump_ori_path, dump_path) assert os.path.exists(dump_path) assert os.path.exists(dump_ori_path) tmp_repo, repo_path = utils.init_svn_repo_from_archive_dump( dump_path, root_dir=tmp_path ) assert not os.path.exists(dump_path), "Dump path should no longer exists" assert os.path.exists(repo_path), "Repository should exists" assert os.path.exists(dump_ori_path), "Original dump path should still exists" @pytest.mark.parametrize( "base_url, paths_to_join, expected_result", [ ( "https://svn.example.org", ["repos", "test"], "https://svn.example.org/repos/test", ), ( "https://svn.example.org/", ["repos", "test"], "https://svn.example.org/repos/test", ), ( "https://svn.example.org/foo", ["repos", "test"], "https://svn.example.org/foo/repos/test", ), ( "https://svn.example.org/foo/", ["/repos", "test/"], "https://svn.example.org/foo/repos/test", ), ( "https://svn.example.org/foo", ["../bar"], "https://svn.example.org/bar", ), ], ) def test_svn_urljoin(base_url, paths_to_join, expected_result): assert utils.svn_urljoin(base_url, *paths_to_join) == expected_result @pytest.mark.parametrize( "external, dir_path, repo_url, expected_result", [ # subversion < 1.5 ( "third-party/sounds http://svn.example.com/repos/sounds", "trunk/externals", "http://svn.example.org/repos/test", ("third-party/sounds", "http://svn.example.com/repos/sounds", None, False), ), ( "third-party/skins -r148 http://svn.example.com/skinproj", "trunk/externals", "http://svn.example.org/repos/test", ("third-party/skins", "http://svn.example.com/skinproj", 148, False), ), ( "third-party/skins/toolkit -r21 http://svn.example.com/skin-maker", "trunk/externals", "http://svn.example.org/repos/test", ( "third-party/skins/toolkit", "http://svn.example.com/skin-maker", 21, False, ), ), # subversion >= 1.5 ( " http://svn.example.com/repos/sounds third-party/sounds", "trunk/externals", 
"http://svn.example.org/repos/test", ("third-party/sounds", "http://svn.example.com/repos/sounds", None, False), ), ( "-r148 http://svn.example.com/skinproj third-party/skins", "trunk/externals", "http://svn.example.org/repos/test", ("third-party/skins", "http://svn.example.com/skinproj", 148, False), ), ( "-r 21 http://svn.example.com/skin-maker third-party/skins/toolkit", "trunk/externals", "http://svn.example.org/repos/test", ( "third-party/skins/toolkit", "http://svn.example.com/skin-maker", 21, False, ), ), ( "http://svn.example.com/repos/sounds third-party/sounds", "trunk/externals", "http://svn.example.org/repos/test", ("third-party/sounds", "http://svn.example.com/repos/sounds", None, False), ), ( "http://svn.example.com/skinproj@148 third-party/skins", "trunk/externals", "http://svn.example.org/repos/test", ("third-party/skins", "http://svn.example.com/skinproj", 148, False), ), ( "http://anon:anon@svn.example.com/skin-maker@21 third-party/skins/toolkit", "trunk/externals", "http://svn.example.org/repos/test", ( "third-party/skins/toolkit", "http://anon:anon@svn.example.com/skin-maker", 21, False, ), ), ( "-r21 http://anon:anon@svn.example.com/skin-maker third-party/skins/toolkit", # noqa "trunk/externals", "http://svn.example.org/repos/test", ( "third-party/skins/toolkit", "http://anon:anon@svn.example.com/skin-maker", 21, False, ), ), ( "-r21 http://anon:anon@svn.example.com/skin-maker@21 third-party/skins/toolkit", # noqa "trunk/externals", "http://svn.example.org/repos/test", ( "third-party/skins/toolkit", "http://anon:anon@svn.example.com/skin-maker", 21, False, ), ), # subversion >= 1.5, relative external definitions ( "^/sounds third-party/sounds", "trunk/externals", "http://svn.example.org/repos/test", ( "third-party/sounds", "http://svn.example.org/repos/test/sounds", None, False, ), ), ( "/skinproj@148 third-party/skins", "trunk/externals", "http://svn.example.org/repos/test", ("third-party/skins", "http://svn.example.org/skinproj", 148, True), ), ( "//svn.example.com/skin-maker@21 third-party/skins/toolkit", "trunk/externals", "http://svn.example.org/repos/test", ( "third-party/skins/toolkit", "http://svn.example.com/skin-maker", 21, True, ), ), ( "^/../../skin-maker@21 third-party/skins/toolkit", "trunk/externals", "http://svn.example.org/repos/test", ( "third-party/skins/toolkit", "http://svn.example.org/skin-maker", 21, True, ), ), ( "../skins skins", "trunk/externals", "http://svn.example.org/repos/test", ("skins", "http://svn.example.org/repos/test/trunk/skins", None, False), ), ( "../skins skins", "trunk/externals", "http://svn.example.org/repos/test", ("skins", "http://svn.example.org/repos/test/trunk/skins", None, False), ), # subversion >= 1.6 ( 'http://svn.thirdparty.com/repos/My%20Project "My Project"', "trunk/externals", "http://svn.example.org/repos/test", ("My Project", "http://svn.thirdparty.com/repos/My%20Project", None, False), ), ( 'http://svn.thirdparty.com/repos/My%20%20%20Project "My Project"', "trunk/externals", "http://svn.example.org/repos/test", ( "My Project", "http://svn.thirdparty.com/repos/My%20%20%20Project", None, False, ), ), ( 'http://svn.thirdparty.com/repos/%22Quotes%20Too%22 \\"Quotes\\ Too\\"', "trunk/externals", "http://svn.example.org/repos/test", ( '"Quotes Too"', "http://svn.thirdparty.com/repos/%22Quotes%20Too%22", None, False, ), ), ( 'http://svn.thirdparty.com/repos/%22Quotes%20%20%20Too%22 \\"Quotes\\ \\ \\ Too\\"', # noqa "trunk/externals", "http://svn.example.org/repos/test", ( '"Quotes Too"', 
"http://svn.thirdparty.com/repos/%22Quotes%20%20%20Too%22", None, False, ), ), # edge cases ( '-r1 http://svn.thirdparty.com/repos/test "trunk/PluginFramework"', "trunk/externals", "http://svn.example.org/repos/test", ("trunk/PluginFramework", "http://svn.thirdparty.com/repos/test", 1, False), ), ( "external -r 9 http://svn.thirdparty.com/repos/test", "tags", "http://svn.example.org/repos/test", ("external", "http://svn.thirdparty.com/repos/test", 9, False), ), ( "./external http://svn.thirdparty.com/repos/test", "tags", "http://svn.example.org/repos/test", ("external", "http://svn.thirdparty.com/repos/test", None, False), ), ( ".external http://svn.thirdparty.com/repos/test", "tags", "http://svn.example.org/repos/test", (".external", "http://svn.thirdparty.com/repos/test", None, False), ), ( "external/ http://svn.thirdparty.com/repos/test", "tags", "http://svn.example.org/repos/test", ("external", "http://svn.thirdparty.com/repos/test", None, False), ), ( "external ttp://svn.thirdparty.com/repos/test", "tags", "http://svn.example.org/repos/test", ("external", "ttp://svn.thirdparty.com/repos/test", None, False), ), ( "external http//svn.thirdparty.com/repos/test", "tags", "http://svn.example.org/repos/test", ("external", "http//svn.thirdparty.com/repos/test", None, False), ), ( "C:\\code\\repo\\external http://svn.thirdparty.com/repos/test", "tags", "http://svn.example.org/repos/test", ("C:coderepoexternal", "http://svn.thirdparty.com/repos/test", None, False), ), ( "C:\\\\code\\\\repo\\\\external http://svn.thirdparty.com/repos/test", "tags", "http://svn.example.org/repos/test", ( "C:\\code\\repo\\external", "http://svn.thirdparty.com/repos/test", None, False, ), ), ( "-r 123 http://svn.example.com/repos/sounds@100 third-party/sounds", "trunk/externals", "http://svn.example.org/repos/test", ("third-party/sounds", "http://svn.example.com/repos/sounds", 123, False), ), ( "-r 123 http://svn.example.com/repos/sounds@150 third-party/sounds", "trunk/externals", "http://svn.example.org/repos/test", ("third-party/sounds", "http://svn.example.com/repos/sounds", 123, False), ), ], ) def test_parse_external_definition(external, dir_path, repo_url, expected_result): assert ( utils.parse_external_definition(external, dir_path, repo_url) == expected_result ) diff --git a/swh/loader/svn/utils.py b/swh/loader/svn/utils.py index 3319efe..b188995 100644 --- a/swh/loader/svn/utils.py +++ b/swh/loader/svn/utils.py @@ -1,325 +1,328 @@ # Copyright (C) 2016-2022 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU General Public License version 3, or any later version # See top-level LICENSE file for more information import errno import logging import os import re import shutil -from subprocess import PIPE, Popen, call +from subprocess import PIPE, Popen, call, run import tempfile from typing import Optional, Tuple from urllib.parse import quote, urlparse, urlunparse logger = logging.getLogger(__name__) class OutputStream: """Helper class to read lines from a program output while it is running Args: fileno (int): File descriptor of a program output stream opened in text mode """ def __init__(self, fileno): self._fileno = fileno self._buffer = "" def read_lines(self): """ Read available lines from the output stream and return them. Returns: Tuple[List[str], bool]: A tuple whose first member is the read lines and second member a boolean indicating if there are still some other lines available to read. 
""" try: output = os.read(self._fileno, 1000).decode() except OSError as e: if e.errno != errno.EIO: raise output = "" output = output.replace("\r\n", "\n") lines = output.split("\n") lines[0] = self._buffer + lines[0] if output: self._buffer = lines[-1] return (lines[:-1], True) else: self._buffer = "" if len(lines) == 1 and not lines[0]: lines = [] return (lines, False) def init_svn_repo_from_dump( dump_path: str, prefix: Optional[str] = None, suffix: Optional[str] = None, root_dir: str = "/tmp", gzip: bool = False, cleanup_dump: bool = True, ) -> Tuple[str, str]: """Given a path to a svn dump, initialize an svn repository with the content of said dump. Args: dump_path: The dump to the path prefix: optional prefix file name for the working directory suffix: optional suffix file name for the working directory root_dir: the root directory where the working directory is created gzip: Boolean to determine whether we treat the dump as compressed or not. cleanup_dump: Whether we want this function call to clean up the dump at the end of the repository initialization. Raises: ValueError in case of failure to run the command to uncompress and load the dump. Returns: A tuple: - temporary folder: containing the mounted repository - repo_path: path to the mounted repository inside the temporary folder """ project_name = os.path.basename(os.path.dirname(dump_path)) temp_dir = tempfile.mkdtemp(prefix=prefix, suffix=suffix, dir=root_dir) try: repo_path = os.path.join(temp_dir, project_name) # create the repository that will be loaded with the dump cmd = ["svnadmin", "create", repo_path] r = call(cmd) if r != 0: raise ValueError( "Failed to initialize empty svn repo for %s" % project_name ) read_dump_cmd = ["cat", dump_path] if gzip: read_dump_cmd = ["gzip", "-dc", dump_path] with Popen(read_dump_cmd, stdout=PIPE) as dump: # load dump and bypass properties validation as Unicode decoding errors # are already handled in loader implementation (see _ra_codecs_error_handler # in ra.py) cmd = ["svnadmin", "load", "-q", "--bypass-prop-validation", repo_path] - r = call(cmd, stdin=dump.stdout) - if r != 0: + completed_process = run( + cmd, stdin=dump.stdout, capture_output=True, text=True + ) + if completed_process.returncode != 0: raise ValueError( - "Failed to mount the svn dump for project %s" % project_name + f"Failed to mount the svn dump for project {project_name}\n" + + completed_process.stderr ) return temp_dir, repo_path except Exception as e: shutil.rmtree(temp_dir) raise e finally: if cleanup_dump: try: # At this time, the temporary svn repository is mounted from the dump or # the svn repository failed to mount. Either way, we can drop the dump. os.remove(dump_path) assert not os.path.exists(dump_path) except OSError as e: logger.warn("Failure to remove the dump %s: %s", dump_path, e) def init_svn_repo_from_archive_dump( archive_path: str, prefix: Optional[str] = None, suffix: Optional[str] = None, root_dir: str = "/tmp", cleanup_dump: bool = True, ) -> Tuple[str, str]: """Given a path to an archive containing an svn dump, initializes an svn repository with the content of the uncompressed dump. Args: archive_path: The archive svn dump path prefix: optional prefix file name for the working directory suffix: optional suffix file name for the working directory root_dir: the root directory where the working directory is created gzip: Boolean to determine whether we treat the dump as compressed or not. 
def init_svn_repo_from_archive_dump(
    archive_path: str,
    prefix: Optional[str] = None,
    suffix: Optional[str] = None,
    root_dir: str = "/tmp",
    cleanup_dump: bool = True,
) -> Tuple[str, str]:
    """Given a path to an archive containing an svn dump, initializes an svn
    repository with the content of the uncompressed dump.

    Args:
        archive_path: The archive svn dump path
        prefix: optional prefix file name for the working directory
        suffix: optional suffix file name for the working directory
        root_dir: the root directory where the working directory is created
        cleanup_dump: Whether we want this function call to clean up the dump at
            the end of the repository initialization.

    Raises:
        ValueError in case of failure to run the command to uncompress and load
        the dump.

    Returns:
        A tuple:
            - temporary folder: containing the mounted repository
            - repo_path: path to the mounted repository inside the temporary folder
    """
    return init_svn_repo_from_dump(
        archive_path,
        prefix=prefix,
        suffix=suffix,
        root_dir=root_dir,
        gzip=True,
        cleanup_dump=cleanup_dump,
    )


def svn_urljoin(base_url: str, *args) -> str:
    """Join a base URL and a list of paths in a SVN way.

    For instance:

        - svn_urljoin("http://example.org", "foo", "bar")
          will return "http://example.org/foo/bar"

        - svn_urljoin("http://example.org/foo", "../bar")
          will return "http://example.org/bar"

    Args:
        base_url: Base URL to join paths with
        args: path components

    Returns:
        The joined URL
    """
    parsed_url = urlparse(base_url)
    path = os.path.abspath(
        os.path.join(parsed_url.path or "/", *[arg.strip("/") for arg in args])
    )
    return f"{parsed_url.scheme}://{parsed_url.netloc}{path}"
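A few joins mirroring the docstring above (the URLs are illustrative). Since components are resolved with `os.path.abspath`, `..` segments that climb past the server root are simply clamped to `/`:

```
from swh.loader.svn.utils import svn_urljoin

assert svn_urljoin("http://example.org", "foo", "bar") == "http://example.org/foo/bar"
assert svn_urljoin("http://example.org/foo", "../bar") == "http://example.org/bar"
# climbing above the root does not raise, it just stops at "/"
assert svn_urljoin("http://example.org/foo", "../../bar") == "http://example.org/bar"
```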
def parse_external_definition(
    external: str, dir_path: str, repo_url: str
) -> Tuple[str, str, Optional[int], bool]:
    """Parse a subversion external definition.

    Args:
        external: an external definition, extracted from the lines split of a
            svn:externals property value
        dir_path: The path of the directory in the subversion repository where
            the svn:externals property was set
        repo_url: URL of the subversion repository

    Returns:
        A tuple with the following members:

            - path relative to dir_path where the external should be exported
            - URL of the external to export
            - optional revision of the external to export
            - boolean indicating if the external URL is relative to the
              repository URL and targets a path not in the repository
    """
    path = ""
    external_url = ""
    revision = None
    relative_url = False
    prev_part = None
    # turn multiple spaces into a single one and split on space
    for external_part in external.split():
        if prev_part == "-r":
            # parse revision in the form "-r XXX"
            revision = int(external_part)
        elif external_part.startswith("-r") and external_part != "-r":
            # parse revision in the form "-rXXX"
            revision = int(external_part[2:])
        elif external_part.startswith("^/"):
            # URL relative to the root of the repository in which the svn:externals
            # property is versioned
            external_url = svn_urljoin(repo_url, external_part[2:])
            relative_url = not external_url.startswith(repo_url)
        elif external_part.startswith("//"):
            # URL relative to the scheme of the URL of the directory on which the
            # svn:externals property is set
            scheme = urlparse(repo_url).scheme
            external_url = f"{scheme}:{external_part}"
            relative_url = not external_url.startswith(repo_url)
        elif external_part.startswith("/"):
            # URL relative to the root URL of the server on which the svn:externals
            # property is versioned
            parsed_url = urlparse(repo_url)
            root_url = f"{parsed_url.scheme}://{parsed_url.netloc}"
            external_url = svn_urljoin(root_url, external_part)
            relative_url = not external_url.startswith(repo_url)
        elif external_part.startswith("../"):
            # URL relative to the URL of the directory on which the svn:externals
            # property is set
            external_url = svn_urljoin(repo_url, dir_path, external_part)
            relative_url = not external_url.startswith(repo_url)
        elif re.match(r"^.*:*//.*", external_part):
            # absolute external URL
            external_url = external_part
        # subversion >= 1.6 added a quoting and escape mechanism to the syntax so
        # that the path of the external working copy may contain whitespace.
        elif external_part.startswith('\\"'):
            external_split = external.split('\\"')
            path = [
                e.replace("\\ ", " ")
                for e in external_split
                if e.startswith(external_part[2:])
            ][0]
            path = f'"{path}"'
        elif external_part.endswith('\\"'):
            continue
        elif external_part.startswith('"'):
            external_split = external.split('"')
            path_prefix = external_part.strip('"')
            path = next(
                iter([e for e in external_split if e.startswith(path_prefix)])
            )
        elif external_part.endswith('"'):
            continue
        elif not external_part.startswith("\\") and external_part != "-r":
            # path of the external relative to dir_path
            path = external_part.replace("\\\\", "\\")
            if path == external_part:
                path = external_part.replace("\\", "")
            if path.startswith("./"):
                path = path.replace("./", "", 1)
        prev_part = external_part

    if "@" in external_url:
        # try to extract revision number if external URL is in the form
        # http://svn.example.org/repos/test/path@XXX
        url, revision_s = external_url.rsplit("@", maxsplit=1)
        try:
            # ensure revision_s can be parsed to int
            rev = int(revision_s)
            # -r XXX takes priority over @XXX
            revision = revision or rev
            external_url = url
        except ValueError:
            # handle URL like http://user@svn.example.org/
            pass
    return (path.rstrip("/"), external_url, revision, relative_url)


def is_recursive_external(
    origin_url: str, dir_path: str, external_path: str, external_url: str
) -> bool:
    """
    Check if an external definition can lead to a recursive subversion export
    operation (https://issues.apache.org/jira/browse/SVN-1703).

    Args:
        origin_url: repository URL
        dir_path: path of the directory where external is defined
        external_path: path of the external relative to the directory
        external_url: external URL

    Returns:
        Whether the external definition is recursive
    """
    parsed_origin_url = urlparse(origin_url)
    parsed_external_url = urlparse(external_url)
    external_url = urlunparse(
        parsed_external_url._replace(scheme=parsed_origin_url.scheme)
    )
    return svn_urljoin(origin_url, quote(dir_path), quote(external_path)).startswith(
        external_url
    )
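The two helpers combine as follows when processing an svn:externals line (the repository URLs are the same made-up ones used in the test data above): parse the definition, then check that exporting it would not recurse into the origin itself:

```
from swh.loader.svn.utils import is_recursive_external, parse_external_definition

repo_url = "http://svn.example.org/repos/test"

# "-r148" pins the external to revision 148 of http://svn.example.com/skinproj
path, url, rev, relative = parse_external_definition(
    "-r148 http://svn.example.com/skinproj third-party/skins",
    dir_path="trunk/externals",
    repo_url=repo_url,
)
assert (path, url, rev, relative) == (
    "third-party/skins",
    "http://svn.example.com/skinproj",
    148,
    False,
)

# an external whose URL points back inside the origin would trigger the
# endless export described in SVN-1703
assert is_recursive_external(
    origin_url=repo_url,
    dir_path="trunk/externals",
    external_path="sounds",
    external_url="http://svn.example.org/repos/test",
)
```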
diff --git a/tox.ini b/tox.ini
index 3bcbc8f..3505183 100644
--- a/tox.ini
+++ b/tox.ini
@@ -1,75 +1,76 @@
[tox]
envlist=black,flake8,mypy,py3

[testenv]
extras =
  testing
deps =
  pytest-cov
  swh.scheduler[testing] >= 0.5.0
  dev: pdbpp
commands =
  pytest --cov={envsitepackagesdir}/swh/loader/svn \
         {envsitepackagesdir}/swh/loader/svn \
         --cov-branch {posargs}

[testenv:black]
skip_install = true
deps =
-  black==22.3.0
+  black==22.10.0
commands =
  {envpython} -m black --check swh

[testenv:flake8]
skip_install = true
deps =
-  flake8==4.0.1
-  flake8-bugbear==22.3.23
+  flake8==5.0.4
+  flake8-bugbear==22.9.23
+  pycodestyle==2.9.1
commands =
  {envpython} -m flake8

[testenv:mypy]
extras =
  testing
deps =
  mypy==0.942
commands =
  mypy swh

# build documentation outside swh-environment using the current
# git HEAD of swh-docs, is executed on CI for each diff to prevent
# breaking doc build
[testenv:sphinx]
whitelist_externals = make
usedevelop = true
extras =
  testing
deps =
  # fetch and install swh-docs in develop mode
  -e git+https://forge.softwareheritage.org/source/swh-docs#egg=swh.docs
setenv =
  SWH_PACKAGE_DOC_TOX_BUILD = 1
  # turn warnings into errors
  SPHINXOPTS = -W
commands =
  make -I ../.tox/sphinx/src/swh-docs/swh/ -C docs

# build documentation only inside swh-environment using local state
# of swh-docs package
[testenv:sphinx-dev]
whitelist_externals = make
usedevelop = true
extras =
  testing
deps =
  # install swh-docs in develop mode
  -e ../swh-docs
setenv =
  SWH_PACKAGE_DOC_TOX_BUILD = 1
  # turn warnings into errors
  SPHINXOPTS = -W
commands =
  make -I ../.tox/sphinx-dev/src/swh-docs/swh/ -C docs