diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 1c95e3d..f972cd9 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -1,40 +1,40 @@ repos: - repo: https://github.com/pre-commit/pre-commit-hooks - rev: v4.1.0 + rev: v4.3.0 hooks: - id: trailing-whitespace - id: check-json - id: check-yaml - - repo: https://gitlab.com/pycqa/flake8 - rev: 4.0.1 + - repo: https://github.com/pycqa/flake8 + rev: 5.0.4 hooks: - id: flake8 - additional_dependencies: [flake8-bugbear==22.3.23] + additional_dependencies: [flake8-bugbear==22.9.23] - repo: https://github.com/codespell-project/codespell - rev: v2.1.0 + rev: v2.2.2 hooks: - id: codespell name: Check source code spelling stages: [commit] - repo: local hooks: - id: mypy name: mypy entry: mypy args: [swh] pass_filenames: false language: system types: [python] - repo: https://github.com/PyCQA/isort rev: 5.10.1 hooks: - id: isort - repo: https://github.com/python/black - rev: 22.3.0 + rev: 22.10.0 hooks: - id: black diff --git a/PKG-INFO b/PKG-INFO index 3509a2a..400925a 100644 --- a/PKG-INFO +++ b/PKG-INFO @@ -1,103 +1,103 @@ Metadata-Version: 2.1 Name: swh.loader.git -Version: 1.10.0 +Version: 1.10.1 Summary: Software Heritage git loader Home-page: https://forge.softwareheritage.org/diffusion/DLDG/ Author: Software Heritage developers Author-email: swh-devel@inria.fr Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest Project-URL: Funding, https://www.softwareheritage.org/donate Project-URL: Source, https://forge.softwareheritage.org/source/swh-loader-git Project-URL: Documentation, https://docs.softwareheritage.org/devel/swh-loader-git/ Classifier: Programming Language :: Python :: 3 Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3) Classifier: Operating System :: OS Independent Classifier: Development Status :: 5 - Production/Stable Requires-Python: >=3.7 Description-Content-Type: text/markdown 
Provides-Extra: testing License-File: LICENSE License-File: AUTHORS swh-loader-git ============== The Software Heritage Git Loader is a tool and a library to walk a local Git repository and inject into the SWH dataset all contained files that weren't known before. The main entry points are: - :class:`swh.loader.git.loader.GitLoader` for the main loader which can ingest either local or remote git repository's contents. This is the main implementation deployed in production. - :class:`swh.loader.git.from_disk.GitLoaderFromDisk` which ingests only local git clone repository. - :class:`swh.loader.git.loader.GitLoaderFromArchive` which ingests a git repository wrapped in an archive. License ------- This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. See top-level LICENSE file for the full text of the GNU General Public License along with this program. Dependencies ------------ ### Runtime - python3 - python3-dulwich - python3-retrying - python3-swh.core - python3-swh.model - python3-swh.storage - python3-swh.scheduler ### Test - python3-nose Requirements ------------ - implementation language, Python3 - coding guidelines: conform to PEP8 - Git access: via dulwich CLI Run ---------- You can run the loader from a remote origin (*loader*) or from an origin on disk (*from_disk*) directly by calling: ``` swh loader -C run git ``` or "git_disk". 
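The README's point that only objects "that weren't known before" get injected is the HAVE/WANT exchange detailed in the attic protocol notes: the worker offers the ids it found, the backend replies with the ones it lacks, and only those are transferred. A minimal, self-contained sketch of that filtering step in plain Python — `Backend`, `filter_unknown`, and `load` are illustrative names for this sketch, not the real swh.storage API:

```python
# Sketch of the HAVE/WANT/SAVE filtering described in the protocol notes.
# `Backend` stands in for swh-storage; all names here are illustrative only.

class Backend:
    """Toy stand-in for the archive: remembers which object ids it has."""

    def __init__(self):
        self.known = set()

    def filter_unknown(self, ids):
        # WANT: reply with the ids the archive does not know yet
        return [i for i in ids if i not in self.known]

    def save(self, objects):
        # SAVE/SAVED: persist the received objects
        self.known.update(objects)


def load(worker_ids, backend):
    # HAVE: offer every id found while walking the repository
    wanted = backend.filter_unknown(worker_ids)
    # SAVE: send only the objects the backend asked for
    backend.save(wanted)
    return wanted


backend = Backend()
backend.known.update({"aaa", "bbb"})   # previously archived objects
sent = load(["aaa", "bbb", "ccc"], backend)
# only the previously unknown id is actually transferred
```

In the real loader the filtering runs against swh-storage over the network and objects are sent in batches per type (contents, directories, revisions, ...); the sketch only shows the deduplication logic.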
## Configuration sample /tmp/git.yml: ``` storage: cls: remote args: url: http://localhost:5002/ ``` diff --git a/docs/attic/api-backend-protocol.txt b/docs/attic/api-backend-protocol.txt index e0b39ea..56d82e9 100644 --- a/docs/attic/api-backend-protocol.txt +++ b/docs/attic/api-backend-protocol.txt @@ -1,195 +1,195 @@ Design considerations ===================== # Goal Load the representation of a git, svn, cvs, tarball, et al. repository in software heritage's backend. # Nomenclature cf. swh-sql/swh.sql comments -> FIXME: find a means to compute docs from sql From this point on, `signatures` means: - the git sha1s, plus the sha1 and sha256 of the object's content, for objects of type content - the git sha1s for all other object types (directories, contents, revisions, occurrences, releases) A worker is one instance running swh-loader-git to parse and load a repository in the backend. It is not distributed. The backend api talks with one or many workers. It is distributed. # Scenario In the following, we will describe with different granularities what will happen between 1 worker and the backend api. ## 1 A worker parses a repository. -It sends the parsing result to the backend in muliple requests/responses. +It sends the parsing result to the backend in multiple requests/responses. The worker sends the list of sha1s (git sha1s) encountered. The server responds with the list of unknown sha1s. The worker sends those sha1s and their associated data to the server. The server stores what it receives. ## 2 01. Worker parses the local repository and builds a memory model of it. 02. HAVE: Worker sends the repository's content signatures to the backend for it to filter what it knows. 03. WANT: Backend replies with the unknown content sha1s. 04. SAVE: Worker sends all `content` data through 1 (or more) request(s). 05. SAVED: Backend stores them and finishes the transaction(s). 06. HAVE: Worker sends the repository's directory signatures to the backend for it to filter. 07. 
WANT: Backend replies with unknown directory sha1s. 08. SAVE: Worker sends all `directory`s' data through 1 (or more) request(s). 09. SAVED: Backend stores them and finishes the transaction(s). 10. HAVE: Worker sends the repository's revision signatures to the backend. 11. WANT: Backend replies with unknown revisions' sha1s. 12. SAVE: Worker sends the `revision`s' data through 1 (or more) request(s). 13. SAVED: Backend stores them and finishes the transaction(s). 14. SAVE: Worker sends the repository's occurrences for the backend to save what it does not know yet. 15. SAVE: Worker sends the repository's releases for the backend to save what it does not know yet. 16. Worker is done. ## 3 01. Worker parses the repository and builds a data memory model. The data memory model has the following structure for each possible type: - signatures list - map indexed by git sha1, object representation. The type of object (content, directory, revision, release, occurrence) is kept. 02. Worker sends the sha1s using the api backend's protocol. 03. Api Backend receives the list of sha1s, filters out unknown sha1s and replies to the worker. 04. Worker receives the list of unknown sha1s. The worker builds the list of unknown `content`s. A list of contents, for each content: - git's sha1 (when parsing a git repository) - sha1 content (as per content's sha1) - sha256 content - content's size - content And sends it to the api's backend. 05. Backend receives the data and: - computes from the `content` the signatures (sha1, sha256). FIXME: Not implemented yet - checks the signatures match the client's data FIXME: Not Implemented yet - Stores the content on the file storage - Persists the received data in the db If any error is detected during the process (checksums do not match, writing error, ...), the db transaction is rolled back and a failure is sent to the client. Otherwise, the db transaction is committed and a success is sent back to the client. *Note* Optimization possible: slice in multiple queries. 06. 
Worker receives the result from the api. If failure, worker stops. The task is done. Otherwise, the worker continues by sending the list of `directory` structures. A list of directories, for each directory: - sha1 - directory's content - list of directory entries: - name : relative path to parent entry or root - sha1 : pointer to the object this directory points to - type : whether entry is a file or a dir - perms : unix-like permissions - atime : time of last access FIXME: Not the right time yet - mtime : time of last modification FIXME: Not the right time yet - ctime : time of last status change FIXME: Not the right time yet - directory: parent directory sha1 And sends it to the api's backend. *Note* Optimization possible: slice in multiple queries. 07. Api backend receives the data. Persists the directory's content on the file storage. Persists the directory and directory entries on the db's side with respect to the previous directories and contents stored. If any error is raised, the transaction is rolled back and an error is sent back to the client (worker). Otherwise, the transaction is committed and the success is sent back to the client. 08. Worker receives the result from the api. If failure, worker stops. The task is done. Otherwise, the worker continues by building the list of unknown `revision`s. A list of revisions, for each revision: - sha1, the revision's sha1 - revision's parent sha1s, the list of revision parents - content, the revision's content - revision's date - directory id the revision points to - message, the revision's message - author - committer And sends it to the api's backend. *Note* Optimization possible: slice in multiple queries. 09. Api backend receives data. Persists the revisions' content on the file storage. Persists the directory and directory entries on the db's side with respect to the previous directories and contents stored. If any error is raised, the transaction is rolled back and an error is sent back to the client (worker). 
Otherwise, the transaction is committed and the success is sent back to the client. 10. Worker receives the result. Worker sends the complete occurrences list. A list of occurrences, for each occurrence: - sha1, the sha1 the occurrence points to - reference, the occurrence's name - url-origin, the origin of the repository 11. The backend receives the list of occurrences and persists only what it does not know. Acks the result to the worker. 12. Worker sends the complete releases list. A list of releases, for each release: - sha1, the release sha1 - content, the content of the appointed commit - revision, the sha1 the release points to - name, the release's name - date, the release's date # FIXME: find the tag's date, - author, the release's author information - comment, the release's message 13. The backend receives the list of releases and persists only what it does not know. Acks the result to the worker. 14. Worker receives the result and stops in either case. The task is done. ## Protocol details - worker serializes the content's payload (python data structure) in pickle format - backend deserializes the request's payload as a python data structure diff --git a/requirements-swh.txt b/requirements-swh.txt index 440669d..2a7349a 100644 --- a/requirements-swh.txt +++ b/requirements-swh.txt @@ -1,5 +1,5 @@ swh.core >= 0.0.7 -swh.loader.core >= 3.5.0 +swh.loader.core >= 5.0.0 swh.model >= 4.3.0 swh.scheduler >= 0.0.39 swh.storage >= 0.22.0 diff --git a/swh.loader.git.egg-info/PKG-INFO b/swh.loader.git.egg-info/PKG-INFO index 3509a2a..400925a 100644 --- a/swh.loader.git.egg-info/PKG-INFO +++ b/swh.loader.git.egg-info/PKG-INFO @@ -1,103 +1,103 @@ Metadata-Version: 2.1 Name: swh.loader.git -Version: 1.10.0 +Version: 1.10.1 Summary: Software Heritage git loader Home-page: https://forge.softwareheritage.org/diffusion/DLDG/ Author: Software Heritage developers Author-email: swh-devel@inria.fr Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest Project-URL: 
Funding, https://www.softwareheritage.org/donate Project-URL: Source, https://forge.softwareheritage.org/source/swh-loader-git Project-URL: Documentation, https://docs.softwareheritage.org/devel/swh-loader-git/ Classifier: Programming Language :: Python :: 3 Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3) Classifier: Operating System :: OS Independent Classifier: Development Status :: 5 - Production/Stable Requires-Python: >=3.7 Description-Content-Type: text/markdown Provides-Extra: testing License-File: LICENSE License-File: AUTHORS swh-loader-git ============== The Software Heritage Git Loader is a tool and a library to walk a local Git repository and inject into the SWH dataset all contained files that weren't known before. The main entry points are: - :class:`swh.loader.git.loader.GitLoader` for the main loader which can ingest either local or remote git repository's contents. This is the main implementation deployed in production. - :class:`swh.loader.git.from_disk.GitLoaderFromDisk` which ingests only local git clone repository. - :class:`swh.loader.git.loader.GitLoaderFromArchive` which ingests a git repository wrapped in an archive. License ------- This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. See top-level LICENSE file for the full text of the GNU General Public License along with this program. 
Dependencies ------------ ### Runtime - python3 - python3-dulwich - python3-retrying - python3-swh.core - python3-swh.model - python3-swh.storage - python3-swh.scheduler ### Test - python3-nose Requirements ------------ - implementation language, Python3 - coding guidelines: conform to PEP8 - Git access: via dulwich CLI Run ---------- You can run the loader from a remote origin (*loader*) or from an origin on disk (*from_disk*) directly by calling: ``` swh loader -C run git ``` or "git_disk". ## Configuration sample /tmp/git.yml: ``` storage: cls: remote args: url: http://localhost:5002/ ``` diff --git a/swh.loader.git.egg-info/requires.txt b/swh.loader.git.egg-info/requires.txt index f99fc46..aa9710e 100644 --- a/swh.loader.git.egg-info/requires.txt +++ b/swh.loader.git.egg-info/requires.txt @@ -1,16 +1,16 @@ dulwich>=0.20.43 retrying click swh.core>=0.0.7 -swh.loader.core>=3.5.0 +swh.loader.core>=5.0.0 swh.model>=4.3.0 swh.scheduler>=0.0.39 swh.storage>=0.22.0 [testing] pytest pytest-mock swh.scheduler[testing]>=0.5.0 swh.storage[testing] types-click types-Deprecated diff --git a/swh/loader/git/converters.py b/swh/loader/git/converters.py index 0b7e8fa..ea9ccf3 100644 --- a/swh/loader/git/converters.py +++ b/swh/loader/git/converters.py @@ -1,327 +1,330 @@ # Copyright (C) 2015-2022 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU General Public License version 3, or any later version # See top-level LICENSE file for more information """Convert dulwich objects to dictionaries suitable for swh.storage""" import logging import re from typing import Any, Dict, Optional, cast import attr from dulwich.objects import Blob, Commit, ShaFile, Tag, Tree, _parse_message from swh.model.hashutil import ( DEFAULT_ALGORITHMS, MultiHash, git_object_header, hash_to_bytes, hash_to_hex, ) from swh.model.model import ( BaseContent, Content, Directory, DirectoryEntry, HashableObject, ObjectType, Person, Release, 
Revision, RevisionType, SkippedContent, TargetType, Timestamp, TimestampWithTimezone, ) COMMIT_MODE_MASK = 0o160000 """Mode/perms of tree entries that point to a commit. They are normally equal to this mask, but may have more bits set to 1.""" TREE_MODE_MASK = 0o040000 """Mode/perms of tree entries that point to a tree. They are normally equal to this mask, but may have more bits set to 1.""" AUTHORSHIP_LINE_RE = re.compile(rb"^.*> (?P\S+) (?P\S+)$") logger = logging.getLogger(__name__) class HashMismatch(Exception): pass def check_id(obj: HashableObject) -> None: real_id = obj.compute_hash() if obj.id != real_id: raise HashMismatch( f"Expected {type(obj).__name__} hash to be {obj.id.hex()}, " f"got {real_id.hex()}" ) def dulwich_blob_to_content_id(obj: ShaFile) -> Dict[str, Any]: """Convert a dulwich blob to a Software Heritage content id""" if obj.type_name != b"blob": raise ValueError("Argument is not a blob.") blob = cast(Blob, obj) size = blob.raw_length() data = blob.as_raw_string() hashes = MultiHash.from_data(data, DEFAULT_ALGORITHMS).digest() if hashes["sha1_git"] != blob.sha().digest(): raise HashMismatch( f"Expected Content hash to be {blob.sha().digest().hex()}, " f"got {hashes['sha1_git'].hex()}" ) hashes["length"] = size return hashes def dulwich_blob_to_content(obj: ShaFile, max_content_size=None) -> BaseContent: """Convert a dulwich blob to a Software Heritage content""" if obj.type_name != b"blob": raise ValueError("Argument is not a blob.") blob = cast(Blob, obj) hashes = dulwich_blob_to_content_id(blob) if max_content_size is not None and hashes["length"] >= max_content_size: return SkippedContent( status="absent", reason="Content too large", **hashes, ) else: return Content( data=blob.as_raw_string(), status="visible", **hashes, ) def dulwich_tree_to_directory(obj: ShaFile) -> Directory: """Format a tree as a directory""" if obj.type_name != b"tree": raise ValueError("Argument is not a tree.") tree = cast(Tree, obj) entries = [] for entry in 
tree.iteritems(): if entry.mode & COMMIT_MODE_MASK == COMMIT_MODE_MASK: type_ = "rev" elif entry.mode & TREE_MODE_MASK == TREE_MODE_MASK: type_ = "dir" else: type_ = "file" entries.append( DirectoryEntry( type=type_, perms=entry.mode, - name=entry.path, + name=entry.path.replace( + b"/", b"_" + ), # '/' is very rare, and invalid in SWH. target=hash_to_bytes(entry.sha.decode("ascii")), ) ) dir_ = Directory( id=tree.sha().digest(), entries=tuple(entries), ) if dir_.compute_hash() != dir_.id: expected_id = dir_.id actual_id = dir_.compute_hash() logger.warning( "Expected directory to have id %s, but got %s. Recording raw_manifest.", hash_to_hex(expected_id), hash_to_hex(actual_id), ) raw_string = tree.as_raw_string() dir_ = attr.evolve( dir_, raw_manifest=git_object_header("tree", len(raw_string)) + raw_string ) check_id(dir_) return dir_ def parse_author(name_email: bytes) -> Person: """Parse an author line""" return Person.from_fullname(name_email) def dulwich_tsinfo_to_timestamp( timestamp, timezone: int, timezone_neg_utc: bool, timezone_bytes: Optional[bytes], ) -> TimestampWithTimezone: """Convert the dulwich timestamp information to a structure compatible with Software Heritage.""" ts = Timestamp( seconds=int(timestamp), microseconds=0, ) if timezone_bytes is None: # Failed to parse from the raw manifest, fallback to what Dulwich managed to # parse. 
return TimestampWithTimezone.from_numeric_offset( timestamp=ts, offset=timezone // 60, negative_utc=timezone_neg_utc, ) else: return TimestampWithTimezone(timestamp=ts, offset_bytes=timezone_bytes) def dulwich_commit_to_revision(obj: ShaFile) -> Revision: if obj.type_name != b"commit": raise ValueError("Argument is not a commit.") commit = cast(Commit, obj) author_timezone = None committer_timezone = None + assert commit._chunked_text is not None # to keep mypy happy for (field, value) in _parse_message(commit._chunked_text): if field == b"author": m = AUTHORSHIP_LINE_RE.match(value) if m: author_timezone = m.group("timezone") elif field == b"committer": m = AUTHORSHIP_LINE_RE.match(value) if m: committer_timezone = m.group("timezone") extra_headers = [] if commit.encoding is not None: extra_headers.append((b"encoding", commit.encoding)) if commit.mergetag: for mergetag in commit.mergetag: raw_string = mergetag.as_raw_string() assert raw_string.endswith(b"\n") extra_headers.append((b"mergetag", raw_string[:-1])) if commit.extra: extra_headers.extend((k, v) for k, v in commit.extra) if commit.gpgsig: extra_headers.append((b"gpgsig", commit.gpgsig)) rev = Revision( id=commit.sha().digest(), author=parse_author(commit.author), date=dulwich_tsinfo_to_timestamp( commit.author_time, commit.author_timezone, commit._author_timezone_neg_utc, author_timezone, ), committer=parse_author(commit.committer), committer_date=dulwich_tsinfo_to_timestamp( commit.commit_time, commit.commit_timezone, commit._commit_timezone_neg_utc, committer_timezone, ), type=RevisionType.GIT, directory=bytes.fromhex(commit.tree.decode()), message=commit.message, metadata=None, extra_headers=tuple(extra_headers), synthetic=False, parents=tuple(bytes.fromhex(p.decode()) for p in commit.parents), ) if rev.compute_hash() != rev.id: expected_id = rev.id actual_id = rev.compute_hash() logger.warning( "Expected revision to have id %s, but got %s. 
Recording raw_manifest.", hash_to_hex(expected_id), hash_to_hex(actual_id), ) raw_string = commit.as_raw_string() rev = attr.evolve( rev, raw_manifest=git_object_header("commit", len(raw_string)) + raw_string ) check_id(rev) return rev DULWICH_TARGET_TYPES = { b"blob": TargetType.CONTENT, b"tree": TargetType.DIRECTORY, b"commit": TargetType.REVISION, b"tag": TargetType.RELEASE, } DULWICH_OBJECT_TYPES = { b"blob": ObjectType.CONTENT, b"tree": ObjectType.DIRECTORY, b"commit": ObjectType.REVISION, b"tag": ObjectType.RELEASE, } def dulwich_tag_to_release(obj: ShaFile) -> Release: if obj.type_name != b"tag": raise ValueError("Argument is not a tag.") tag = cast(Tag, obj) tagger_timezone = None # FIXME: _parse_message is a private function from Dulwich. for (field, value) in _parse_message(tag.as_raw_chunks()): if field == b"tagger": m = AUTHORSHIP_LINE_RE.match(value) if m: tagger_timezone = m.group("timezone") target_type, target = tag.object if tag.tagger: author: Optional[Person] = parse_author(tag.tagger) if tag.tag_time is None: date = None else: date = dulwich_tsinfo_to_timestamp( tag.tag_time, tag.tag_timezone, tag._tag_timezone_neg_utc, tagger_timezone, ) else: author = date = None message = tag.message if tag.signature: message += tag.signature rel = Release( id=tag.sha().digest(), author=author, date=date, name=tag.name, target=bytes.fromhex(target.decode()), target_type=DULWICH_OBJECT_TYPES[target_type.type_name], message=message, metadata=None, synthetic=False, ) if rel.compute_hash() != rel.id: expected_id = rel.id actual_id = rel.compute_hash() logger.warning( "Expected release to have id %s, but got %s. 
Recording raw_manifest.", hash_to_hex(expected_id), hash_to_hex(actual_id), ) raw_string = tag.as_raw_string() rel = attr.evolve( rel, raw_manifest=git_object_header("tag", len(raw_string)) + raw_string ) check_id(rel) return rel diff --git a/swh/loader/git/dumb.py b/swh/loader/git/dumb.py index c34c19b..35826e9 100644 --- a/swh/loader/git/dumb.py +++ b/swh/loader/git/dumb.py @@ -1,204 +1,204 @@ # Copyright (C) 2021 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU General Public License version 3, or any later version # See top-level LICENSE file for more information from __future__ import annotations from collections import defaultdict import logging import stat import struct from tempfile import SpooledTemporaryFile from typing import TYPE_CHECKING, Callable, Dict, Iterable, List, Set, cast import urllib.parse from dulwich.errors import NotGitRepository from dulwich.objects import S_IFGITLINK, Commit, ShaFile, Tree from dulwich.pack import Pack, PackData, PackIndex, load_pack_index_file import requests from swh.loader.git.utils import HexBytes if TYPE_CHECKING: from .loader import RepoRepresentation logger = logging.getLogger(__name__) HEADERS = {"User-Agent": "Software Heritage dumb Git loader"} def check_protocol(repo_url: str) -> bool: """Checks if a git repository can be cloned using the dumb protocol. Args: repo_url: Base URL of a git repository Returns: Whether the dumb protocol is supported. 
""" if not repo_url.startswith("http"): return False url = urllib.parse.urljoin( repo_url.rstrip("/") + "/", "info/refs?service=git-upload-pack/" ) logger.debug("Fetching %s", url) response = requests.get(url, headers=HEADERS) content_type = response.headers.get("Content-Type") return ( response.status_code in ( 200, 304, ) # header is not mandatory in protocol specification and (content_type is None or not content_type.startswith("application/x-git-")) ) class GitObjectsFetcher: """Git objects fetcher using dumb HTTP protocol. Fetches a set of git objects for a repository according to its archival state by Software Heritage and provides iterators on them. Args: repo_url: Base URL of a git repository base_repo: State of repository archived by Software Heritage """ def __init__(self, repo_url: str, base_repo: RepoRepresentation): self._session = requests.Session() self.repo_url = repo_url self.base_repo = base_repo self.objects: Dict[bytes, Set[bytes]] = defaultdict(set) self.refs = self._get_refs() self.head = self._get_head() if self.refs else {} self.packs = self._get_packs() def fetch_object_ids(self) -> None: """Fetches identifiers of git objects to load into the archive.""" wants = self.base_repo.determine_wants(self.refs) # process refs commit_objects = [] for ref in wants: ref_object = self._get_git_object(ref) - if ref_object.get_type() == Commit.type_num: + if ref_object.type_num == Commit.type_num: commit_objects.append(cast(Commit, ref_object)) self.objects[b"commit"].add(ref) else: self.objects[b"tag"].add(ref) # perform DFS on commits graph while commit_objects: commit = commit_objects.pop() # fetch tree and blob ids recursively self._fetch_tree_objects(commit.tree) for parent in commit.parents: if ( # commit not already seen in the current load parent not in self.objects[b"commit"] # commit not already archived by a previous load and parent not in self.base_repo.heads ): commit_objects.append(cast(Commit, self._get_git_object(parent))) 
self.objects[b"commit"].add(parent) def iter_objects(self, object_type: bytes) -> Iterable[ShaFile]: """Returns a generator on fetched git objects per type. Args: object_type: Git object type, either b"blob", b"commit", b"tag" or b"tree" Returns: A generator fetching git objects on the fly. """ return map(self._get_git_object, self.objects[object_type]) def _http_get(self, path: str) -> SpooledTemporaryFile: url = urllib.parse.urljoin(self.repo_url.rstrip("/") + "/", path) logger.debug("Fetching %s", url) response = self._session.get(url, headers=HEADERS) buffer = SpooledTemporaryFile(max_size=100 * 1024 * 1024) for chunk in response.iter_content(chunk_size=10 * 1024 * 1024): buffer.write(chunk) buffer.flush() buffer.seek(0) return buffer def _get_refs(self) -> Dict[bytes, HexBytes]: refs = {} refs_resp_bytes = self._http_get("info/refs") for ref_line in refs_resp_bytes.readlines(): ref_target, ref_name = ref_line.replace(b"\n", b"").split(b"\t") refs[ref_name] = ref_target return refs def _get_head(self) -> Dict[bytes, HexBytes]: head_resp_bytes = self._http_get("HEAD") _, head_target = head_resp_bytes.readline().replace(b"\n", b"").split(b" ") return {b"HEAD": head_target} def _get_pack_data(self, pack_name: str) -> Callable[[], PackData]: def _pack_data() -> PackData: pack_data_bytes = self._http_get(f"objects/pack/{pack_name}") return PackData(pack_name, file=pack_data_bytes) return _pack_data def _get_pack_idx(self, pack_idx_name: str) -> Callable[[], PackIndex]: def _pack_idx() -> PackIndex: pack_idx_bytes = self._http_get(f"objects/pack/{pack_idx_name}") return load_pack_index_file(pack_idx_name, pack_idx_bytes) return _pack_idx def _get_packs(self) -> List[Pack]: packs = [] packs_info_bytes = self._http_get("objects/info/packs") packs_info = packs_info_bytes.read().decode() for pack_info in packs_info.split("\n"): if pack_info: pack_name = pack_info.split(" ")[1] pack_idx_name = pack_name.replace(".pack", ".idx") # pack index and data file will be lazily 
fetched when required packs.append( Pack.from_lazy_objects( self._get_pack_data(pack_name), self._get_pack_idx(pack_idx_name), ) ) return packs def _get_git_object(self, sha: bytes) -> ShaFile: # try to get the object from a pack file first to avoid flooding # git server with numerous HTTP requests for pack in list(self.packs): try: if sha in pack: return pack[sha] except (NotGitRepository, struct.error): # missing (dulwich http client raises NotGitRepository on 404) # or invalid pack index/content, remove it from global packs list logger.debug("A pack file is missing or its content is invalid") self.packs.remove(pack) # fetch it from objects/ directory otherwise sha_hex = sha.decode() object_path = f"objects/{sha_hex[:2]}/{sha_hex[2:]}" return ShaFile.from_file(self._http_get(object_path)) def _fetch_tree_objects(self, sha: bytes) -> None: if sha not in self.objects[b"tree"]: tree = cast(Tree, self._get_git_object(sha)) self.objects[b"tree"].add(sha) for item in tree.items(): if item.mode == S_IFGITLINK: # skip submodules as objects are not stored in repository continue if item.mode & stat.S_IFDIR: self._fetch_tree_objects(item.sha) else: self.objects[b"blob"].add(item.sha) diff --git a/swh/loader/git/tests/test_converters.py b/swh/loader/git/tests/test_converters.py index 9add9ba..833f205 100644 --- a/swh/loader/git/tests/test_converters.py +++ b/swh/loader/git/tests/test_converters.py @@ -1,870 +1,923 @@ # Copyright (C) 2015-2022 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU General Public License version 3, or any later version # See top-level LICENSE file for more information import copy import datetime import os import shutil import subprocess import tempfile import dulwich.objects import dulwich.repo import pytest import swh.loader.git.converters as converters from swh.model.hashutil import bytehex_to_hash, hash_to_bytehex, hash_to_bytes from swh.model.model import ( Content, Directory, 
DirectoryEntry, ObjectType, Person, Release, Revision, RevisionType, Timestamp, TimestampWithTimezone, ) TEST_DATA = os.path.join(os.path.dirname(__file__), "data") GPGSIG = ( b"-----BEGIN PGP SIGNATURE-----\n" b"\n" b"iQJLBAABCAA1FiEEAOWDevQbOk/9ITMF6ImSleOlnUcFAl8EnS4XHGRhdmlkLmRv\n" b"dWFyZEBzZGZhMy5vcmcACgkQ6ImSleOlnUdrqQ/8C5RO4NZ5Qr/dwAy2cPA7ktkY\n" b"1oUjKtspQoPbC1X3MXVa1aWo9B3KuOMR2URw44RhMNFwjccLOhfss06E8p7CZr2H\n" b"uR3CzdDw7i52jHLCL2M2ZMaPAEbQuHjXWiUWIUXz9So8YwpTyd2XQneyOC2RDDEI\n" b"I2NVbmiMeDz33jJYPrQO0QayW+ErW+xgBF7N/qS9jFWsdV1ZNfn9NxkTH8UdGuAX\n" b"583P+0tVC2DjXc6vORVhyFzyfn1A9wHosbtWI2Mpa+zezPjoPSkcyQAJu2GyOkMC\n" b"YzSjJdQVqyovo+INkIf6PuUNdp41886BG/06xwT8fl4sVsyO51lNIfgH0DMwfTTB\n" b"ZgThYnvvO7SrXDm3QzBTXkvAiHiFFl3iNyGkCyxvgVmaTntuFT+cP+HD/pCiGaC+\n" b"jHzRwfUrmuLd/lLPyq3JXBibyjnfd3SVS+7q1NZHJ4WUmCboZ0+pfrEl65mEQ/Hz\n" b"J1qCwQ/3SsTB77ANf6lLzGSowjjrtHcBTkTbFxR4ACUhiBbosyDKpHTM7fzGFGjo\n" b"EIjohzrEnqR3bbyxJkK+nxoOByhIRdowgyeJ02I4neMyLJqcaup8NMWCddxqjaPt\n" b"YobghnjaDqEd+suL/v83hbZUAZHNO3i1OZYGMqzp1WHikDPoTwGP76baqBoXi56T\n" b"4WSpxCAJRDODHLk1HgU=\n" b"=73wF" b"\n" b"-----END PGP SIGNATURE-----" ) MERGETAG = ( b"object 9768d0b576dbaaecd80abedad6dfd0d72f1476da\n" b"type commit\n" b"tag v0.0.1\n" b"tagger David Douard 1594138133 +0200\n" b"\n" b"v0.0.1\n" b"-----BEGIN PGP SIGNATURE-----\n" b"\n" b"iQJLBAABCAA1FiEEAOWDevQbOk/9ITMF6ImSleOlnUcFAl8EnhkXHGRhdmlkLmRv\n" b"dWFyZEBzZGZhMy5vcmcACgkQ6ImSleOlnUcdzg//ZW9y2xU5JFQuUsBe/LfKrs+m\n" b"0ohVInPKXwAfpB3+gn/XtTSLe+Nnr8+QEZyVRCUz2gpGZ2tNqRjhYLIX4x5KKlaV\n" b"rfl/6Cy7zibsxxuzA1h7HylCs3IPsueQpznVHUwD9jQ5baGJSc2Lt1LufXTueHZJ\n" b"Oc0oLiP5xCZcPqeX8R/4zUUImJZ1QrPeKmQ/3F+Iq62iWp7nWDp8PtwpykSiYlNf\n" b"KrJM8omGvrlrWLtfPNUaQFClXwnwK1/HyNY2kYan6K5NtsIl2UX0LZ42GkRjJIrb\n" b"q4TFIZWZ6xndtEhHEX6B8Q5TZV6sqPgNnfGpbhj8BDoZgjD0Y43fzfDiZ0Bl2tph\n" b"tXaLg3SX/UUjFVzC1zkoQ2MR7+j8NVKauAsBINpKF4pMGsrsVRk8764pgO49iQ+S\n" b"8JVCVV76dNNm1gd7BbhFAdIAiegBtsEF69niJBoHKYLlrT8E8hDkF/gk4IkimPqf\n" 
b"UHtw/fPhVW3B4G2skd013NJGcnRj5oKtaM99d2Roxc3vhSRiTsoaM8BM9NDvLmJg\n" b"35rWEOnet39iJIMCHk3AYaJl8QmUhllDdr6vygaBVeVEf27m2c3NzONmIKpWqa2J\n" b"kTpF4cmzHYro34G7WuJ1bYvmLb6qWNQt9wd8RW+J1kVm5I8dkjPzLUougBpOd0YL\n" b"Bl5UTQILbV4Tv8ZlmJM=\n" b"=s1lv\n" b"-----END PGP SIGNATURE-----" ) class SWHObjectType: """Dulwich lookalike ObjectType class""" def __init__(self, type_name): self.type_name = type_name @pytest.mark.fs class TestConverters: @classmethod def setup_class(cls): cls.repo_path = tempfile.mkdtemp() bundle = os.path.join(TEST_DATA, "git-repos", "example-submodule.bundle") git = subprocess.Popen( ["git", "clone", "--quiet", "--bare", "--mirror", bundle, cls.repo_path], cwd=TEST_DATA, ) # flush stdout of xz git.communicate() cls.repo = dulwich.repo.Repo(cls.repo_path) @classmethod def tearDownClass(cls): super().tearDownClass() shutil.rmtree(cls.repo_path) def test_blob_to_content(self): content_id = b"28c6f4023d65f74e3b59a2dea3c4277ed9ee07b0" content = converters.dulwich_blob_to_content(self.repo[content_id]) expected_content = Content( sha1_git=bytehex_to_hash(content_id), sha1=hash_to_bytes("4850a3420a2262ff061cb296fb915430fa92301c"), sha256=hash_to_bytes( "fee7c8a485a10321ad94b64135073cb5" "5f22cb9f57fa2417d2adfb09d310adef" ), blake2s256=hash_to_bytes( "5d71873f42a137f6d89286e43677721e574" "1fa05ce4cd5e3c7ea7c44d4c2d10b" ), data=( b'[submodule "example-dependency"]\n' b"\tpath = example-dependency\n" b"\turl = https://github.com/githubtraining/" b"example-dependency.git\n" ), length=124, status="visible", ) assert content == expected_content def test_corrupt_blob(self, mocker): # has a signature sha1 = hash_to_bytes("28c6f4023d65f74e3b59a2dea3c4277ed9ee07b0") blob = copy.deepcopy(self.repo[hash_to_bytehex(sha1)]) class hasher: def digest(): return sha1 blob._sha = hasher converters.dulwich_blob_to_content(blob) converters.dulwich_blob_to_content_id(blob) sha1 = hash_to_bytes("1234" * 10) with pytest.raises(converters.HashMismatch): 
            converters.dulwich_blob_to_content(blob)
        with pytest.raises(converters.HashMismatch):
            converters.dulwich_blob_to_content_id(blob)

    def test_conversion_wrong_input(self):
        class Something:
            type_name = b"something-not-the-right-type"

        m = {
            "blob": converters.dulwich_blob_to_content,
            "tree": converters.dulwich_tree_to_directory,
            "commit": converters.dulwich_commit_to_revision,
            "tag": converters.dulwich_tag_to_release,
        }

        for _callable in m.values():
            with pytest.raises(ValueError):
                _callable(Something())

    def test_corrupt_tree(self):
        sha1 = b"a9b41fc6347d778f16c4380b598d8083e9b4c1fb"
        target = b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"
        tree = dulwich.objects.Tree()
        tree.add(b"file1", 0o644, target)
        assert tree.sha().hexdigest() == sha1.decode()
        converters.dulwich_tree_to_directory(tree)

        original_sha = tree.sha()

        tree.add(b"file2", 0o644, target)
        tree.sha()  # reset tree._needs_serialization
        tree._sha = original_sha  # force the wrong hash

        assert tree.sha().hexdigest() == sha1.decode()

        with pytest.raises(converters.HashMismatch):
            converters.dulwich_tree_to_directory(tree)

    def test_weird_tree(self):
        """Tests a tree with entries in the wrong order"""
-        raw_manifest = (
+        raw_string = (
            b"0644 file2\x00"
            b"d\x1f\xb6\xe0\x8d\xdb.O\xd0\x96\xdc\xf1\x8e\x80\xb8\x94\xbf~%\xce"
            b"0644 file1\x00"
            b"d\x1f\xb6\xe0\x8d\xdb.O\xd0\x96\xdc\xf1\x8e\x80\xb8\x94\xbf~%\xce"
        )
-        tree = dulwich.objects.Tree.from_raw_string(b"tree", raw_manifest)
+        tree = dulwich.objects.Tree.from_raw_string(b"tree", raw_string)

        assert converters.dulwich_tree_to_directory(tree) == Directory(
            entries=(
                # in alphabetical order, as it should be
                DirectoryEntry(
                    name=b"file1",
                    type="file",
                    target=hash_to_bytes("641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"),
                    perms=0o644,
                ),
                DirectoryEntry(
                    name=b"file2",
                    type="file",
                    target=hash_to_bytes("641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"),
                    perms=0o644,
                ),
            ),
-            raw_manifest=b"tree 62\x00" + raw_manifest,
+            raw_manifest=b"tree 62\x00" + raw_string,
        )

    def test_tree_perms(self):
        entries = [
            (b"blob_100644", 0o100644, "file"),
            (b"blob_100664", 0o100664, "file"),
            (b"blob_100666", 0o100666, "file"),
            (b"blob_120000", 0o120000, "file"),
            (b"commit_160644", 0o160644, "rev"),
            (b"commit_160664", 0o160664, "rev"),
            (b"commit_160666", 0o160666, "rev"),
            (b"commit_normal", 0o160000, "rev"),
            (b"tree_040644", 0o040644, "dir"),
            (b"tree_040664", 0o040664, "dir"),
            (b"tree_040666", 0o040666, "dir"),
            (b"tree_normal", 0o040000, "dir"),
        ]

        tree = dulwich.objects.Tree()
        for (name, mode, _) in entries:
            tree.add(name, mode, b"00" * 20)

        assert converters.dulwich_tree_to_directory(tree) == Directory(
            entries=tuple(
                DirectoryEntry(type=type, perms=perms, name=name, target=b"\x00" * 20)
                for (name, perms, type) in entries
            )
        )

+    def test_tree_with_slashes(self):
+        raw_string = (
+            b"100775 AUTHORS\x00"
+            b"\x7f\xde\x98Av\x81I\xbb\x19\x88N\xffu\xed\xca\x01\xe1\x04\xb1\x81"
+            b"100775 COPYING\x00"
+            b'\xd5\n\x11\xd6O\xa5(\xfcv\xb3\x81\x92\xd1\x8c\x05?\xe8"A\xda'
+            b"100775 README.markdown\x00"
+            b"X-c\xd6\xb7\xa8*\xfa\x13\x9e\xef\x83q\xbf^\x90\xe9UVQ"
+            b"100775 gitter/gitter.xml\x00"
+            b"\xecJ\xfa\xa3\\\xe1\x9fo\x93\x131I\xcb\xbf1h2\x00}n"
+            b"100775 gitter/script.py\x00"
+            b"\x1d\xd3\xec\x83\x94+\xbc\x04\xde\xee\x7f\xc6\xbe\x8b\x9cnp=W\xf9"
+        )
+
+        tree = dulwich.objects.Tree.from_raw_string(b"tree", raw_string)
+
+        dir_ = Directory(
+            entries=(
+                DirectoryEntry(
+                    name=b"AUTHORS",
+                    type="file",
+                    target=hash_to_bytes("7fde9841768149bb19884eff75edca01e104b181"),
+                    perms=0o100775,
+                ),
+                DirectoryEntry(
+                    name=b"COPYING",
+                    type="file",
+                    target=hash_to_bytes("d50a11d64fa528fc76b38192d18c053fe82241da"),
+                    perms=0o100775,
+                ),
+                DirectoryEntry(
+                    name=b"README.markdown",
+                    type="file",
+                    target=hash_to_bytes("582d63d6b7a82afa139eef8371bf5e90e9555651"),
+                    perms=0o100775,
+                ),
+                DirectoryEntry(
+                    name=b"gitter_gitter.xml",  # changed
+                    type="file",
+                    target=hash_to_bytes("ec4afaa35ce19f6f93133149cbbf316832007d6e"),
+                    perms=0o100775,
+                ),
+                DirectoryEntry(
+                    name=b"gitter_script.py",  # changed
+                    type="file",
+                    target=hash_to_bytes("1dd3ec83942bbc04deee7fc6be8b9c6e703d57f9"),
+                    perms=0o100775,
+                ),
+            ),
+            raw_manifest=b"tree 202\x00" + raw_string,
+        )
+        assert converters.dulwich_tree_to_directory(tree) == dir_

    def test_commit_to_revision(self):
        sha1 = b"9768d0b576dbaaecd80abedad6dfd0d72f1476da"

        revision = converters.dulwich_commit_to_revision(self.repo[sha1])

        expected_revision = Revision(
            id=hash_to_bytes("9768d0b576dbaaecd80abedad6dfd0d72f1476da"),
            directory=b"\xf0i\\./\xa7\xce\x9dW@#\xc3A7a\xa4s\xe5\x00\xca",
            type=RevisionType.GIT,
            committer=Person(
                name=b"Stefano Zacchiroli",
                fullname=b"Stefano Zacchiroli <zack@upsilon.cc>",
                email=b"zack@upsilon.cc",
            ),
            author=Person(
                name=b"Stefano Zacchiroli",
                fullname=b"Stefano Zacchiroli <zack@upsilon.cc>",
                email=b"zack@upsilon.cc",
            ),
            committer_date=TimestampWithTimezone(
                timestamp=Timestamp(
                    seconds=1443083765,
                    microseconds=0,
                ),
                offset_bytes=b"+0200",
            ),
            message=b"add submodule dependency\n",
            metadata=None,
            extra_headers=(),
            date=TimestampWithTimezone(
                timestamp=Timestamp(
                    seconds=1443083765,
                    microseconds=0,
                ),
                offset_bytes=b"+0200",
            ),
            parents=(b"\xc3\xc5\x88q23`\x9f[\xbb\xb2\xd9\xe7\xf3\xfbJf\x0f?r",),
            synthetic=False,
        )

        assert revision == expected_revision

    def test_commit_to_revision_with_extra_headers(self):
        sha1 = b"322f5bc915e50fc25e85226b5a182bded0e98e4b"

        revision = converters.dulwich_commit_to_revision(self.repo[sha1])

        expected_revision = Revision(
            id=hash_to_bytes(sha1.decode()),
            directory=bytes.fromhex("f8ec06e4ed7b9fff4918a0241a48023143f30000"),
            type=RevisionType.GIT,
            committer=Person(
                name=b"David Douard",
                fullname=b"David Douard <david.douard@sdfa3.org>",
                email=b"david.douard@sdfa3.org",
            ),
            author=Person(
                name=b"David Douard",
                fullname=b"David Douard <david.douard@sdfa3.org>",
                email=b"david.douard@sdfa3.org",
            ),
            committer_date=TimestampWithTimezone(
                timestamp=Timestamp(
                    seconds=1594137902,
                    microseconds=0,
                ),
                offset_bytes=b"+0200",
            ),
            message=b"Am\xe9lioration du fichier READM\xa4\n",
            metadata=None,
            extra_headers=((b"encoding", b"ISO-8859-15"), (b"gpgsig", GPGSIG)),
            date=TimestampWithTimezone(
                timestamp=Timestamp(
                    seconds=1594136900,
                    microseconds=0,
                ),
                offset_bytes=b"+0200",
            ),
            parents=(bytes.fromhex("c730509025c6e81947102b2d77bc4dc1cade9489"),),
            synthetic=False,
        )

        assert revision == expected_revision

    def test_commit_without_manifest(self):
        """Tests a Revision can still be produced when the manifest is not
        understood by the custom parser in dulwich_commit_to_revision."""
        target = b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"
        message = b"some commit message"
        author = Person(
            fullname=b"Foo <foo@example.org>", name=b"Foo", email=b"foo@example.org"
        )

        commit = dulwich.objects.Commit()
        commit.tree = target
        commit.message = message
        commit.author = commit.committer = b"Foo <foo@example.org>"
        commit.author_time = commit.commit_time = 1641980946
        commit.author_timezone = commit.commit_timezone = 3600

        assert converters.dulwich_commit_to_revision(commit) == Revision(
            message=b"some commit message",
            author=author,
            committer=author,
            date=TimestampWithTimezone(
                timestamp=Timestamp(seconds=1641980946, microseconds=0),
                offset_bytes=b"+0100",
            ),
            committer_date=TimestampWithTimezone(
                timestamp=Timestamp(seconds=1641980946, microseconds=0),
                offset_bytes=b"+0100",
            ),
            type=RevisionType.GIT,
            directory=hash_to_bytes(target.decode()),
            synthetic=False,
            metadata=None,
            parents=(),
        )

    @pytest.mark.parametrize("attribute", ["message", "encoding", "author", "gpgsig"])
    def test_corrupt_commit(self, attribute):
        sha = hash_to_bytes("3f0ac5a6d15d89cf928209a57334e3b77c5651b9")
        target = b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"
        message = b"some commit message"
        commit = dulwich.objects.Commit()
        commit.tree = target
        commit.message = message
        commit.gpgsig = GPGSIG
        commit.author = commit.committer = b"Foo <foo@example.org>"
        commit.author_time = commit.commit_time = 1641980946
        commit.author_timezone = commit.commit_timezone = 3600
        converters.dulwich_commit_to_revision(commit)
        assert commit.sha().digest() == sha

        original_sha = commit.sha()

        setattr(commit, attribute, b"abcde")
        commit.sha()  # reset commit._needs_serialization
        commit._sha = original_sha  # force the wrong hash
        with pytest.raises(converters.HashMismatch):
            converters.dulwich_commit_to_revision(commit)

        if attribute == "gpgsig":
            setattr(commit, attribute, None)
            commit.sha()  # reset commit._needs_serialization
            commit._sha = original_sha  # force the wrong hash
            with pytest.raises(converters.HashMismatch):
                converters.dulwich_commit_to_revision(commit)

    def test_commit_to_revision_with_extra_headers_mergetag(self):
        sha1 = b"3ab3da4bf0f81407be16969df09cd1c8af9ac703"

        revision = converters.dulwich_commit_to_revision(self.repo[sha1])

        expected_revision = Revision(
            id=hash_to_bytes(sha1.decode()),
            directory=bytes.fromhex("faa4b64a841ca3e3f07d6501caebda2e3e8e544e"),
            type=RevisionType.GIT,
            committer=Person(
                name=b"David Douard",
                fullname=b"David Douard <david.douard@sdfa3.org>",
                email=b"david.douard@sdfa3.org",
            ),
            author=Person(
                name=b"David Douard",
                fullname=b"David Douard <david.douard@sdfa3.org>",
                email=b"david.douard@sdfa3.org",
            ),
            committer_date=TimestampWithTimezone(
                timestamp=Timestamp(
                    seconds=1594138183,
                    microseconds=0,
                ),
                offset_bytes=b"+0200",
            ),
            message=b"Merge tag 'v0.0.1' into readme\n\nv0.0.1\n",
            metadata=None,
            extra_headers=((b"encoding", b"ISO-8859-15"), (b"mergetag", MERGETAG)),
            date=TimestampWithTimezone(
                timestamp=Timestamp(
                    seconds=1594138183,
                    microseconds=0,
                ),
                offset_bytes=b"+0200",
            ),
            parents=(
                bytes.fromhex("322f5bc915e50fc25e85226b5a182bded0e98e4b"),
                bytes.fromhex("9768d0b576dbaaecd80abedad6dfd0d72f1476da"),
            ),
            synthetic=False,
        )

        assert revision == expected_revision

    def test_weird_commit(self):
        """Checks raw_manifest is set when the commit cannot fit the data model"""

        # Well-formed manifest
-        raw_manifest = (
+        raw_string = (
            b"tree 641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce\n"
            b"author Foo <foo@example.org> 1640191028 +0200\n"
            b"committer Foo <foo@example.org> 1640191028 +0200\n\n"
            b"some commit message"
        )
-        commit = dulwich.objects.Commit.from_raw_string(b"commit", raw_manifest)
+        commit = dulwich.objects.Commit.from_raw_string(b"commit", raw_string)
        date = TimestampWithTimezone(
            timestamp=Timestamp(seconds=1640191028, microseconds=0),
            offset_bytes=b"+0200",
        )
        assert converters.dulwich_commit_to_revision(commit) == Revision(
            message=b"some commit message",
            directory=hash_to_bytes("641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"),
            synthetic=False,
            author=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            committer=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            date=date,
            committer_date=date,
            type=RevisionType.GIT,
            raw_manifest=None,
        )

        # Mess with the offset
-        raw_manifest2 = raw_manifest.replace(b"+0200", b"+200")
-        commit = dulwich.objects.Commit.from_raw_string(b"commit", raw_manifest2)
+        raw_string2 = raw_string.replace(b"+0200", b"+200")
+        commit = dulwich.objects.Commit.from_raw_string(b"commit", raw_string2)
        date = TimestampWithTimezone(
            timestamp=Timestamp(seconds=1640191028, microseconds=0),
            offset_bytes=b"+200",
        )
        assert converters.dulwich_commit_to_revision(commit) == Revision(
            message=b"some commit message",
            directory=hash_to_bytes("641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"),
            synthetic=False,
            author=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            committer=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            date=date,
            committer_date=date,
            type=RevisionType.GIT,
            raw_manifest=None,
        )

        # Mess with the rest of the manifest
-        raw_manifest2 = raw_manifest.replace(
+        raw_string2 = raw_string.replace(
            b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce",
            b"641FB6E08DDB2E4FD096DCF18E80B894BF7E25CE",
        )
-        commit = dulwich.objects.Commit.from_raw_string(b"commit", raw_manifest2)
+        commit = dulwich.objects.Commit.from_raw_string(b"commit", raw_string2)
        date = TimestampWithTimezone(
            timestamp=Timestamp(seconds=1640191028, microseconds=0),
            offset_bytes=b"+0200",
        )
        assert converters.dulwich_commit_to_revision(commit) == Revision(
            message=b"some commit message",
            directory=hash_to_bytes("641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"),
            synthetic=False,
            author=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            committer=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            date=date,
            committer_date=date,
            type=RevisionType.GIT,
-            raw_manifest=b"commit 161\x00" + raw_manifest2,
+            raw_manifest=b"commit 161\x00" + raw_string2,
        )

    def test_author_line_to_author(self):
        # edge case out of the way
        with pytest.raises(TypeError):
            converters.parse_author(None)

        tests = {
            b"a <b@c.com>": Person(
                name=b"a",
                email=b"b@c.com",
                fullname=b"a <b@c.com>",
            ),
            b"<foo@bar.com>": Person(
                name=None,
                email=b"foo@bar.com",
                fullname=b"<foo@bar.com>",
            ),
            b"malformed <email": Person(
                name=b"malformed", email=b"email", fullname=b"malformed <email"
            ),
            b"trailing <sp@c.e> ": Person(
                name=b"trailing",
                email=b"sp@c.e",
                fullname=b"trailing <sp@c.e> ",
            ),
            b"no<sp@c.e>": Person(
                name=b"no",
                email=b"sp@c.e",
                fullname=b"no<sp@c.e>",
            ),
            b" <>": Person(
                name=None,
                email=None,
                fullname=b" <>",
            ),
            b"something": Person(name=b"something", email=None, fullname=b"something"),
        }

        for author in sorted(tests):
            parsed_author = tests[author]
            assert parsed_author == converters.parse_author(author)

    def test_dulwich_tag_to_release_no_author_no_date(self):
        sha = hash_to_bytes("f6e367357b446bd1315276de5e88ba3d0d99e136")
        target = b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"
        message = b"some release message"
        tag = dulwich.objects.Tag()
        tag.name = b"blah"
        tag.object = (dulwich.objects.Commit, target)
        tag.message = message
        tag.signature = None
        tag.tagger = None
        tag.tag_time = None
        tag.tag_timezone = None
        assert tag.sha().digest() == sha

        # when
        actual_release = converters.dulwich_tag_to_release(tag)

        # then
        expected_release = Release(
            author=None,
            date=None,
            id=sha,
            message=message,
            metadata=None,
            name=b"blah",
            synthetic=False,
            target=hash_to_bytes(target.decode()),
            target_type=ObjectType.REVISION,
        )

        assert actual_release == expected_release

    def test_dulwich_tag_to_release_author_and_date(self):
        sha = hash_to_bytes("fc1e6a4f1e37e93e28e78560e73efd0b12f616ef")
        tagger = b"hey dude <hello@mail.org>"
        target = b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"
        message = b"some release message"

        date = int(
            datetime.datetime(2007, 12, 5, tzinfo=datetime.timezone.utc).timestamp()
        )

        tag = dulwich.objects.Tag()
        tag.name = b"blah"
        tag.object = (dulwich.objects.Commit, target)
        tag.message = message
        tag.signature = None
        tag.tagger = tagger
        tag.tag_time = date
        tag.tag_timezone = 0
        assert tag.sha().digest() == sha

        # when
        actual_release = converters.dulwich_tag_to_release(tag)

        # then
        expected_release = Release(
            author=Person(
                email=b"hello@mail.org",
                fullname=b"hey dude <hello@mail.org>",
                name=b"hey dude",
            ),
            date=TimestampWithTimezone(
                timestamp=Timestamp(
                    seconds=1196812800,
                    microseconds=0,
                ),
                offset_bytes=b"+0000",
            ),
            id=sha,
            message=message,
            metadata=None,
            name=b"blah",
            synthetic=False,
            target=hash_to_bytes(target.decode()),
            target_type=ObjectType.REVISION,
        )

        assert actual_release == expected_release

    def test_dulwich_tag_to_release_author_no_date(self):
        # to reproduce bug T815 (fixed)
        sha = hash_to_bytes("41076e970975122dc6b2a878aa9797960bc4781d")
        tagger = b"hey dude <hello@mail.org>"
        target = b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"
        message = b"some release message"
        tag = dulwich.objects.Tag()
        tag.name = b"blah"
        tag.object = (dulwich.objects.Commit, target)
        tag.message = message
        tag.signature = None
        tag.tagger = tagger
        tag.tag_time = None
        tag.tag_timezone = None
        assert tag.sha().digest() == sha

        # when
        actual_release = converters.dulwich_tag_to_release(tag)

        # then
        expected_release = Release(
            author=Person(
                email=b"hello@mail.org",
                fullname=b"hey dude <hello@mail.org>",
                name=b"hey dude",
            ),
            date=None,
            id=sha,
            message=message,
            metadata=None,
            name=b"blah",
            synthetic=False,
            target=hash_to_bytes(target.decode()),
            target_type=ObjectType.REVISION,
        )

        assert actual_release == expected_release

    def test_dulwich_tag_to_release_author_zero_date(self):
        # to reproduce bug T815 (fixed)
        sha = hash_to_bytes("6cc1deff5cdcd853428bb63b937f43dd2566c36f")
        tagger = b"hey dude <hello@mail.org>"
        target = b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"
        message = b"some release message"
        date = int(
            datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc).timestamp()
        )
        tag = dulwich.objects.Tag()
        tag.name = b"blah"
        tag.object = (dulwich.objects.Commit, target)
        tag.message = message
        tag.signature = None
        tag.tagger = tagger
        tag.tag_time = date
        tag.tag_timezone = 0
        assert tag.sha().digest() == sha

        # when
        actual_release = converters.dulwich_tag_to_release(tag)

        # then
        expected_release = Release(
            author=Person(
                email=b"hello@mail.org",
                fullname=b"hey dude <hello@mail.org>",
                name=b"hey dude",
            ),
            date=TimestampWithTimezone(
                timestamp=Timestamp(
                    seconds=0,
                    microseconds=0,
                ),
                offset_bytes=b"+0000",
            ),
            id=sha,
            message=message,
            metadata=None,
            name=b"blah",
            synthetic=False,
            target=hash_to_bytes(target.decode()),
            target_type=ObjectType.REVISION,
        )

        assert actual_release == expected_release

    def test_dulwich_tag_to_release_signature(self):
        target = b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"
        message = b"some release message"
        sha = hash_to_bytes("46fff489610ed733d2cc904e363070dadee05c71")
        tag = dulwich.objects.Tag()
        tag.name = b"blah"
        tag.object = (dulwich.objects.Commit, target)
        tag.message = message
        tag.signature = GPGSIG
        tag.tagger = None
        tag.tag_time = None
        tag.tag_timezone = None
        assert tag.sha().digest() == sha

        # when
        actual_release = converters.dulwich_tag_to_release(tag)

        # then
        expected_release = Release(
            author=None,
            date=None,
            id=sha,
            message=message + GPGSIG,
            metadata=None,
            name=b"blah",
            synthetic=False,
            target=hash_to_bytes(target.decode()),
            target_type=ObjectType.REVISION,
        )

        assert actual_release == expected_release

    @pytest.mark.parametrize("attribute", ["name", "message", "signature"])
    def test_corrupt_tag(self, attribute):
        sha = hash_to_bytes("46fff489610ed733d2cc904e363070dadee05c71")
        target = b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"
        message = b"some release message"
        tag = dulwich.objects.Tag()
        tag.name = b"blah"
        tag.object = (dulwich.objects.Commit, target)
        tag.message = message
        tag.signature = GPGSIG
        tag.tagger = None
        tag.tag_time = None
        tag.tag_timezone = None
        assert tag.sha().digest() == sha
        converters.dulwich_tag_to_release(tag)

        original_sha = tag.sha()

        setattr(tag, attribute, b"abcde")
        tag.sha()  # reset tag._needs_serialization
        tag._sha = original_sha  # force the wrong hash
        with pytest.raises(converters.HashMismatch):
            converters.dulwich_tag_to_release(tag)

        if attribute == "signature":
            setattr(tag, attribute, None)
            tag.sha()  # reset tag._needs_serialization
            tag._sha = original_sha  # force the wrong hash
            with pytest.raises(converters.HashMismatch):
                converters.dulwich_tag_to_release(tag)

    def test_weird_tag(self):
        """Checks raw_manifest is set when the tag cannot fit the data model"""

        # Well-formed manifest
-        raw_manifest = (
+        raw_string = (
            b"object 641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce\n"
            b"type commit\n"
            b"tag blah\n"
            b"tagger Foo <foo@example.org> 1640191027 +0200\n\n"
            b"some release message"
        )
-        tag = dulwich.objects.Tag.from_raw_string(b"tag", raw_manifest)
+        tag = dulwich.objects.Tag.from_raw_string(b"tag", raw_string)

        assert converters.dulwich_tag_to_release(tag) == Release(
            name=b"blah",
            message=b"some release message",
            target=hash_to_bytes("641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"),
            target_type=ObjectType.REVISION,
            synthetic=False,
            author=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            date=TimestampWithTimezone(
                timestamp=Timestamp(seconds=1640191027, microseconds=0),
                offset_bytes=b"+0200",
            ),
            raw_manifest=None,
        )

        # Mess with the offset (negative UTC)
-        raw_manifest2 = raw_manifest.replace(b"+0200", b"-0000")
-        tag = dulwich.objects.Tag.from_raw_string(b"tag", raw_manifest2)
+        raw_string2 = raw_string.replace(b"+0200", b"-0000")
+        tag = dulwich.objects.Tag.from_raw_string(b"tag", raw_string2)
        assert converters.dulwich_tag_to_release(tag) == Release(
            name=b"blah",
            message=b"some release message",
            target=hash_to_bytes("641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"),
            target_type=ObjectType.REVISION,
            synthetic=False,
            author=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            date=TimestampWithTimezone(
                timestamp=Timestamp(seconds=1640191027, microseconds=0),
                offset_bytes=b"-0000",
            ),
        )

        # Mess with the offset (other)
-        raw_manifest2 = raw_manifest.replace(b"+0200", b"+200")
-        tag = dulwich.objects.Tag.from_raw_string(b"tag", raw_manifest2)
+        raw_string2 = raw_string.replace(b"+0200", b"+200")
+        tag = dulwich.objects.Tag.from_raw_string(b"tag", raw_string2)
        assert converters.dulwich_tag_to_release(tag) == Release(
            name=b"blah",
            message=b"some release message",
            target=hash_to_bytes("641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"),
            target_type=ObjectType.REVISION,
            synthetic=False,
            author=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            date=TimestampWithTimezone(
                timestamp=Timestamp(seconds=1640191027, microseconds=0),
                offset_bytes=b"+200",
            ),
        )

        # Mess with the rest of the manifest
-        raw_manifest2 = raw_manifest.replace(
+        raw_string2 = raw_string.replace(
            b"641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce",
            b"641FB6E08DDB2E4FD096DCF18E80B894BF7E25CE",
        )
-        tag = dulwich.objects.Tag.from_raw_string(b"tag", raw_manifest2)
+        tag = dulwich.objects.Tag.from_raw_string(b"tag", raw_string2)
        assert converters.dulwich_tag_to_release(tag) == Release(
            name=b"blah",
            message=b"some release message",
            target=hash_to_bytes("641fb6e08ddb2e4fd096dcf18e80b894bf7e25ce"),
            target_type=ObjectType.REVISION,
            synthetic=False,
            author=Person.from_fullname(
                b"Foo <foo@example.org>",
            ),
            date=TimestampWithTimezone(
                timestamp=Timestamp(seconds=1640191027, microseconds=0),
                offset_bytes=b"+0200",
            ),
-            raw_manifest=b"tag 136\x00" + raw_manifest2,
+            raw_manifest=b"tag 136\x00" + raw_string2,
        )
diff --git a/swh/loader/git/tests/test_tasks.py b/swh/loader/git/tests/test_tasks.py
index fd442e1..2491c05 100644
--- a/swh/loader/git/tests/test_tasks.py
+++ b/swh/loader/git/tests/test_tasks.py
@@ -1,164 +1,95 @@
# Copyright (C) 2018-2022 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information

import uuid

import pytest

from swh.scheduler.model import ListedOrigin, Lister
-from swh.scheduler.utils import create_origin_task_dict
-
-@pytest.fixture(autouse=True)
-def celery_worker_and_swh_config(swh_scheduler_celery_worker, swh_config):
-    pass
+NAMESPACE = "swh.loader.git"

@pytest.fixture
def git_lister():
    return Lister(name="git-lister",
instance_name="example", id=uuid.uuid4()) @pytest.fixture def git_listed_origin(git_lister): return ListedOrigin( lister_id=git_lister.id, url="https://git.example.org/repo", visit_type="git" ) -def test_git_loader( - mocker, - swh_scheduler_celery_app, -): - mock_loader = mocker.patch("swh.loader.git.loader.GitLoader.load") - mock_loader.return_value = {"status": "eventful"} - - res = swh_scheduler_celery_app.send_task( - "swh.loader.git.tasks.UpdateGitRepository", - kwargs={"url": "origin_url"}, - ) - assert res - res.wait() - assert res.successful() - - assert res.result == {"status": "eventful"} - mock_loader.assert_called_once_with() - - def test_git_loader_for_listed_origin( - mocker, - swh_scheduler_celery_app, + loading_task_creation_for_listed_origin_test, git_lister, git_listed_origin, ): - mock_loader = mocker.patch("swh.loader.git.loader.GitLoader.load") - mock_loader.return_value = {"status": "eventful"} - - task_dict = create_origin_task_dict(git_listed_origin, git_lister) - - res = swh_scheduler_celery_app.send_task( - "swh.loader.git.tasks.UpdateGitRepository", - kwargs=task_dict["arguments"]["kwargs"], - ) - assert res - res.wait() - assert res.successful() - - assert res.result == {"status": "eventful"} - mock_loader.assert_called_once_with() - - -def test_git_loader_from_disk( - mocker, - swh_scheduler_celery_app, -): - mock_loader = mocker.patch("swh.loader.git.from_disk.GitLoaderFromDisk.load") - mock_loader.return_value = {"status": "uneventful"} - - res = swh_scheduler_celery_app.send_task( - "swh.loader.git.tasks.LoadDiskGitRepository", - kwargs={"url": "origin_url2", "directory": "/some/repo", "visit_date": "now"}, + loading_task_creation_for_listed_origin_test( + loader_class_name=f"{NAMESPACE}.loader.GitLoader", + task_function_name=f"{NAMESPACE}.tasks.UpdateGitRepository", + lister=git_lister, + listed_origin=git_listed_origin, ) - assert res - res.wait() - assert res.successful() - - assert res.result == {"status": "uneventful"} - 
mock_loader.assert_called_once_with() +@pytest.mark.parametrize( + "extra_loader_arguments", + [ + { + "directory": "/some/repo", + }, + { + "directory": "/some/repo", + "visit_date": "now", + }, + ], +) def test_git_loader_from_disk_for_listed_origin( - mocker, - swh_scheduler_celery_app, + loading_task_creation_for_listed_origin_test, git_lister, git_listed_origin, + extra_loader_arguments, ): - mock_loader = mocker.patch("swh.loader.git.from_disk.GitLoaderFromDisk.load") - mock_loader.return_value = {"status": "uneventful"} - git_listed_origin.extra_loader_arguments = { - "directory": "/some/repo", - } - task_dict = create_origin_task_dict(git_listed_origin, git_lister) + git_listed_origin.extra_loader_arguments = extra_loader_arguments - res = swh_scheduler_celery_app.send_task( - "swh.loader.git.tasks.LoadDiskGitRepository", - kwargs=task_dict["arguments"]["kwargs"], + loading_task_creation_for_listed_origin_test( + loader_class_name=f"{NAMESPACE}.from_disk.GitLoaderFromDisk", + task_function_name=f"{NAMESPACE}.tasks.LoadDiskGitRepository", + lister=git_lister, + listed_origin=git_listed_origin, ) - assert res - res.wait() - assert res.successful() - assert res.result == {"status": "uneventful"} - mock_loader.assert_called_once_with() - - -def test_git_loader_from_archive( - mocker, - swh_scheduler_celery_app, -): - mock_loader = mocker.patch("swh.loader.git.from_disk.GitLoaderFromArchive.load") - mock_loader.return_value = {"status": "failed"} - res = swh_scheduler_celery_app.send_task( - "swh.loader.git.tasks.UncompressAndLoadDiskGitRepository", - kwargs={ - "url": "origin_url3", +@pytest.mark.parametrize( + "extra_loader_arguments", + [ + { + "archive_path": "/some/repo", + }, + { "archive_path": "/some/repo", "visit_date": "now", }, - ) - assert res - res.wait() - assert res.successful() - - assert res.result == {"status": "failed"} - mock_loader.assert_called_once_with() - - + ], +) def test_git_loader_from_archive_for_listed_origin( - mocker, - 
swh_scheduler_celery_app, + loading_task_creation_for_listed_origin_test, git_lister, git_listed_origin, + extra_loader_arguments, ): - mock_loader = mocker.patch("swh.loader.git.from_disk.GitLoaderFromArchive.load") - mock_loader.return_value = {"status": "failed"} - git_listed_origin.extra_loader_arguments = { - "archive_path": "/some/repo", - } - task_dict = create_origin_task_dict(git_listed_origin, git_lister) + git_listed_origin.extra_loader_arguments = extra_loader_arguments - res = swh_scheduler_celery_app.send_task( - "swh.loader.git.tasks.UncompressAndLoadDiskGitRepository", - kwargs=task_dict["arguments"]["kwargs"], + loading_task_creation_for_listed_origin_test( + loader_class_name=f"{NAMESPACE}.from_disk.GitLoaderFromArchive", + task_function_name=f"{NAMESPACE}.tasks.UncompressAndLoadDiskGitRepository", + lister=git_lister, + listed_origin=git_listed_origin, ) - assert res - res.wait() - assert res.successful() - - assert res.result == {"status": "failed"} - mock_loader.assert_called_once_with() diff --git a/tox.ini b/tox.ini index 4fad105..6d07f6d 100644 --- a/tox.ini +++ b/tox.ini @@ -1,79 +1,80 @@ [tox] envlist=black,flake8,mypy,py3 [testenv] extras = testing deps = # the dependency below is needed for now as a workaround for # https://github.com/pypa/pip/issues/6239 # TODO: remove when this issue is fixed swh.core[testing] >= 0.0.61 swh.storage[testing] swh.scheduler[testing] >= 0.5.0 pytest-cov commands = pytest --cov={envsitepackagesdir}/swh/loader/git \ {envsitepackagesdir}/swh/loader/git \ --cov-branch {posargs} [testenv:black] skip_install = true deps = - black==22.3.0 + black==22.10.0 commands = {envpython} -m black --check swh [testenv:flake8] skip_install = true deps = - flake8==4.0.1 - flake8-bugbear==22.3.23 + flake8==5.0.4 + flake8-bugbear==22.9.23 + pycodestyle==2.9.1 commands = {envpython} -m flake8 [testenv:mypy] extras = testing deps = mypy==0.942 commands = mypy swh # build documentation outside swh-environment using the current # 
git HEAD of swh-docs, is executed on CI for each diff to prevent # breaking doc build [testenv:sphinx] whitelist_externals = make usedevelop = true extras = testing deps = # fetch and install swh-docs in develop mode -e git+https://forge.softwareheritage.org/source/swh-docs#egg=swh.docs setenv = SWH_PACKAGE_DOC_TOX_BUILD = 1 # turn warnings into errors SPHINXOPTS = -W commands = make -I ../.tox/sphinx/src/swh-docs/swh/ -C docs # build documentation only inside swh-environment using local state # of swh-docs package [testenv:sphinx-dev] whitelist_externals = make usedevelop = true extras = testing deps = # install swh-docs in develop mode -e ../swh-docs setenv = SWH_PACKAGE_DOC_TOX_BUILD = 1 # turn warnings into errors SPHINXOPTS = -W commands = make -I ../.tox/sphinx-dev/src/swh-docs/swh/ -C docs
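The corrupt-object tests in this diff (`test_corrupt_blob`, `test_corrupt_tree`, `test_corrupt_commit`, `test_corrupt_tag`) all lean on the same dulwich detail: an object caches its SHA-1 in `_sha`, so mutating the object and then forcing the stale cache back in simulates a tampered object, which a converter can detect by re-hashing the serialized bytes. A minimal stdlib-only sketch of that pattern (the `TinyBlob` and `convert_checked` names are hypothetical illustrations, not part of swh or dulwich):

```python
import hashlib


class HashMismatch(Exception):
    """Raised when a recomputed hash disagrees with the cached one."""


class TinyBlob:
    """Toy stand-in for a dulwich object with a cached hash (`_sha`)."""

    def __init__(self, data):
        self.data = data
        self._sha = None  # hash cache; dulwich resets it on re-serialization

    def _serialize(self):
        # git-style blob header: b"blob <len>\x00" followed by the content
        return b"blob %d\x00" % len(self.data) + self.data

    def sha(self):
        if self._sha is None:
            self._sha = hashlib.sha1(self._serialize())
        return self._sha


def convert_checked(blob):
    """Re-hash the serialized bytes and compare with the cached hash,
    mirroring the HashMismatch checks exercised by the tests above."""
    real = hashlib.sha1(blob._serialize()).digest()
    if blob.sha().digest() != real:
        raise HashMismatch(blob.sha().hexdigest())
    return real


blob = TinyBlob(b"hello")
convert_checked(blob)   # cached hash matches the content: no error

stale = blob.sha()      # keep the hash object computed for the old content
blob.data = b"tampered" # mutate the content...
blob._sha = stale       # ...and force the stale cache, as the tests do
try:
    convert_checked(blob)
except HashMismatch:
    print("mismatch detected")
```

The same idea explains the `tree.sha()  # reset tree._needs_serialization` lines in the tests: calling `sha()` once re-serializes the mutated object, after which overwriting `_sha` is the only way to make the cached hash disagree with the bytes.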