diff --git a/.git-blame-ignore-revs b/.git-blame-ignore-revs
new file mode 100644
index 0000000..c3e18f1
--- /dev/null
+++ b/.git-blame-ignore-revs
@@ -0,0 +1,5 @@
+# Enable black
+cf496e18440a073ec3d2b65657882e1bdb69a4d2
+
+# python: Reformat code with black 22.3.0
+9c8a00123c37db204c59f6e6d17af520d29c7b65
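
Note: git does not read .git-blame-ignore-revs on its own; it has to be pointed at the
file, either per invocation or once per clone. For example (the blamed path below is just
an illustrative file from this repository):

    git blame --ignore-revs-file .git-blame-ignore-revs swh/loader/core/loader.py
    git config blame.ignoreRevsFile .git-blame-ignore-revs
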
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index 9a5ebc0..eba24da 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -1,44 +1,45 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.1.0
    hooks:
      - id: trailing-whitespace
      - id: check-json
      - id: check-yaml

  - repo: https://gitlab.com/pycqa/flake8
    rev: 4.0.1
    hooks:
      - id: flake8
+       additional_dependencies: [flake8-bugbear==22.3.23]

  - repo: https://github.com/codespell-project/codespell
    rev: v2.1.0
    hooks:
      - id: codespell
        name: Check source code spelling
        exclude: ^(swh/loader/package/.*[/]+tests/data/.*)$
        entry: codespell --ignore-words-list=iff
        stages: [commit]
      - id: codespell
        name: Check commit message spelling
        stages: [commit-msg]

  - repo: local
    hooks:
      - id: mypy
        name: mypy
        entry: mypy
        args: [swh]
        pass_filenames: false
        language: system
        types: [python]

  - repo: https://github.com/PyCQA/isort
    rev: 5.10.1
    hooks:
      - id: isort

  - repo: https://github.com/python/black
-   rev: 19.10b0
+   rev: 22.3.0
    hooks:
      - id: black
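
With the black hook bumped to 22.3.0, the usual way to pick the new hook set up locally is
to (re)install the hooks and re-run them over the whole tree; the commit-message spelling
hook additionally needs the commit-msg stage installed:

    pre-commit install
    pre-commit install --hook-type commit-msg
    pre-commit run --all-files
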
diff --git a/PKG-INFO b/PKG-INFO
index a92a4b5..95e9abe 100644
--- a/PKG-INFO
+++ b/PKG-INFO
@@ -1,56 +1,56 @@
Metadata-Version: 2.1
Name: swh.loader.core
-Version: 2.6.1
+Version: 2.6.2
Summary: Software Heritage Base Loader
Home-page: https://forge.softwareheritage.org/diffusion/DLDBASE
Author: Software Heritage developers
Author-email: swh-devel@inria.fr
License: UNKNOWN
Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest
Project-URL: Funding, https://www.softwareheritage.org/donate
Project-URL: Source, https://forge.softwareheritage.org/source/swh-loader-core
Project-URL: Documentation, https://docs.softwareheritage.org/devel/swh-loader-core/
Platform: UNKNOWN
Classifier: Programming Language :: Python :: 3
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3)
Classifier: Operating System :: OS Independent
Classifier: Development Status :: 5 - Production/Stable
Requires-Python: >=3.7
Description-Content-Type: text/markdown
Provides-Extra: testing
License-File: LICENSE
License-File: AUTHORS

Software Heritage - Loader foundations
======================================

The Software Heritage Loader Core provides low-level loading utilities and
helpers used by :term:`loaders <loader>`.

The main entry points are classes:
- :class:`swh.loader.core.loader.BaseLoader` for loaders (e.g. svn)
- :class:`swh.loader.core.loader.DVCSLoader` for DVCS loaders (e.g. hg, git, ...)
- :class:`swh.loader.package.loader.PackageLoader` for Package loaders (e.g. PyPI, Npm, ...)

Package loaders
---------------

This package also implements many package loaders directly, out of convenience,
as they usually are quite similar and each fits in a single file.

They all roughly follow these steps, explained in the
:py:meth:`swh.loader.package.loader.PackageLoader.load` documentation.
See the :ref:`package-loader-tutorial` for details.
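
As a rough illustration of those steps (fetch each version's artifact, load its directory,
emit one release per version, then tie everything together in a snapshot), here is a
self-contained toy model; it only mirrors the shape of PackageLoader.load and deliberately
uses stand-in classes rather than the real swh.loader.package API:

    # Toy stand-ins only -- not the real swh.loader.package.loader classes.
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class ToyRelease:
        version: str
        directory: str                      # id of the loaded source tree

    @dataclass
    class ToySnapshot:
        branches: Dict[str, ToyRelease] = field(default_factory=dict)

    def load_origin(origin_url: str, versions: List[str]) -> ToySnapshot:
        snapshot = ToySnapshot()
        for version in versions:
            # 1. fetch the artifact for this version (stubbed here)
            artifact = f"{origin_url}#{version}"
            # 2. unpack it and load its contents/directories (stubbed as an id)
            directory = f"directory-of-{artifact}"
            # 3. build one release object per version...
            release = ToyRelease(version=version, directory=directory)
            # 4. ...and point a snapshot branch at it
            snapshot.branches[f"releases/{version}"] = release
        return snapshot

    print(load_origin("https://pypi.org/project/example/", ["1.0.0", "1.1.0"]))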

VCS loaders
-----------

Unlike package loaders, VCS loaders remain in separate packages,
as they often need more advanced conversions and very VCS-specific operations.

This usually involves getting the branches of a repository and recursively loading
revisions in the history (and directory trees in these revisions),
until a known revision is found.
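
The "until a known revision is found" part is what makes these loads incremental. A
self-contained sketch of that walk, using a stand-in revision type instead of the real
swh.model objects:

    # Stand-in type only -- real loaders manipulate swh.model revisions.
    from dataclasses import dataclass, field
    from typing import List, Set

    @dataclass
    class ToyRevision:
        id: str
        parents: List["ToyRevision"] = field(default_factory=list)

    def new_revisions(heads: List[ToyRevision], known: Set[str]) -> List[ToyRevision]:
        """Walk each branch head backwards, stopping at already-archived revisions."""
        to_load: List[ToyRevision] = []
        seen = set(known)
        stack = list(heads)
        while stack:
            rev = stack.pop()
            if rev.id in seen:
                continue               # already archived: prune this part of history
            seen.add(rev.id)
            to_load.append(rev)        # its directory tree would be loaded as well
            stack.extend(rev.parents)
        return to_load

    root = ToyRevision("r0")
    head = ToyRevision("r2", parents=[ToyRevision("r1", parents=[root])])
    print([r.id for r in new_revisions([head], known={"r0"})])   # ['r2', 'r1']
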
diff --git a/debian/changelog b/debian/changelog
index 80a80d4..1f54527 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -1,1699 +1,1703 @@
-swh-loader-core (2.6.1-1~swh1~bpo10+1) buster-swh; urgency=medium
+swh-loader-core (2.6.2-1~swh1) unstable-swh; urgency=medium
- * Rebuild for buster-swh
+ * New upstream release 2.6.2 - (tagged by Antoine R. Dumont
+ (@ardumont) <ardumont@softwareheritage.org> on 2022-04-14 11:46:06
+ +0200)
+ * Upstream changes: - v2.6.2 - maven: Consistently read lister
+ input to ingest a mvn origin
- -- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 08 Apr 2022 09:15:47 +0000
+ -- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 14 Apr 2022 09:53:26 +0000
swh-loader-core (2.6.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.6.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2022-04-08 11:03:06
+0200)
* Upstream changes: - v2.6.1 - Rename metadata key in data
received from the deposit server - origin/master npm: Add all
fields we use to the ExtID manifest - npm: Include package
version id in ExtID manifest
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 08 Apr 2022 09:13:17 +0000
swh-loader-core (2.6.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.6.0 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2022-03-02 13:54:45 +0100)
* Upstream changes: - v2.6.0 - * Update for the new output
format of the Deposit's API.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 02 Mar 2022 12:58:43 +0000
swh-loader-core (2.5.4-1~swh2) unstable-swh; urgency=medium
* Bump new release with opam tests deactivated
-- Antoine R. Dumont (@ardumont) <ardumont@softwareheritage.org> Fri, 25 Feb 2022 12:40:40 +0100
swh-loader-core (2.5.4-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.5.4 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2022-02-25 10:23:51
+0100)
* Upstream changes: - v2.5.4 - loader/opam/tests: Do not run
actual opam init command call
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 25 Feb 2022 09:28:10 +0000
swh-loader-core (2.5.3-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.5.3 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2022-02-24 16:02:53
+0100)
* Upstream changes: - v2.5.3 - opam: Allow build to run the
opam init completely
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 24 Feb 2022 15:07:20 +0000
swh-loader-core (2.5.2-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.5.2 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2022-02-24 09:52:26 +0100)
* Upstream changes: - v2.5.2 - * deposit: Remove unused
raw_info
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 24 Feb 2022 08:57:52 +0000
swh-loader-core (2.5.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.5.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2022-02-16 15:27:02
+0100)
* Upstream changes: - v2.5.1 - Add URL and directory to CLI
loader status echo - Fix load_maven scheduling task name -
docs: Fix typo detected with codespell - pre-commit: Bump hooks
and add new one to check commit message spelling
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 16 Feb 2022 14:30:47 +0000
swh-loader-core (2.5.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.5.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2022-02-08 10:46:14
+0100)
* Upstream changes: - v2.5.0 - Move visit date helper from hg
loader to core
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 08 Feb 2022 09:49:53 +0000
swh-loader-core (2.4.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.4.1 - (tagged by Nicolas Dandrimont
<nicolas@dandrimont.eu> on 2022-02-03 14:12:05 +0100)
* Upstream changes: - Release swh.loader.core 2.4.1 - fix
Person mangling
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 03 Feb 2022 13:17:35 +0000
swh-loader-core (2.3.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.3.0 - (tagged by Nicolas Dandrimont
<nicolas@dandrimont.eu> on 2022-01-24 11:18:43 +0100)
* Upstream changes: - Release swh.loader.core - Stop using the
deprecated 'TimestampWithTimezone.offset' attribute - Include
clone_with_timeout utility from swh.loader.mercurial
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 24 Jan 2022 10:22:35 +0000
swh-loader-core (2.2.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.2.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2022-01-18 14:33:08
+0100)
* Upstream changes: - v2.2.0 - tests: Replace 'offset' and
'negative_utc' with 'offset_bytes' - deposit: Remove
'negative_utc' from test data - tests: Use
TimestampWithTimezone.from_datetime() instead of the constructor
- Add releases notes (from user-provided Atom document) to release
messages. - deposit: Strip 'offset_bytes' from date dicts to
support swh-model 4.0.0 - Pin mypy and drop type annotations
which makes mypy unhappy
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 18 Jan 2022 15:52:53 +0000
swh-loader-core (2.1.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.1.1 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-12-09 17:14:12 +0100)
* Upstream changes: - v2.1.1 - * nixguix: Fix crash when
filtering extids on archives that were already loaded, but only from
different URLs
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 09 Dec 2021 16:17:54 +0000
swh-loader-core (2.1.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.1.0 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-12-09 16:34:51 +0100)
* Upstream changes: - v2.1.0 - * maven: various refactorings
- * nixguix: Filter out releases with URLs different from the
expected one
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 09 Dec 2021 15:38:14 +0000
swh-loader-core (2.0.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 2.0.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-12-07 15:53:23
+0100)
* Upstream changes: - v2.0.0 - package-loaders: Add support
for extid versions, and bump it for Debian - debian: Remove the
extrinsic version from release names - debian: Fix confusion
between the two versions
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 07 Dec 2021 14:57:19 +0000
swh-loader-core (1.3.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 1.3.0 - (tagged by Antoine Lambert
<anlambert@softwareheritage.org> on 2021-12-07 10:54:49 +0100)
* Upstream changes: - version 1.3.0
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 07 Dec 2021 09:58:53 +0000
swh-loader-core (1.2.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 1.2.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-12-03 16:15:32
+0100)
* Upstream changes: - v1.2.1 - package.loader: Deduplicate
extid target
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 03 Dec 2021 15:19:13 +0000
swh-loader-core (1.2.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 1.2.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-12-03 12:16:04
+0100)
* Upstream changes: - v1.2.0 - debian: Rename loading task
function to fix scheduling - debian: Handle extra sha1 sum in
source package metadata - debian: Remove unused date parameter
of DebianLoader - package.loader: Deduplicate target SWHIDs -
package-loader-tutorial: Update to mention releases instead of
revisions - package-loader-tutorial: Add a checklist -
package-loader-tutorial: Highlight the recommendation to submit the
loader early.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 03 Dec 2021 11:19:52 +0000
swh-loader-core (1.1.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 1.1.0 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-11-22 11:58:11 +0100)
* Upstream changes: - v1.1.0 - * Package loader: Uniformize
author and message
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 22 Nov 2021 11:01:45 +0000
swh-loader-core (1.0.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 1.0.1 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-11-10 14:47:52 +0100)
* Upstream changes: - v1.0.1 - * utils: Add types and let log
instruction do the formatting - * Fix tests when run by gbp on
Sid.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 10 Nov 2021 13:53:43 +0000
swh-loader-core (1.0.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 1.0.0 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-11-10 14:25:24 +0100)
* Upstream changes: - v1.0.0 - Main change: this makes package
loaders write releases instead of revisions - Other more-or-less
related changes: - * Add missing documentation for
`get_metadata_authority`. - * opam: Write package definitions to
the extrinsic metadata storage - * deposit: Remove 'parent'
deposit - * cleanup tests and unused code - * Document how
each package loader populates fields. - * Refactor package
loaders to make the version part of BasePackageInfo
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 10 Nov 2021 13:38:43 +0000
swh-loader-core (0.25.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.25.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-09-29 09:19:10
+0200)
* Upstream changes: - v0.25.0 - Allow opam loader to actually
use multi-instance opam root - opam: Define a
initialize_opam_root parameter for opam loader
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 29 Sep 2021 07:26:12 +0000
swh-loader-core (0.23.5-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.23.5 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-09-24 17:31:22
+0200)
* Upstream changes: - v0.23.5 - opam: Initialize opam root
directory outside the constructor
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 24 Sep 2021 15:34:52 +0000
swh-loader-core (0.23.4-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.23.4 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-09-20 11:53:11
+0200)
* Upstream changes: - v0.23.4 - Ensure that filename fallback
out of an url is properly sanitized
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 20 Sep 2021 09:56:31 +0000
swh-loader-core (0.23.3-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.23.3 - (tagged by Antoine Lambert
<anlambert@softwareheritage.org> on 2021-09-16 10:47:40 +0200)
* Upstream changes: - version 0.23.3
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 16 Sep 2021 08:51:47 +0000
swh-loader-core (0.23.2-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.23.2 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-08-12 12:22:44 +0200)
* Upstream changes: - v0.23.2 - * deposit: Update
status_detail on loader failure
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 12 Aug 2021 10:25:44 +0000
swh-loader-core (0.23.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.23.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-08-05 16:11:02
+0200)
* Upstream changes: - v0.23.1 - Fix pypi upload issue.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 05 Aug 2021 14:20:37 +0000
swh-loader-core (0.22.3-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.22.3 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-06-25 14:50:40 +0200)
* Upstream changes: - v0.22.3 - * Use the postgresql class to
instantiate storage in tests - * package-loader-tutorial: Add
anchor so it can be referenced from swh-docs
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 25 Jun 2021 12:57:33 +0000
swh-loader-core (0.22.2-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.22.2 - (tagged by Antoine Lambert
<antoine.lambert@inria.fr> on 2021-06-10 16:11:30 +0200)
* Upstream changes: - version 0.22.2
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 10 Jun 2021 14:19:06 +0000
swh-loader-core (0.22.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.22.1 - (tagged by Antoine Lambert
<antoine.lambert@inria.fr> on 2021-05-27 14:02:35 +0200)
* Upstream changes: - version 0.22.1
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 27 May 2021 12:20:04 +0000
swh-loader-core (0.22.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.22.0 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-04-15 15:13:56 +0200)
* Upstream changes: - v0.22.0 - Documentation: - *
Document the big picture view of VCS and package loaders - * Add
a package loader tutorial. - * Write an overview of how to write
VCS loaders. - * Fix various Sphinx warnings - Package
loaders: - * Add sha512 as a valid field in dsc metadata - *
package loaders: Stop reading/writing Revision.metadata
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 15 Apr 2021 13:18:13 +0000
swh-loader-core (0.21.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.21.0 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-03-30 17:19:13 +0200)
* Upstream changes: - v0.21.0 - * tests: recompute ids when
evolving RawExtrinsicMetadata objects, to support swh-model 2.0.0
- * deposit.loader: Make archive.tar the default_filename - *
debian: Make resolve_revision_from use the sha256 of the .dsc -
* package.loader.*: unify package "cache"/deduplication using ExtIDs
- * package.loader: Lookup packages from the ExtID storage - *
package.loader: Write to the ExtID storage
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 30 Mar 2021 15:26:35 +0000
swh-loader-core (0.20.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.20.0 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2021-03-02 10:52:18 +0100)
* Upstream changes: - v0.20.0 - * RawExtrinsicMetadata: update
to use the API in swh-model 1.0.0
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 02 Mar 2021 09:57:21 +0000
swh-loader-core (0.19.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.19.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-02-25 15:52:12
+0100)
* Upstream changes: - v0.19.0 - deposit: Make deposit loader
deal with tarball as well - deposit: Update deposit status when
the load status is 'partial' - Make finalize_visit a method
instead of nested function.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 25 Feb 2021 14:55:54 +0000
swh-loader-core (0.18.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.18.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-02-19 18:02:58
+0100)
* Upstream changes: - v0.18.1 - nixguix: Fix missing
max_content_size constructor parameter
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 19 Feb 2021 17:06:33 +0000
swh-loader-core (0.18.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.18.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-02-17 13:13:24
+0100)
* Upstream changes: - v0.18.0 - core.loader: Merge Loader into
BaseLoader - Unify loader instantiation - nixguix: Ensure
interaction with the origin url for edge case tests
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 17 Feb 2021 12:16:47 +0000
swh-loader-core (0.17.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.17.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-02-11 11:20:55
+0100)
* Upstream changes: - v0.17.0 - package: Mark visit as
not_found when relevant - package: Mark visit status as failed
when relevant - core: Allow vcs loaders to deal with not_found
status - core: Mark visit status as failed when relevant -
loader: Make loader write the origin_visit_status' type
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 11 Feb 2021 10:23:42 +0000
swh-loader-core (0.16.0-1~swh2) unstable-swh; urgency=medium
* Bump dependencies
-- Antoine R. Dumont (@ardumont) <ardumont@softwareheritage.org> Wed, 03 Feb 2021 14:25:26 +0100
swh-loader-core (0.16.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.16.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2021-02-03 14:14:01
+0100)
* Upstream changes: - v0.16.0 - Adapt
origin_get_latest_visit_status according to latest api change -
Add a cli section in the doc - tox.ini: Add swh.core[testing]
requirement - Small docstring improvements in the deposit loader
code
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 03 Feb 2021 13:17:30 +0000
swh-loader-core (0.15.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.15.0 - (tagged by Nicolas Dandrimont
<nicolas@dandrimont.eu> on 2020-11-03 17:21:21 +0100)
* Upstream changes: - Release swh-loader-core v0.15.0 - Attach
raw extrinsic metadata to directories, not revisions - Handle a
bunch of deprecation warnings: - explicit args in swh.objstorage
get_objstorage - id -> target for raw extrinsic metadata objects
- positional arguments for storage.raw_extrinsic_metadata_get
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 03 Nov 2020 16:26:20 +0000
swh-loader-core (0.14.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.14.0 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2020-10-16 18:23:28 +0200)
* Upstream changes: - v0.14.0 - * npm: write metadata on
revisions instead of snapshots. - * pypi: write metadata on
revisions instead of snapshots. - * deposit.loader: Avoid
unnecessary metadata json transformation
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 16 Oct 2020 16:26:14 +0000
swh-loader-core (0.13.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.13.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-10-02 16:54:05
+0200)
* Upstream changes: - v0.13.1 - core.loader: Allow config
parameter passing through constructor - tox.ini: pin black to
the pre-commit version (19.10b0) to avoid flip-flops
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 02 Oct 2020 14:55:59 +0000
swh-loader-core (0.13.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.13.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-10-02 13:18:55
+0200)
* Upstream changes: - v0.13.0 - package.loader: Migrate away
from SWHConfig mixin - core.loader: Migrate away from SWHConfig
mixin - Expose deposit configuration only within the deposit
tests
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 02 Oct 2020 11:21:55 +0000
swh-loader-core (0.12.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.12.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-10-01 16:03:45
+0200)
* Upstream changes: - v0.12.0 - deposit: Adapt loader to send
extrinsic raw metadata to the metadata storage - core.loader:
Log information about origin currently being ingested - Adapt
cli declaration entrypoint to swh.core 0.3
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 01 Oct 2020 14:04:59 +0000
swh-loader-core (0.11.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.11.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-09-18 10:19:56
+0200)
* Upstream changes: - v0.11.0 - loader: Stop materializing
full lists of objects to be stored - tests.get_stats: Don't
return a 'person' count - python: Reorder imports with isort
- pre-commit: Add isort hook and configuration - pre-commit:
Update flake8 hook configuration - cli: speedup the `swh` cli
command startup time
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 18 Sep 2020 09:12:18 +0000
swh-loader-core (0.10.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.10.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-09-04 13:19:29
+0200)
* Upstream changes: - v0.10.0 - loader: Adapt to latest
storage revision_get change - origin/master Rename metadata
format 'original-artifact-json' to 'original-artifacts-json'. -
Tell pytest not to recurse in dotdirs. - package loader: Add the
'url' to the 'original_artifact' extrinsic metadata. - Write
'original_artifact' metadata to the extrinsic metadata storage. -
Move parts of _load_revision to a new _load_directory method. -
tests: Don't use naive datetimes. - package.loader: Split the
warning message into multiple chunks - Replace calls to
snapshot_get with snapshot_get_all_branches.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 04 Sep 2020 11:28:09 +0000
swh-loader-core (0.9.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.9.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-08-08 14:47:52
+0200)
* Upstream changes: - v0.9.1 - nixguix: Make the unsupported
artifact extensions configurable - package.loader: Log a failure
summary report at the end of the task
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Sat, 08 Aug 2020 12:51:33 +0000
swh-loader-core (0.9.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.9.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-08-07 22:57:14
+0200)
* Upstream changes: - v0.9.0 - nixguix: Filter out unsupported
artifact extensions - swh.loader.tests: Use
snapshot_get_all_branches in check_snapshot - test_npm: Adapt
content_get_metadata call to content_get - npm: Fix assertion to
use the correct storage api
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 07 Aug 2020 21:00:40 +0000
swh-loader-core (0.8.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.8.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-08-06 16:48:38
+0200)
* Upstream changes: - v0.8.1 - Adapt code according to storage
signature
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 06 Aug 2020 14:50:39 +0000
swh-loader-core (0.8.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.8.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-08-05 10:16:36
+0200)
* Upstream changes: - v0.8.0 - archive: fix docstring -
nixguix: Fix docstring - nixguix: Align error message formatting
using f-string - nixguix: Fix format issue in error message -
Convert the 'metadata' and 'info' cached-properties/lazy-attributes
into methods - cran: fix call to logger.warning - pypi: Load
the content of the API's response as extrinsic snapshot metadata
- Add a default value for RawExtrinsicMetadataCore.discovery_date
- npm: Load the content of the API's response as extrinsic snapshot
metadata - Make retrieve_sources use generic api_info instead of
duplicating its code - nixguix: Load the content of sources.json
as extrinsic snapshot metadata - Update tests to accept
PagedResult from storage.raw_extrinsic_metadata_get
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 05 Aug 2020 08:19:20 +0000
swh-loader-core (0.7.3-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.7.3 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2020-07-30 19:16:21 +0200)
* Upstream changes: - v0.7.3 - core.loader: Fix Iterable/List
typing issues - package.loader: Fix type warning
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 30 Jul 2020 17:23:57 +0000
swh-loader-core (0.7.2-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.7.2 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2020-07-29 11:41:39 +0200)
* Upstream changes: - v0.7.2 - * Fix typo in message logged on
extrinsic metadata loading errors. - * Don't pass non-sequence
iterables to the storage API.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 29 Jul 2020 09:45:52 +0000
swh-loader-core (0.7.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.7.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-28 12:14:02
+0200)
* Upstream changes: - v0.7.1 - Apply rename of object_metadata
to raw_extrinsic_metadata.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 28 Jul 2020 10:16:56 +0000
swh-loader-core (0.6.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.6.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-23 11:12:29
+0200)
* Upstream changes: - v0.6.1 - npm.loader: Fix null author
parsing corner case - npm.loader: Fix author parsing corner case
- npm.loader: Extract _author_str function + add types, tests -
core.loader: docs: Update origin_add reference
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 23 Jul 2020 09:15:41 +0000
swh-loader-core (0.6.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.6.0 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2020-07-20 13:23:22 +0200)
* Upstream changes: - v0.6.0 - * Use the new
object_metadata_add endpoint instead of origin_metadata_add. - *
Apply renaming of MetadataAuthorityType.DEPOSIT to
MetadataAuthorityType.DEPOSIT_CLIENT.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 20 Jul 2020 11:27:53 +0000
swh-loader-core (0.5.10-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.10 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-17 15:10:42
+0200)
* Upstream changes: - v0.5.10 - test_init: Decrease assertion
checks so debian package builds fine - test_nixguix: Simplify
the nixguix specific check_snapshot function
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 17 Jul 2020 13:13:19 +0000
swh-loader-core (0.5.9-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.9 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-17 11:52:38
+0200)
* Upstream changes: - v0.5.9 - test.check_snapshot: Drop
accepting using dict for snapshot comparison - test: Check
against snapshot model object
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 17 Jul 2020 09:55:12 +0000
swh-loader-core (0.5.8-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.8 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-16 17:18:17
+0200)
* Upstream changes: - v0.5.8 - test_init: Use snapshot object
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 16 Jul 2020 15:20:49 +0000
swh-loader-core (0.5.7-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.7 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-16 16:10:57
+0200)
* Upstream changes: - v0.5.7 - test_init: Fix tests using the
latest swh-storage fixture
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 16 Jul 2020 14:14:59 +0000
swh-loader-core (0.5.5-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.5 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-15 12:34:09
+0200)
* Upstream changes: - v0.5.5 - check_snapshot: Check existence
down to contents - Expose a pytest_plugin module so other
loaders can reuse for tests - pytest: Remove no longer needed
pytest setup - Fix branches types in tests - Small code
improvement in package/loader.py
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 15 Jul 2020 10:37:11 +0000
swh-loader-core (0.5.4-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.4 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-10 09:52:21
+0200)
* Upstream changes: - v0.5.4 - Clean up the swh.scheduler /
swh.storage pytest plugin imports
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 10 Jul 2020 07:54:56 +0000
swh-loader-core (0.5.3-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.3 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-09 09:46:21
+0200)
* Upstream changes: - v0.5.3 - Update the revision metadata
field as an immutable dict - tests: Use dedicated storage and
scheduler fixtures - loaders.tests: Simplify and add coverage to
check_snapshot
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 09 Jul 2020 07:48:33 +0000
swh-loader-core (0.5.2-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.2 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-07 12:29:17
+0200)
* Upstream changes: - v0.5.2 - nixguix/loader: Check further
the source entry only if it's valid - nixguix/loader: Allow
version both as string or integer - Move remaining common test
utility functions to top-level arborescence - Move common test
utility function to the top-level arborescence - Define common
test helper function - Reuse swh.model.from_disk.iter_directory
function
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 07 Jul 2020 10:31:36 +0000
swh-loader-core (0.5.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.1 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-07-01 12:32:54
+0200)
* Upstream changes: - v0.5.1 - Use origin_add instead of
deprecated origin_add_one endpoint - Migrate to use object's
"object_type" field when computing objects
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 01 Jul 2020 10:34:59 +0000
swh-loader-core (0.5.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.5.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-06-29 13:18:41
+0200)
* Upstream changes: - v0.5.0 - loader*: Drop obsolete origin
visit fields
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 29 Jun 2020 11:20:59 +0000
swh-loader-core (0.4.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.4.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-06-23 15:02:20
+0200)
* Upstream changes: - v0.4.0 - loader: Retrieve latest
snapshot with snapshot-get-latest function
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 23 Jun 2020 13:14:09 +0000
swh-loader-core (0.3.2-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.3.2 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-06-22 15:13:05
+0200)
* Upstream changes: - v0.3.2 - Add helper function to ensure
loader visit are as expected
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 22 Jun 2020 13:15:41 +0000
swh-loader-core (0.3.1-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.3.1 - (tagged by Antoine Lambert
<antoine.lambert@inria.fr> on 2020-06-12 16:43:18 +0200)
* Upstream changes: - version 0.3.1
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 12 Jun 2020 14:47:42 +0000
swh-loader-core (0.3.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.3.0 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-06-12 11:05:41
+0200)
* Upstream changes: - v0.3.0 - Migrate to new
storage.origin_visit_add endpoint - loader: Migrate to origin
visit status - test_deposits: Fix origin_metadata_get which is a
paginated endpoint - Fix a potential UnboundLocalError in
clean_dangling_folders()
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 12 Jun 2020 09:08:17 +0000
swh-loader-core (0.2.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.2.0 - (tagged by David Douard
<david.douard@sdfa3.org> on 2020-06-04 14:20:08 +0200)
* Upstream changes: - v0.2.0
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 04 Jun 2020 12:25:57 +0000
swh-loader-core (0.1.0-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.1.0 - (tagged by Nicolas Dandrimont
<nicolas@dandrimont.eu> on 2020-05-29 16:01:11 +0200)
* Upstream changes: - Release swh.loader.core v0.1.0 - Make
sure partial visits don't reference unloaded snapshots - Ensure
proper behavior when loading into partial archives (e.g. staging)
- Improve test coverage
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 29 May 2020 14:05:36 +0000
swh-loader-core (0.0.97-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.97 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-05-26 14:22:51
+0200)
* Upstream changes: - v0.0.97 - nixguix: catch and log
artifact resolution failures - nixguix: Override known_artifacts
to filter out "evaluation" branch - nixguix.tests: Add missing
__init__ file
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 26 May 2020 12:25:35 +0000
swh-loader-core (0.0.96-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.96 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2020-05-19 18:42:23 +0200)
* Upstream changes: - v0.0.96 - * Pass bytes instead a dict to
origin_metadata_add.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 19 May 2020 16:45:03 +0000
swh-loader-core (0.0.95-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.95 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2020-05-19 14:44:01 +0200)
* Upstream changes: - v0.0.95 - * Use the new swh-storage API
for storing metadata.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 19 May 2020 12:47:48 +0000
swh-loader-core (0.0.94-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.94 - (tagged by Antoine R. Dumont
(@ardumont) <ardumont@softwareheritage.org> on 2020-05-15 12:49:22
+0200)
* Upstream changes: - v0.0.94 - deposit: Adapt loader to use
the latest deposit update api - tests: Use proper date
initialization - setup.py: add documentation link
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 15 May 2020 10:52:16 +0000
swh-loader-core (0.0.93-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.93 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-04-23 16:43:16
+0200)
* Upstream changes: - v0.0.93 - deposit.loader: Build revision
out of the deposit api read metadata
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 23 Apr 2020 14:46:48 +0000
swh-loader-core (0.0.92-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.92 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-04-23 11:49:30
+0200)
* Upstream changes: - v0.0.92 - deposit.loader: Fix revision
metadata redundancy in deposit metadata - loader.deposit:
Clarify FIXME intent - test_nixguix: Remove the incorrect fixme
- test_nixguix: Add a fixme note on test_loader_two_visits -
package.nixguix: Ensure the revisions are structurally sound
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 23 Apr 2020 09:52:18 +0000
swh-loader-core (0.0.91-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.91 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-04-21 15:59:55
+0200)
* Upstream changes: - v0.0.91 - deposit.loader: Fix committer
date appropriately - tests_deposit: Define specific
requests_mock_datadir fixture - nixguix: Move helper function
below the class definition - setup: Update the minimum required
runtime python3 version
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 21 Apr 2020 14:02:51 +0000
swh-loader-core (0.0.90-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.90 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-04-15 14:27:01
+0200)
* Upstream changes: - v0.0.90 - Improve exception handling
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 15 Apr 2020 12:30:07 +0000
swh-loader-core (0.0.89-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.89 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-04-14 15:48:15
+0200)
* Upstream changes: - v0.0.89 - package.utils: Define a
timeout on download connections - package.loader: Clear proxy
buffer state when failing to load revision - Fix a couple of
storage args deprecation warnings - cli: Sort loaders list and
fix some tests - Add a pyproject.toml file to target py37 for
black - Enable black
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 14 Apr 2020 15:30:08 +0000
swh-loader-core (0.0.88-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.88 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-04-03 15:52:07
+0200)
* Upstream changes: - v0.0.88 - v0.0.88 nixguix: validate and
clean sources.json structure
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 03 Apr 2020 13:54:24 +0000
swh-loader-core (0.0.87-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.87 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-04-02 14:37:37
+0200)
* Upstream changes: - v0.0.87 - nixguix: rename the `url`
source attribute to `urls` - nixguix: rename the test file -
nixguix: add the integrity attribute in release metadata
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 02 Apr 2020 12:39:58 +0000
swh-loader-core (0.0.86-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.86 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-03-26 16:15:24
+0100)
* Upstream changes: - v0.0.86 - core.loader: Remove
origin_visit_update call from DVCSLoader class
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 26 Mar 2020 15:19:29 +0000
swh-loader-core (0.0.85-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.85 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-03-26 15:36:58
+0100)
* Upstream changes: - v0.0.85 - core.loader: Allow core loader
to update origin_visit in one call - Rename the functional
loader to nixguix loader
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 26 Mar 2020 14:43:17 +0000
swh-loader-core (0.0.84-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.84 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-03-24 11:29:49
+0100)
* Upstream changes: - v0.0.84 - test: Use storage endpoint to
check latest origin visit status - package.loader: Fix status
visit to 'partial' - package.loader: add a test to reproduce
EOFError error
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 24 Mar 2020 10:32:55 +0000
swh-loader-core (0.0.83-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.83 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-03-23 15:16:14
+0100)
* Upstream changes: - v0.0.83 - Make the swh.loader.package
exception handling more granular - package.loader: Reference a
snapshot on partial visit - package.loader: Extract a
_load_snapshot method - functional: create a branch named
evaluation pointing to the evaluation commit - package.loader:
add extra_branches method
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 23 Mar 2020 14:19:43 +0000
swh-loader-core (0.0.82-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.82 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-03-18 11:55:48
+0100)
* Upstream changes: - v0.0.82 - functional.loader: Add loader
- package.loader: ignore non tarball source
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 18 Mar 2020 10:59:38 +0000
swh-loader-core (0.0.81-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.81 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-03-16 13:14:33
+0100)
* Upstream changes: - v0.0.81 - Migrate to latest
storage.origin_visit_add api change - Move Person parsing to swh-
model.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 16 Mar 2020 12:17:43 +0000
swh-loader-core (0.0.80-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.80 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2020-02-28 17:05:14 +0100)
* Upstream changes: - v0.0.80 - * use swh-model objects
instead of dicts.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 28 Feb 2020 16:10:06 +0000
swh-loader-core (0.0.79-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.79 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-02-25 11:40:05
+0100)
* Upstream changes: - v0.0.79 - Move revision loading logic to
its own function. - Use swh-storage validation proxy earlier in
the pipeline. - Use swh-storage validation proxy. - Add
missing __init__.py and fix tests.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 25 Feb 2020 10:48:07 +0000
swh-loader-core (0.0.78-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.78 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-02-06 15:28:11
+0100)
* Upstream changes: - v0.0.78 - tests: Use new get_storage
signature - loader.core.converters: Prefer the with open pattern
to read file - test_converters: Add coverage on prepare_contents
method - test_converters: Migrate to pytest -
loader.core/package: Call storage's (skipped_)content_add endpoints
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 06 Feb 2020 15:09:05 +0000
swh-loader-core (0.0.77-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.77 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-01-30 10:32:08
+0100)
* Upstream changes: - v0.0.77 - loader.npm: If no upload time
provided, use artifact's mtime if provided - loader.npm: Fail
ingestion if at least 1 artifact has no upload time
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 30 Jan 2020 09:37:58 +0000
swh-loader-core (0.0.76-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.76 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-01-28 13:07:30
+0100)
* Upstream changes: - v0.0.76 - npm.loader: Skip artifacts
with no intrinsic metadata - pypi.loader: Skip artifacts with no
intrinsic metadata - package.loader: Fix edge case when some
listing returns no content - core.loader: Drop retro-
compatibility class names - loader.tests: Add filter and buffer
proxy storage - docs: Fix sphinx warnings - README: Update
class names
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 28 Jan 2020 12:11:07 +0000
swh-loader-core (0.0.75-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.75 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-01-16 14:14:29
+0100)
* Upstream changes: - v0.0.75 - cran.loader: Align cran loader
with other package loaders
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 16 Jan 2020 13:17:30 +0000
swh-loader-core (0.0.74-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.74 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-01-15 15:30:13
+0100)
* Upstream changes: - v0.0.74 - Drop no longer used retrying
dependency - core.loader: Clean up indirection and retry
behavior - tests: Use retry proxy storage in loaders -
core.loader: Drop dead code - cran.loader: Fix parsing
description file error
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 15 Jan 2020 14:33:57 +0000
swh-loader-core (0.0.73-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.73 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-01-09 10:00:21
+0100)
* Upstream changes: - v0.0.73 - package.cran: Name CRAN task
appropriately
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 09 Jan 2020 09:05:07 +0000
swh-loader-core (0.0.72-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.72 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2020-01-06 16:37:58
+0100)
* Upstream changes: - v0.0.72 - package.loader: Fail fast when
unable to create origin/origin_visit - cran.loader: Add
implementation
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 06 Jan 2020 15:50:08 +0000
swh-loader-core (0.0.71-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.71 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-20 14:22:31
+0100)
* Upstream changes: - v0.0.71 - package.utils: Drop unneeded
hashes from download computation
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 20 Dec 2019 13:26:09 +0000
swh-loader-core (0.0.70-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.70 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-20 11:32:09
+0100)
* Upstream changes: - v0.0.70 - debian.loader: Improve and fix
revision resolution's corner cases
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 20 Dec 2019 10:39:34 +0000
swh-loader-core (0.0.69-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.69 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-12 16:21:59
+0100)
* Upstream changes: - v0.0.69 - loader.core: Fix correctly
loader initialization
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 12 Dec 2019 15:26:13 +0000
swh-loader-core (0.0.68-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.68 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-12 15:45:21
+0100)
* Upstream changes: - v0.0.68 - loader.core: Fix
initialization issue in dvcs loaders
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 12 Dec 2019 14:49:12 +0000
swh-loader-core (0.0.67-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.67 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-12 14:02:47
+0100)
* Upstream changes: - v0.0.67 - loader.core: Type methods -
loader.core: Transform data input into list - loader.core: Add
missing conversion step on content
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 12 Dec 2019 13:07:47 +0000
swh-loader-core (0.0.66-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.66 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-12 12:01:14
+0100)
* Upstream changes: - v0.0.66 - Drop deprecated behavior
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 12 Dec 2019 11:05:17 +0000
swh-loader-core (0.0.65-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.65 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-12 11:42:46
+0100)
* Upstream changes: - v0.0.65 - loader.cli: Improve current
implementation - tasks: Enforce kwargs use in task message
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 12 Dec 2019 10:51:02 +0000
swh-loader-core (0.0.64-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.64 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-10 09:49:06
+0100)
* Upstream changes: - v0.0.64 - requirements-test: Add missing
test dependency - tests: Refactor using pytest-mock's mocker
fixture - loader.cli: Add tests around cli - package.npm:
Align loader instantiation - loader.cli: Reference new loader
cli
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 10 Dec 2019 08:56:02 +0000
swh-loader-core (0.0.63-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.63 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-05 16:01:49
+0100)
* Upstream changes: - v0.0.63 - Add missing inclusion
instruction
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 05 Dec 2019 15:05:39 +0000
swh-loader-core (0.0.62-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.62 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-12-05 15:46:46
+0100)
* Upstream changes: - v0.0.62 - Move package loaders to their
own namespace
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 05 Dec 2019 14:50:19 +0000
swh-loader-core (0.0.61-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.61 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-28 17:25:49
+0100)
* Upstream changes: - v0.0.61 - pypi: metadata -> revision:
Deal with previous metadata format - npm: metadata -> revision:
Deal with previous metadata format
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 28 Nov 2019 16:29:47 +0000
swh-loader-core (0.0.60-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.60 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-26 12:09:28
+0100)
* Upstream changes: - v0.0.60 - package.deposit: Fix revision-
get inconsistency - package.deposit: Provide parents in any case
- package.deposit: Fix url computation issue - utils: Work
around header issue during download
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 26 Nov 2019 11:18:41 +0000
swh-loader-core (0.0.59-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.59 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-22 18:11:33
+0100)
* Upstream changes: - v0.0.59 - npm: Explicitly retrieve the
revision date from extrinsic metadata
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 22 Nov 2019 17:15:34 +0000
swh-loader-core (0.0.58-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.58 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-22 12:08:10
+0100)
* Upstream changes: - v0.0.58 - package.pypi: Filter out non-
sdist package type
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 22 Nov 2019 11:11:56 +0000
swh-loader-core (0.0.57-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.57 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-22 11:26:11
+0100)
* Upstream changes: - v0.0.57 - package.pypi: Fix project url
computation edge case - Use pkg_resources to get the package
version instead of vcversioner
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 22 Nov 2019 10:31:11 +0000
swh-loader-core (0.0.56-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.56 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-21 16:12:46
+0100)
* Upstream changes: - v0.0.56 - package.tasks: Rename
appropriately load_deb_package task type name - Fix typos
reported by codespell - Add a pre-commit config file
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 21 Nov 2019 15:16:23 +0000
swh-loader-core (0.0.55-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.55 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-21 13:51:03
+0100)
* Upstream changes: - v0.0.55 - package.tasks: Rename
load_archive into load_archive_files - Migrate tox.ini to extras
= xxx instead of deps = .[testing] - Merge tox test environments
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 21 Nov 2019 12:56:07 +0000
swh-loader-core (0.0.54-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.54 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-21 11:29:20
+0100)
* Upstream changes: - v0.0.54 - loader.package.deposit: Drop
swh.deposit.client requirement - Include all requirements in
MANIFEST.in
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 21 Nov 2019 10:32:23 +0000
swh-loader-core (0.0.53-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.53 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-20 14:26:36
+0100)
* Upstream changes: - v0.0.53 - loader.package.tasks: Document
tasks - Define correctly the setup.py's entry_points
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 20 Nov 2019 13:30:10 +0000
swh-loader-core (0.0.52-1~swh3) unstable-swh; urgency=medium
* Update dh-python version constraint
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 20 Nov 2019 12:03:00 +0100
swh-loader-core (0.0.52-1~swh2) unstable-swh; urgency=medium
* Add egg-info to pybuild.testfiles.
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 20 Nov 2019 11:42:42 +0100
swh-loader-core (0.0.52-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.52 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-19 15:15:40
+0100)
* Upstream changes: - v0.0.52 - Ensure BufferedLoader and
UnbufferedLoader do flush their storage - loader.package:
Register loader package tasks - package.tasks: Rename debian
task to load_deb
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 19 Nov 2019 14:18:41 +0000
swh-loader-core (0.0.51-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.51 - (tagged by David Douard
<david.douard@sdfa3.org> on 2019-11-18 17:05:17 +0100)
* Upstream changes: - v0.0.51
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 18 Nov 2019 16:09:44 +0000
swh-loader-core (0.0.50-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.50 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-13 15:56:55
+0100)
* Upstream changes: - v0.0.50 - package.loader: Check
snapshot_id is set as returned value - package.loader: Ensure
the origin visit type is set appropriately - package.loader: Fix
serialization issue - package.debian: Align origin_visit type to
'deb' as in production
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 13 Nov 2019 15:04:37 +0000
swh-loader-core (0.0.49-1~swh2) unstable-swh; urgency=medium
* Update dependencies
-- Antoine R. Dumont <antoine.romain.dumont@gmail.com> Fri, 08 Nov 2019 14:07:20 +0100
swh-loader-core (0.0.49-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.49 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-11-08 13:21:56
+0100)
* Upstream changes: - v0.0.49 - New package loader
implementations: archive, pypi, npm, deposit, debian
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 08 Nov 2019 12:29:47 +0000
swh-loader-core (0.0.48-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.48 - (tagged by Stefano Zacchiroli
<zack@upsilon.cc> on 2019-10-01 16:49:39 +0200)
* Upstream changes: - v0.0.48 - * typing: minimal changes to
make a no-op mypy run pass
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 01 Oct 2019 14:52:59 +0000
swh-loader-core (0.0.47-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.47 - (tagged by Antoine Lambert
<antoine.lambert@inria.fr> on 2019-10-01 11:32:50 +0200)
* Upstream changes: - version 0.0.47: Workaround HashCollision
errors
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 01 Oct 2019 09:35:38 +0000
swh-loader-core (0.0.46-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.46 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-09-06 18:30:42
+0200)
* Upstream changes: - v0.0.46 - pytest.ini: Remove warnings
about our custom markers - pep8: Fix log.warning calls -
core/loader: Fix get_save_data_path implementation - Fix
validation errors in test.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 06 Sep 2019 16:33:13 +0000
swh-loader-core (0.0.45-1~swh2) unstable-swh; urgency=medium
* Fix missing build dependency
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Tue, 03 Sep 2019 14:12:13 +0200
swh-loader-core (0.0.45-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.45 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-09-03 10:38:36
+0200)
* Upstream changes: - v0.0.45 - loader: Provide visit type
when calling origin_visit_add - loader: Drop keys 'perms' and
'path' from content before sending to the - storage -
swh.loader.package: Implement GNU loader - docs: add code of
conduct document
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Tue, 03 Sep 2019 08:41:49 +0000
swh-loader-core (0.0.44-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.44 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2019-06-25 12:18:27 +0200)
* Upstream changes: - Drop use of deprecated methods
fetch_history_*
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 26 Jun 2019 09:40:59 +0000
swh-loader-core (0.0.43-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.43 - (tagged by Valentin Lorentz
<vlorentz@softwareheritage.org> on 2019-06-18 16:21:58 +0200)
* Upstream changes: - Use origin urls instead of origin ids.
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 19 Jun 2019 09:33:53 +0000
swh-loader-core (0.0.42-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.42 - (tagged by David Douard
<david.douard@sdfa3.org> on 2019-05-20 11:28:49 +0200)
* Upstream changes: - v0.0.42 - update/fix requirements
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Mon, 20 May 2019 09:33:47 +0000
swh-loader-core (0.0.41-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.41 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-04-11 11:46:00
+0200)
* Upstream changes: - v0.0.41 - core.loader: Migrate to latest
snapshot_add, origin_visit_update api - core.loader: Count only
the effectively new objects ingested - test_utils: Add coverage
on utils module
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Thu, 11 Apr 2019 09:52:55 +0000
swh-loader-core (0.0.40-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.40 - (tagged by Antoine Lambert
<antoine.lambert@inria.fr> on 2019-03-29 10:57:14 +0100)
* Upstream changes: - version 0.0.40
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Fri, 29 Mar 2019 10:02:37 +0000
swh-loader-core (0.0.39-1~swh1) unstable-swh; urgency=medium
* New upstream release 0.0.39 - (tagged by Antoine R. Dumont
(@ardumont) <antoine.romain.dumont@gmail.com> on 2019-01-30 11:10:39
+0100)
* Upstream changes: - v0.0.39
-- Software Heritage autobuilder (on jenkins-debian1) <jenkins@jenkins-debian1.internal.softwareheritage.org> Wed, 30 Jan 2019 10:13:56 +0000
swh-loader-core (0.0.35-1~swh1) unstable-swh; urgency=medium
* v0.0.35
* tests: Initialize tox.ini use
* tests, debian/*: Migrate to pytest
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Tue, 23 Oct 2018 15:47:22 +0200
swh-loader-core (0.0.34-1~swh1) unstable-swh; urgency=medium
* v0.0.34
* setup: prepare for PyPI upload
* README.md: Simplify module description
* core.tests: Install tests fixture for derivative loaders to use
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Tue, 09 Oct 2018 14:11:29 +0200
swh-loader-core (0.0.33-1~swh1) unstable-swh; urgency=medium
* v0.0.33
* loader/utils: Add clean_dangling_folders function to ease clean up
* loader/core: Add optional pre_cleanup for dangling files cleaning
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Fri, 09 Mar 2018 14:41:17 +0100
swh-loader-core (0.0.32-1~swh1) unstable-swh; urgency=medium
* v0.0.32
* Improve origin_visit initialization step
* Properly sandbox the prepare statement so that if it breaks, we can
* update appropriately the visit with the correct status
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 07 Mar 2018 11:06:27 +0100
swh-loader-core (0.0.31-1~swh1) unstable-swh; urgency=medium
* Release swh.loader.core v0.0.31
* Remove backwards-compatibility when sending snapshots
-- Nicolas Dandrimont <nicolas@dandrimont.eu> Tue, 13 Feb 2018 18:52:20 +0100
swh-loader-core (0.0.30-1~swh1) unstable-swh; urgency=medium
* Release swh.loader.core v0.0.30
* Update Debian metadata for snapshot-related breakage
-- Nicolas Dandrimont <nicolas@dandrimont.eu> Tue, 06 Feb 2018 14:22:53 +0100
swh-loader-core (0.0.29-1~swh1) unstable-swh; urgency=medium
* Release swh.loader.core v0.0.29
* Replace occurrences with snapshots
* Enhance logging on error cases
-- Nicolas Dandrimont <nicolas@dandrimont.eu> Tue, 06 Feb 2018 14:13:11 +0100
swh-loader-core (0.0.28-1~swh1) unstable-swh; urgency=medium
* v0.0.28
* Add stateless loader base class
* Remove bare exception handlers
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Tue, 19 Dec 2017 17:48:09 +0100
swh-loader-core (0.0.27-1~swh1) unstable-swh; urgency=medium
* v0.0.27
* Migrate from indexer's indexer_configuration to storage's tool
notion.
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Thu, 07 Dec 2017 10:36:23 +0100
swh-loader-core (0.0.26-1~swh1) unstable-swh; urgency=medium
* v0.0.26
* Fix send_provider method
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Tue, 05 Dec 2017 15:40:57 +0100
swh-loader-core (0.0.25-1~swh1) unstable-swh; urgency=medium
* v0.0.25
* swh.loader.core: Fix to retrieve the provider_id as an actual id
* swh.loader.core: Fix log format error
* swh.loader.core: Align log message according to conventions
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 29 Nov 2017 12:55:45 +0100
swh-loader-core (0.0.24-1~swh1) unstable-swh; urgency=medium
* v0.0.24
* Added metadata injection possible from loader core
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Fri, 24 Nov 2017 11:35:40 +0100
swh-loader-core (0.0.23-1~swh1) unstable-swh; urgency=medium
* v0.0.23
* loader: Fix dangling data flush
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Tue, 07 Nov 2017 16:25:20 +0100
swh-loader-core (0.0.22-1~swh1) unstable-swh; urgency=medium
* v0.0.22
* core.loader: Use the global setup set in swh.core.config
* core.loader: Properly batch object insertions for big requests
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Mon, 30 Oct 2017 18:50:00 +0100
swh-loader-core (0.0.21-1~swh1) unstable-swh; urgency=medium
* v0.0.21
* swh.loader.core: Only send origin if not already sent before
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Tue, 24 Oct 2017 16:30:53 +0200
swh-loader-core (0.0.20-1~swh1) unstable-swh; urgency=medium
* v0.0.20
* Permit to add 'post_load' actions in loaders
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Fri, 13 Oct 2017 14:30:37 +0200
swh-loader-core (0.0.19-1~swh1) unstable-swh; urgency=medium
* v0.0.19
* Permit to add 'post_load' actions in loaders
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Fri, 13 Oct 2017 14:14:14 +0200
swh-loader-core (0.0.18-1~swh1) unstable-swh; urgency=medium
* Release swh.loader.core version 0.0.18
* Update packaging runes
-- Nicolas Dandrimont <nicolas@dandrimont.eu> Thu, 12 Oct 2017 18:07:53 +0200
swh-loader-core (0.0.17-1~swh1) unstable-swh; urgency=medium
* Release swh.loader.core v0.0.17
* Allow iterating when fetching and storing data
* Allow overriding the status of the loaded visit
* Allow overriding the status of the load itself
-- Nicolas Dandrimont <nicolas@dandrimont.eu> Wed, 11 Oct 2017 16:38:29 +0200
swh-loader-core (0.0.16-1~swh1) unstable-swh; urgency=medium
* Release swh.loader.core v0.0.16
* Migrate from swh.model.git to swh.model.from_disk
-- Nicolas Dandrimont <nicolas@dandrimont.eu> Fri, 06 Oct 2017 14:46:41 +0200
swh-loader-core (0.0.15-1~swh1) unstable-swh; urgency=medium
* v0.0.15
* docs: Add sphinx apidoc generation skeleton
* docs: Add a simple README.md explaining the module's goal
* swh.loader.core.loader: Unify origin_visit add/update function call
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Fri, 29 Sep 2017 11:47:37 +0200
swh-loader-core (0.0.14-1~swh1) unstable-swh; urgency=medium
* v0.0.14
* Add the blake2s256 hash computation
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Sat, 25 Mar 2017 18:20:52 +0100
swh-loader-core (0.0.13-1~swh1) unstable-swh; urgency=medium
* v0.0.13
* Improve core loader's interface api
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 22 Feb 2017 13:43:54 +0100
swh-loader-core (0.0.12-1~swh1) unstable-swh; urgency=medium
* v0.0.12
* Update storage configuration reading
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Thu, 15 Dec 2016 18:34:41 +0100
swh-loader-core (0.0.11-1~swh1) unstable-swh; urgency=medium
* v0.0.11
* d/control: Bump dependency to latest storage
* Fix: Objects can be injected even though global loading failed
* Populate the counters in fetch_history
* Open open/close fetch_history function in the core loader
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 24 Aug 2016 14:38:55 +0200
swh-loader-core (0.0.10-1~swh1) unstable-swh; urgency=medium
* v0.0.10
* d/control: Update dependency
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Sat, 11 Jun 2016 02:26:50 +0200
swh-loader-core (0.0.9-1~swh1) unstable-swh; urgency=medium
* v0.0.9
* Improve default task that initialize storage as well
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Fri, 10 Jun 2016 15:12:14 +0200
swh-loader-core (0.0.8-1~swh1) unstable-swh; urgency=medium
* v0.0.8
* Migrate specific converter to the right module
* Fix dangling parameter
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 08 Jun 2016 18:09:23 +0200
swh-loader-core (0.0.7-1~swh1) unstable-swh; urgency=medium
* v0.0.7
* Fix on revision conversion
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 08 Jun 2016 16:19:02 +0200
swh-loader-core (0.0.6-1~swh1) unstable-swh; urgency=medium
* v0.0.6
* d/control: Bump dependency on swh-model
* d/control: Add missing description
* Keep the abstraction for all entities
* Align parameter definition order
* Fix missing option in DEFAULT ones
* Decrease verbosity
* Fix missing origin_id assignment
* d/rules: Add target to run tests during packaging
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 08 Jun 2016 16:00:40 +0200
swh-loader-core (0.0.5-1~swh1) unstable-swh; urgency=medium
* v0.0.5
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 25 May 2016 12:17:06 +0200
swh-loader-core (0.0.4-1~swh1) unstable-swh; urgency=medium
* v0.0.4
* Rename package from python3-swh.loader to python3-swh.loader.core
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 25 May 2016 11:44:48 +0200
swh-loader-core (0.0.3-1~swh1) unstable-swh; urgency=medium
* v0.0.3
* Improve default configuration
* Rename package from swh-loader-vcs to swh-loader
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 25 May 2016 11:23:06 +0200
swh-loader-core (0.0.2-1~swh1) unstable-swh; urgency=medium
* v0.0.2
* Fix: Flush data even when no data is sent to swh-storage
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Tue, 24 May 2016 16:41:49 +0200
swh-loader-core (0.0.1-1~swh1) unstable-swh; urgency=medium
* Initial release
* v0.0.1
-- Antoine R. Dumont (@ardumont) <antoine.romain.dumont@gmail.com> Wed, 13 Apr 2016 16:54:47 +0200
diff --git a/setup.cfg b/setup.cfg
index 1d722c2..f65ba0a 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -1,8 +1,9 @@
[flake8]
-ignore = E203,E231,W503
+select = C,E,F,W,B950
+ignore = E203,E231,E501,W503
max-line-length = 88
[egg_info]
tag_build =
tag_date = 0
diff --git a/swh.loader.core.egg-info/PKG-INFO b/swh.loader.core.egg-info/PKG-INFO
index a92a4b5..95e9abe 100644
--- a/swh.loader.core.egg-info/PKG-INFO
+++ b/swh.loader.core.egg-info/PKG-INFO
@@ -1,56 +1,56 @@
Metadata-Version: 2.1
Name: swh.loader.core
-Version: 2.6.1
+Version: 2.6.2
Summary: Software Heritage Base Loader
Home-page: https://forge.softwareheritage.org/diffusion/DLDBASE
Author: Software Heritage developers
Author-email: swh-devel@inria.fr
License: UNKNOWN
Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest
Project-URL: Funding, https://www.softwareheritage.org/donate
Project-URL: Source, https://forge.softwareheritage.org/source/swh-loader-core
Project-URL: Documentation, https://docs.softwareheritage.org/devel/swh-loader-core/
Platform: UNKNOWN
Classifier: Programming Language :: Python :: 3
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3)
Classifier: Operating System :: OS Independent
Classifier: Development Status :: 5 - Production/Stable
Requires-Python: >=3.7
Description-Content-Type: text/markdown
Provides-Extra: testing
License-File: LICENSE
License-File: AUTHORS
Software Heritage - Loader foundations
======================================
The Software Heritage Loader Core is a low-level loading utilities and
helpers used by :term:`loaders <loader>`.
The main entry points are classes:
- :class:`swh.loader.core.loader.BaseLoader` for loaders (e.g. svn)
- :class:`swh.loader.core.loader.DVCSLoader` for DVCS loaders (e.g. hg, git, ...)
- :class:`swh.loader.package.loader.PackageLoader` for Package loaders (e.g. PyPI, Npm, ...)
Package loaders
---------------
This package also implements many package loaders directly, out of convenience,
as they usually are quite similar and each fits in a single file.
They all roughly follow these steps, explained in the
:py:meth:`swh.loader.package.loader.PackageLoader.load` documentation.
See the :ref:`package-loader-tutorial` for details.
VCS loaders
-----------
Unlike package loaders, VCS loaders remain in separate packages,
as they often need more advanced conversions and very VCS-specific operations.
This usually involves getting the branches of a repository and recursively loading
revisions in the history (and directory trees in these revisions),
until a known revision is found
diff --git a/swh.loader.core.egg-info/SOURCES.txt b/swh.loader.core.egg-info/SOURCES.txt
index 7a9d27f..df7acc5 100644
--- a/swh.loader.core.egg-info/SOURCES.txt
+++ b/swh.loader.core.egg-info/SOURCES.txt
@@ -1,218 +1,219 @@
+.git-blame-ignore-revs
.gitignore
.pre-commit-config.yaml
AUTHORS
CODE_OF_CONDUCT.md
CONTRIBUTORS
LICENSE
MANIFEST.in
Makefile
README.rst
conftest.py
mypy.ini
pyproject.toml
pytest.ini
requirements-swh.txt
requirements-test.txt
requirements.txt
setup.cfg
setup.py
tox.ini
docs/.gitignore
docs/Makefile
docs/README.rst
docs/cli.rst
docs/conf.py
docs/index.rst
docs/package-loader-specifications.rst
docs/package-loader-tutorial.rst
docs/vcs-loader-overview.rst
docs/_static/.placeholder
docs/_templates/.placeholder
swh/__init__.py
swh.loader.core.egg-info/PKG-INFO
swh.loader.core.egg-info/SOURCES.txt
swh.loader.core.egg-info/dependency_links.txt
swh.loader.core.egg-info/entry_points.txt
swh.loader.core.egg-info/requires.txt
swh.loader.core.egg-info/top_level.txt
swh/loader/__init__.py
swh/loader/cli.py
swh/loader/exception.py
swh/loader/pytest_plugin.py
swh/loader/core/__init__.py
swh/loader/core/converters.py
swh/loader/core/loader.py
swh/loader/core/py.typed
swh/loader/core/utils.py
swh/loader/core/tests/__init__.py
swh/loader/core/tests/test_converters.py
swh/loader/core/tests/test_loader.py
swh/loader/core/tests/test_utils.py
swh/loader/package/__init__.py
swh/loader/package/loader.py
swh/loader/package/py.typed
swh/loader/package/utils.py
swh/loader/package/archive/__init__.py
swh/loader/package/archive/loader.py
swh/loader/package/archive/tasks.py
swh/loader/package/archive/tests/__init__.py
swh/loader/package/archive/tests/test_archive.py
swh/loader/package/archive/tests/test_tasks.py
swh/loader/package/archive/tests/data/not_gzipped_tarball.tar.gz
swh/loader/package/archive/tests/data/https_ftp.gnu.org/gnu_8sync_8sync-0.1.0.tar.gz
swh/loader/package/archive/tests/data/https_ftp.gnu.org/gnu_8sync_8sync-0.1.0.tar.gz_visit1
swh/loader/package/archive/tests/data/https_ftp.gnu.org/gnu_8sync_8sync-0.1.0.tar.gz_visit2
swh/loader/package/archive/tests/data/https_ftp.gnu.org/gnu_8sync_8sync-0.2.0.tar.gz
swh/loader/package/cran/__init__.py
swh/loader/package/cran/loader.py
swh/loader/package/cran/tasks.py
swh/loader/package/cran/tests/__init__.py
swh/loader/package/cran/tests/test_cran.py
swh/loader/package/cran/tests/test_tasks.py
swh/loader/package/cran/tests/data/description/KnownBR
swh/loader/package/cran/tests/data/description/acepack
swh/loader/package/cran/tests/data/https_cran.r-project.org/src_contrib_1.4.0_Recommended_KernSmooth_2.22-6.tar.gz
swh/loader/package/debian/__init__.py
swh/loader/package/debian/loader.py
swh/loader/package/debian/tasks.py
swh/loader/package/debian/tests/__init__.py
swh/loader/package/debian/tests/test_debian.py
swh/loader/package/debian/tests/test_tasks.py
swh/loader/package/debian/tests/data/http_deb.debian.org/debian_pool_contrib_c_cicero_cicero_0.7.2-3.diff.gz
swh/loader/package/debian/tests/data/http_deb.debian.org/debian_pool_contrib_c_cicero_cicero_0.7.2-3.dsc
swh/loader/package/debian/tests/data/http_deb.debian.org/debian_pool_contrib_c_cicero_cicero_0.7.2-4.diff.gz
swh/loader/package/debian/tests/data/http_deb.debian.org/debian_pool_contrib_c_cicero_cicero_0.7.2-4.dsc
swh/loader/package/debian/tests/data/http_deb.debian.org/debian_pool_contrib_c_cicero_cicero_0.7.2.orig.tar.gz
swh/loader/package/debian/tests/data/http_deb.debian.org/onefile.txt
swh/loader/package/deposit/__init__.py
swh/loader/package/deposit/loader.py
swh/loader/package/deposit/tasks.py
swh/loader/package/deposit/tests/__init__.py
swh/loader/package/deposit/tests/conftest.py
swh/loader/package/deposit/tests/test_deposit.py
swh/loader/package/deposit/tests/test_tasks.py
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/1_private_666_meta
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/1_private_666_raw
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/1_private_777_meta
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/1_private_777_raw
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/1_private_888_meta
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/1_private_888_raw
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/1_private_999_meta
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/1_private_999_raw
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/hello-2.10.zip
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/hello-2.12.tar.gz
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/hello_2.10.json
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/hello_2.11.json
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/hello_2.12.json
swh/loader/package/deposit/tests/data/https_deposit.softwareheritage.org/hello_2.13.json
swh/loader/package/maven/__init__.py
swh/loader/package/maven/loader.py
swh/loader/package/maven/tasks.py
swh/loader/package/maven/tests/__init__.py
swh/loader/package/maven/tests/test_maven.py
swh/loader/package/maven/tests/test_tasks.py
swh/loader/package/maven/tests/data/https_maven.org/sprova4j-0.1.0-sources.jar
swh/loader/package/maven/tests/data/https_maven.org/sprova4j-0.1.0.pom
swh/loader/package/maven/tests/data/https_maven.org/sprova4j-0.1.1-sources.jar
swh/loader/package/maven/tests/data/https_maven.org/sprova4j-0.1.1.pom
swh/loader/package/nixguix/__init__.py
swh/loader/package/nixguix/loader.py
swh/loader/package/nixguix/tasks.py
swh/loader/package/nixguix/tests/__init__.py
swh/loader/package/nixguix/tests/conftest.py
swh/loader/package/nixguix/tests/test_nixguix.py
swh/loader/package/nixguix/tests/test_tasks.py
swh/loader/package/nixguix/tests/data/https_example.com/file.txt
swh/loader/package/nixguix/tests/data/https_fail.com/truncated-archive.tgz
swh/loader/package/nixguix/tests/data/https_ftp.gnu.org/gnu_8sync_8sync-0.1.0.tar.gz
swh/loader/package/nixguix/tests/data/https_ftp.gnu.org/gnu_8sync_8sync-0.1.0.tar.gz_visit1
swh/loader/package/nixguix/tests/data/https_ftp.gnu.org/gnu_8sync_8sync-0.1.0.tar.gz_visit2
swh/loader/package/nixguix/tests/data/https_ftp.gnu.org/gnu_8sync_8sync-0.2.0.tar.gz
swh/loader/package/nixguix/tests/data/https_github.com/owner-1_repository-1_revision-1.tgz
swh/loader/package/nixguix/tests/data/https_github.com/owner-2_repository-1_revision-1.tgz
swh/loader/package/nixguix/tests/data/https_github.com/owner-3_repository-1_revision-1.tgz
swh/loader/package/nixguix/tests/data/https_nix-community.github.io/nixpkgs-swh_sources-EOFError.json
swh/loader/package/nixguix/tests/data/https_nix-community.github.io/nixpkgs-swh_sources.json
swh/loader/package/nixguix/tests/data/https_nix-community.github.io/nixpkgs-swh_sources.json_visit1
swh/loader/package/nixguix/tests/data/https_nix-community.github.io/nixpkgs-swh_sources_special.json
swh/loader/package/nixguix/tests/data/https_nix-community.github.io/nixpkgs-swh_sources_special.json_visit1
swh/loader/package/npm/__init__.py
swh/loader/package/npm/loader.py
swh/loader/package/npm/tasks.py
swh/loader/package/npm/tests/__init__.py
swh/loader/package/npm/tests/test_npm.py
swh/loader/package/npm/tests/test_tasks.py
swh/loader/package/npm/tests/data/https_registry.npmjs.org/@aller_shared_-_shared-0.1.0.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/@aller_shared_-_shared-0.1.1-alpha.14.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/jammit-express_-_jammit-express-0.0.1.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/nativescript-telerik-analytics_-_nativescript-telerik-analytics-1.0.0.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/org_-_org-0.0.2.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/org_-_org-0.0.3-beta.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/org_-_org-0.0.3.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/org_-_org-0.0.4.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/org_-_org-0.0.5.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/org_-_org-0.1.0.tgz
swh/loader/package/npm/tests/data/https_registry.npmjs.org/org_-_org-0.2.0.tgz
swh/loader/package/npm/tests/data/https_replicate.npmjs.com/@aller_shared
swh/loader/package/npm/tests/data/https_replicate.npmjs.com/catify
swh/loader/package/npm/tests/data/https_replicate.npmjs.com/jammit-express
swh/loader/package/npm/tests/data/https_replicate.npmjs.com/jammit-no-time
swh/loader/package/npm/tests/data/https_replicate.npmjs.com/nativescript-telerik-analytics
swh/loader/package/npm/tests/data/https_replicate.npmjs.com/org
swh/loader/package/npm/tests/data/https_replicate.npmjs.com/org_version_mismatch
swh/loader/package/npm/tests/data/https_replicate.npmjs.com/org_visit1
swh/loader/package/opam/__init__.py
swh/loader/package/opam/loader.py
swh/loader/package/opam/tasks.py
swh/loader/package/opam/tests/__init__.py
swh/loader/package/opam/tests/test_opam.py
swh/loader/package/opam/tests/test_tasks.py
swh/loader/package/opam/tests/data/fake_opam_repo/_repo
swh/loader/package/opam/tests/data/fake_opam_repo/version
swh/loader/package/opam/tests/data/fake_opam_repo/repo/loadertest/lock
swh/loader/package/opam/tests/data/fake_opam_repo/repo/loadertest/repos-config
swh/loader/package/opam/tests/data/fake_opam_repo/repo/loadertest/packages/agrid/agrid.0.1/opam
swh/loader/package/opam/tests/data/fake_opam_repo/repo/loadertest/packages/directories/directories.0.1/opam
swh/loader/package/opam/tests/data/fake_opam_repo/repo/loadertest/packages/directories/directories.0.2/opam
swh/loader/package/opam/tests/data/fake_opam_repo/repo/loadertest/packages/directories/directories.0.3/opam
swh/loader/package/opam/tests/data/fake_opam_repo/repo/loadertest/packages/ocb/ocb.0.1/opam
swh/loader/package/opam/tests/data/https_github.com/OCamlPro_agrid_archive_0.1.tar.gz
swh/loader/package/opam/tests/data/https_github.com/OCamlPro_directories_archive_0.1.tar.gz
swh/loader/package/opam/tests/data/https_github.com/OCamlPro_directories_archive_0.2.tar.gz
swh/loader/package/opam/tests/data/https_github.com/OCamlPro_directories_archive_0.3.tar.gz
swh/loader/package/opam/tests/data/https_github.com/OCamlPro_ocb_archive_0.1.tar.gz
swh/loader/package/pypi/__init__.py
swh/loader/package/pypi/loader.py
swh/loader/package/pypi/tasks.py
swh/loader/package/pypi/tests/__init__.py
swh/loader/package/pypi/tests/test_pypi.py
swh/loader/package/pypi/tests/test_tasks.py
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/0805nexter-1.1.0.tar.gz
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/0805nexter-1.1.0.zip
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/0805nexter-1.2.0.zip
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/0805nexter-1.3.0.zip
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/0805nexter-1.4.0.zip
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/nexter-1.1.0.tar.gz
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/nexter-1.1.0.zip
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/packages_70_97_c49fb8ec24a7aaab54c3dbfbb5a6ca1431419d9ee0f6c363d9ad01d2b8b1_0805nexter-1.3.0.zip
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/packages_86_10_c9555ec63106153aaaad753a281ff47f4ac79e980ff7f5d740d6649cd56a_upymenu-0.0.1.tar.gz
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/packages_c4_a0_4562cda161dc4ecbbe9e2a11eb365400c0461845c5be70d73869786809c4_0805nexter-1.2.0.zip
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/packages_c4_a0_4562cda161dc4ecbbe9e2a11eb365400c0461845c5be70d73869786809c4_0805nexter-1.2.0.zip_visit1
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/packages_ec_65_c0116953c9a3f47de89e71964d6c7b0c783b01f29fa3390584dbf3046b4d_0805nexter-1.1.0.zip
swh/loader/package/pypi/tests/data/https_files.pythonhosted.org/packages_ec_65_c0116953c9a3f47de89e71964d6c7b0c783b01f29fa3390584dbf3046b4d_0805nexter-1.1.0.zip_visit1
swh/loader/package/pypi/tests/data/https_pypi.org/pypi_0805nexter_json
swh/loader/package/pypi/tests/data/https_pypi.org/pypi_0805nexter_json_visit1
swh/loader/package/pypi/tests/data/https_pypi.org/pypi_nexter_json
swh/loader/package/pypi/tests/data/https_pypi.org/pypi_upymenu_json
swh/loader/package/tests/__init__.py
swh/loader/package/tests/common.py
swh/loader/package/tests/test_conftest.py
swh/loader/package/tests/test_loader.py
swh/loader/package/tests/test_loader_metadata.py
swh/loader/package/tests/test_utils.py
swh/loader/tests/__init__.py
swh/loader/tests/conftest.py
swh/loader/tests/py.typed
swh/loader/tests/test_cli.py
swh/loader/tests/test_init.py
swh/loader/tests/data/0805nexter-1.1.0.tar.gz
\ No newline at end of file
diff --git a/swh/loader/cli.py b/swh/loader/cli.py
index 53c4a11..479aa33 100644
--- a/swh/loader/cli.py
+++ b/swh/loader/cli.py
@@ -1,134 +1,135 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
# WARNING: do not import unnecessary things here to keep cli startup time under
# control
import logging
from typing import Any
import click
import pkg_resources
from swh.core.cli import CONTEXT_SETTINGS
from swh.core.cli import swh as swh_cli_group
logger = logging.getLogger(__name__)
LOADERS = {
entry_point.name.split(".", 1)[1]: entry_point
for entry_point in pkg_resources.iter_entry_points("swh.workers")
if entry_point.name.split(".", 1)[0] == "loader"
}
SUPPORTED_LOADERS = sorted(list(LOADERS))
def get_loader(name: str, **kwargs) -> Any:
"""Given a loader name, instantiate it.
Args:
name: Loader's name
kwargs: Configuration dict (url...)
Returns:
An instantiated loader
"""
if name not in LOADERS:
raise ValueError(
"Invalid loader %s: only supported loaders are %s"
% (name, SUPPORTED_LOADERS)
)
registry_entry = LOADERS[name].load()()
logger.debug(f"registry: {registry_entry}")
loader_cls = registry_entry["loader"]
logger.debug(f"loader class: {loader_cls}")
return loader_cls.from_config(**kwargs)
@swh_cli_group.group(name="loader", context_settings=CONTEXT_SETTINGS)
@click.option(
"--config-file",
"-C",
default=None,
- type=click.Path(exists=True, dir_okay=False,),
+ type=click.Path(
+ exists=True,
+ dir_okay=False,
+ ),
help="Configuration file.",
)
@click.pass_context
def loader(ctx, config_file):
- """Loader cli tools
-
- """
+ """Loader cli tools"""
from os import environ
from swh.core.config import read
ctx.ensure_object(dict)
logger.debug("ctx: %s", ctx)
if not config_file:
config_file = environ.get("SWH_CONFIG_FILENAME")
ctx.obj["config"] = read(config_file)
logger.debug("config_file: %s", config_file)
logger.debug("config: ", ctx.obj["config"])
@loader.command(name="run", context_settings=CONTEXT_SETTINGS)
@click.argument("type", type=click.Choice(SUPPORTED_LOADERS))
@click.argument("url")
@click.argument("options", nargs=-1)
@click.pass_context
def run(ctx, type, url, options):
"""Ingest with loader <type> the origin located at <url>"""
import iso8601
from swh.scheduler.cli.utils import parse_options
conf = ctx.obj.get("config", {})
if "storage" not in conf:
raise ValueError("Missing storage configuration key")
(_, kw) = parse_options(options)
logger.debug(f"kw: {kw}")
visit_date = kw.get("visit_date")
if visit_date and isinstance(visit_date, str):
visit_date = iso8601.parse_date(visit_date)
kw["visit_date"] = visit_date
loader = get_loader(type, url=url, storage=conf["storage"], **kw)
result = loader.load()
msg = f"{result} for origin '{url}'"
directory = kw.get("directory")
if directory:
msg = msg + f" and directory '{directory}'"
click.echo(msg)
@loader.command(name="list", context_settings=CONTEXT_SETTINGS)
@click.argument("type", default="all", type=click.Choice(["all"] + SUPPORTED_LOADERS))
@click.pass_context
def list(ctx, type):
"""List supported loaders and optionally their arguments"""
import inspect
if type == "all":
loaders = ", ".join(SUPPORTED_LOADERS)
click.echo(f"Supported loaders: {loaders}")
else:
registry_entry = LOADERS[type].load()()
loader_cls = registry_entry["loader"]
doc = inspect.getdoc(loader_cls).strip()
# Hack to get the signature of the class even though it subclasses
# Generic, which reimplements __new__.
# See <https://bugs.python.org/issue40897>
signature = inspect.signature(loader_cls.__init__)
signature_str = str(signature).replace("self, ", "")
click.echo(f"Loader: {doc}\nsignature: {signature_str}")
diff --git a/swh/loader/core/loader.py b/swh/loader/core/loader.py
index 4763817..2e77fba 100644
--- a/swh/loader/core/loader.py
+++ b/swh/loader/core/loader.py
@@ -1,468 +1,462 @@
# Copyright (C) 2015-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import hashlib
import logging
import os
from typing import Any, Dict, Iterable, Optional
from swh.core.config import load_from_envvar
from swh.loader.exception import NotFound
from swh.model.model import (
BaseContent,
Content,
Directory,
Origin,
OriginVisit,
OriginVisitStatus,
Release,
Revision,
Sha1Git,
SkippedContent,
Snapshot,
)
from swh.storage import get_storage
from swh.storage.interface import StorageInterface
from swh.storage.utils import now
DEFAULT_CONFIG: Dict[str, Any] = {
"max_content_size": 100 * 1024 * 1024,
}
class BaseLoader:
"""Base class for (D)VCS loaders (e.g Svn, Git, Mercurial, ...) or PackageLoader (e.g
PyPI, Npm, CRAN, ...)
A loader retrieves origin information (git/mercurial/svn repositories, pypi/npm/...
package artifacts), ingests the contents/directories/revisions/releases/snapshot
read from those artifacts and sends them to the archive through the storage backend.
The main entry point for the loader is the :func:`load` function.
2 static methods (:func:`from_config`, :func:`from_configfile`) centralizes and
eases the loader instantiation from either configuration dict or configuration file.
Some class examples:
- :class:`SvnLoader`
- :class:`GitLoader`
- :class:`PyPILoader`
- :class:`NpmLoader`
"""
visit_date: Optional[datetime.datetime]
origin: Optional[Origin]
origin_metadata: Dict[str, Any]
loaded_snapshot_id: Optional[Sha1Git]
def __init__(
self,
storage: StorageInterface,
logging_class: Optional[str] = None,
save_data_path: Optional[str] = None,
max_content_size: Optional[int] = None,
):
super().__init__()
self.storage = storage
self.max_content_size = int(max_content_size) if max_content_size else None
if logging_class is None:
logging_class = "%s.%s" % (
self.__class__.__module__,
self.__class__.__name__,
)
self.log = logging.getLogger(logging_class)
_log = logging.getLogger("requests.packages.urllib3.connectionpool")
_log.setLevel(logging.WARN)
# possibly overridden in self.prepare method
self.visit_date = None
self.origin = None
if not hasattr(self, "visit_type"):
self.visit_type: Optional[str] = None
self.origin_metadata = {}
self.loaded_snapshot_id = None
if save_data_path:
path = save_data_path
os.stat(path)
if not os.access(path, os.R_OK | os.W_OK):
raise PermissionError("Permission denied: %r" % path)
self.save_data_path = save_data_path
@classmethod
def from_config(cls, storage: Dict[str, Any], **config: Any):
"""Instantiate a loader from a configuration dict.
This is basically a backwards-compatibility shim for the CLI.
Args:
storage: instantiation config for the storage
config: the configuration dict for the loader, with the following keys:
- credentials (optional): credentials list for the scheduler
- any other kwargs passed to the loader.
Returns:
the instantiated loader
"""
# Drop the legacy config keys which aren't used for this generation of loader.
for legacy_key in ("storage", "celery"):
config.pop(legacy_key, None)
# Instantiate the storage
storage_instance = get_storage(**storage)
return cls(storage=storage_instance, **config)
@classmethod
def from_configfile(cls, **kwargs: Any):
"""Instantiate a loader from the configuration loaded from the
SWH_CONFIG_FILENAME envvar, with potential extra keyword arguments if their
value is not None.
Args:
kwargs: kwargs passed to the loader instantiation
"""
config = dict(load_from_envvar(DEFAULT_CONFIG))
config.update({k: v for k, v in kwargs.items() if v is not None})
return cls.from_config(**config)
def save_data(self) -> None:
"""Save the data associated to the current load"""
raise NotImplementedError
def get_save_data_path(self) -> str:
"""The path to which we archive the loader's raw data"""
if not hasattr(self, "__save_data_path"):
year = str(self.visit_date.year) # type: ignore
assert self.origin
url = self.origin.url.encode("utf-8")
origin_url_hash = hashlib.sha1(url).hexdigest()
path = "%s/sha1:%s/%s/%s" % (
self.save_data_path,
origin_url_hash[0:2],
origin_url_hash,
year,
)
os.makedirs(path, exist_ok=True)
self.__save_data_path = path
return self.__save_data_path
def flush(self) -> None:
- """Flush any potential buffered data not sent to swh-storage.
-
- """
+ """Flush any potential buffered data not sent to swh-storage."""
self.storage.flush()
def cleanup(self) -> None:
- """Last step executed by the loader.
-
- """
+ """Last step executed by the loader."""
raise NotImplementedError
def prepare_origin_visit(self) -> None:
"""First step executed by the loader to prepare origin and visit
- references. Set/update self.origin, and
- optionally self.origin_url, self.visit_date.
+ references. Set/update self.origin, and
+ optionally self.origin_url, self.visit_date.
"""
raise NotImplementedError
def _store_origin_visit(self) -> None:
- """Store origin and visit references. Sets the self.visit references.
-
- """
+ """Store origin and visit references. Sets the self.visit references."""
assert self.origin
self.storage.origin_add([self.origin])
if not self.visit_date: # now as default visit_date if not provided
self.visit_date = datetime.datetime.now(tz=datetime.timezone.utc)
assert isinstance(self.visit_date, datetime.datetime)
assert isinstance(self.visit_type, str)
self.visit = list(
self.storage.origin_visit_add(
[
OriginVisit(
origin=self.origin.url,
date=self.visit_date,
type=self.visit_type,
)
]
)
)[0]
def prepare(self) -> None:
"""Second step executed by the loader to prepare some state needed by
the loader.
Raises
NotFound exception if the origin to ingest is not found.
"""
raise NotImplementedError
def get_origin(self) -> Origin:
"""Get the origin that is currently being loaded.
self.origin should be set in :func:`prepare_origin_visit`
Returns:
dict: an origin ready to be sent to storage by
:func:`origin_add`.
"""
assert self.origin
return self.origin
def fetch_data(self) -> bool:
"""Fetch the data from the source the loader is currently loading
(ex: git/hg/svn/... repository).
Returns:
a value that is interpreted as a boolean. If True, fetch_data needs
to be called again to complete loading.
"""
raise NotImplementedError
def store_data(self):
"""Store fetched data in the database.
Should call the :func:`maybe_load_xyz` methods, which handle the
bundles sent to storage, rather than send directly.
"""
raise NotImplementedError
def store_metadata(self) -> None:
"""Store fetched metadata in the database.
For more information, see implementation in :class:`DepositLoader`.
"""
pass
def load_status(self) -> Dict[str, str]:
"""Detailed loading status.
Defaults to logging an eventful load.
Returns: a dictionary that is eventually passed back as the task's
result to the scheduler, allowing tuning of the task recurrence
mechanism.
"""
return {
"status": "eventful",
}
def post_load(self, success: bool = True) -> None:
"""Permit the loader to do some additional actions according to status
after the loading is done. The flag success indicates the
loading's status.
Defaults to doing nothing.
This is up to the implementer of this method to make sure this
does not break.
Args:
success (bool): the success status of the loading
"""
pass
def visit_status(self) -> str:
"""Detailed visit status.
Defaults to logging a full visit.
"""
return "full"
def pre_cleanup(self) -> None:
"""As a first step, will try and check for dangling data to cleanup.
This should do its best to avoid raising issues.
"""
pass
def load(self) -> Dict[str, str]:
r"""Loading logic for the loader to follow:
- 1. Call :meth:`prepare_origin_visit` to prepare the
origin and visit we will associate loading data to
- 2. Store the actual ``origin_visit`` to storage
- 3. Call :meth:`prepare` to prepare any eventual state
- 4. Call :meth:`get_origin` to get the origin we work with and store
- while True:
- 5. Call :meth:`fetch_data` to fetch the data to store
- 6. Call :meth:`store_data` to store the data
- 7. Call :meth:`cleanup` to clean up any eventual state put in place
in :meth:`prepare` method.
"""
try:
self.pre_cleanup()
except Exception:
msg = "Cleaning up dangling data failed! Continue loading."
self.log.warning(msg)
self.prepare_origin_visit()
self._store_origin_visit()
assert (
self.origin
), "The method `prepare_origin_visit` call should set the origin (Origin)"
assert (
self.visit.visit
), "The method `_store_origin_visit` should set the visit (OriginVisit)"
self.log.info(
"Load origin '%s' with type '%s'", self.origin.url, self.visit.type
)
try:
self.prepare()
while True:
more_data_to_fetch = self.fetch_data()
self.store_data()
if not more_data_to_fetch:
break
self.store_metadata()
visit_status = OriginVisitStatus(
origin=self.origin.url,
visit=self.visit.visit,
type=self.visit_type,
date=now(),
status=self.visit_status(),
snapshot=self.loaded_snapshot_id,
)
self.storage.origin_visit_status_add([visit_status])
self.post_load()
except Exception as e:
if isinstance(e, NotFound):
status = "not_found"
task_status = "uneventful"
else:
status = "partial" if self.loaded_snapshot_id else "failed"
task_status = "failed"
self.log.exception(
"Loading failure, updating to `%s` status",
status,
extra={
"swh_task_args": [],
"swh_task_kwargs": {"origin": self.origin.url},
},
)
visit_status = OriginVisitStatus(
origin=self.origin.url,
visit=self.visit.visit,
type=self.visit_type,
date=now(),
status=status,
snapshot=self.loaded_snapshot_id,
)
self.storage.origin_visit_status_add([visit_status])
self.post_load(success=False)
return {"status": task_status}
finally:
self.flush()
self.cleanup()
return self.load_status()
class DVCSLoader(BaseLoader):
"""This base class is a pattern for dvcs loaders (e.g. git, mercurial).
Those loaders are able to load all the data in one go. For example, the
:class:`BulkUpdater` loader defined in swh-loader-git.
Other, stateful, loaders (e.g. :class:`SWHSvnLoader`) should inherit
directly from :class:`BaseLoader`.
"""
def cleanup(self) -> None:
"""Clean up an eventual state installed for computations."""
pass
def has_contents(self) -> bool:
"""Checks whether we need to load contents"""
return True
def get_contents(self) -> Iterable[BaseContent]:
"""Get the contents that need to be loaded"""
raise NotImplementedError
def has_directories(self) -> bool:
"""Checks whether we need to load directories"""
return True
def get_directories(self) -> Iterable[Directory]:
"""Get the directories that need to be loaded"""
raise NotImplementedError
def has_revisions(self) -> bool:
"""Checks whether we need to load revisions"""
return True
def get_revisions(self) -> Iterable[Revision]:
"""Get the revisions that need to be loaded"""
raise NotImplementedError
def has_releases(self) -> bool:
"""Checks whether we need to load releases"""
return True
def get_releases(self) -> Iterable[Release]:
"""Get the releases that need to be loaded"""
raise NotImplementedError
def get_snapshot(self) -> Snapshot:
"""Get the snapshot that needs to be loaded"""
raise NotImplementedError
def eventful(self) -> bool:
"""Whether the load was eventful"""
raise NotImplementedError
def store_data(self) -> None:
assert self.origin
if self.save_data_path:
self.save_data()
if self.has_contents():
for obj in self.get_contents():
if isinstance(obj, Content):
self.storage.content_add([obj])
elif isinstance(obj, SkippedContent):
self.storage.skipped_content_add([obj])
else:
raise TypeError(f"Unexpected content type: {obj}")
if self.has_directories():
for directory in self.get_directories():
self.storage.directory_add([directory])
if self.has_revisions():
for revision in self.get_revisions():
self.storage.revision_add([revision])
if self.has_releases():
for release in self.get_releases():
self.storage.release_add([release])
snapshot = self.get_snapshot()
self.storage.snapshot_add([snapshot])
self.flush()
self.loaded_snapshot_id = snapshot.id
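The docstrings above also spell out the contract a concrete loader must honour: prepare_origin_visit() sets self.origin (and optionally self.visit_date), load() then loops over fetch_data()/store_data() until fetch_data() returns False, records an OriginVisitStatus and finally calls flush() and cleanup(). A condensed sketch of a do-nothing subclass, not upstream code, follows; the ExampleLoader name, the "example" visit type, the origin URL and the in-memory storage are illustrative assumptions only.
import datetime

from swh.loader.core.loader import BaseLoader
from swh.model.model import Origin


class ExampleLoader(BaseLoader):
    visit_type = "example"  # hypothetical visit type

    def prepare_origin_visit(self) -> None:
        # Step 1 of load(): declare which origin this visit is about.
        self.origin = Origin(url="https://example.org/some/origin")
        self.visit_date = datetime.datetime.now(tz=datetime.timezone.utc)

    def prepare(self) -> None:
        # Step 3: open connections, resolve the state to ingest, ...
        pass

    def fetch_data(self) -> bool:
        # Step 5: return True while more data remains to be fetched.
        return False

    def store_data(self) -> None:
        # Step 6: would call self.storage.content_add()/directory_add()/...
        pass

    def cleanup(self) -> None:
        # Step 7: tear down whatever prepare() set up.
        pass


# from_config() instantiates the storage from the dict and passes any other
# keyword arguments to the loader constructor (see also from_configfile(),
# which reads the same dict from the file named by SWH_CONFIG_FILENAME).
loader = ExampleLoader.from_config(storage={"cls": "memory"})
print(loader.load())  # {"status": "eventful"} on success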
diff --git a/swh/loader/core/tests/test_converters.py b/swh/loader/core/tests/test_converters.py
index 8ef0a96..d9a76a6 100644
--- a/swh/loader/core/tests/test_converters.py
+++ b/swh/loader/core/tests/test_converters.py
@@ -1,111 +1,111 @@
# Copyright (C) 2015-2020 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import os
import tempfile
from swh.loader.core import converters
from swh.model import from_disk
from swh.model.model import Content, SkippedContent
def tmpfile_with_content(fromdir, contentfile):
- """Create a temporary file with content contentfile in directory fromdir.
-
- """
+ """Create a temporary file with content contentfile in directory fromdir."""
tmpfilepath = tempfile.mktemp(
suffix=".swh", prefix="tmp-file-for-test", dir=str(fromdir)
)
with open(tmpfilepath, "wb") as f:
f.write(contentfile)
return tmpfilepath
def test_content_for_storage_path(tmpdir):
# given
data = b"temp file for testing content storage conversion"
tmpfile = tmpfile_with_content(tmpdir, data)
obj = from_disk.Content.from_file(path=os.fsdecode(tmpfile)).get_data()
expected_content = obj.copy()
expected_content["data"] = data
expected_content["status"] = "visible"
del expected_content["path"]
del expected_content["perms"]
expected_content = Content.from_dict(expected_content)
# when
content = converters.content_for_storage(obj)
# then
assert content == expected_content
def test_content_for_storage_data(tmpdir):
# given
data = b"temp file for testing content storage conversion"
obj = from_disk.Content.from_bytes(data=data, mode=0o100644).get_data()
del obj["perms"]
expected_content = obj.copy()
expected_content["status"] = "visible"
expected_content = Content.from_dict(expected_content)
# when
content = converters.content_for_storage(obj)
# then
assert content == expected_content
def test_content_for_storage_too_long(tmpdir):
# given
data = b"temp file for testing content storage conversion"
obj = from_disk.Content.from_bytes(data=data, mode=0o100644).get_data()
del obj["perms"]
expected_content = obj.copy()
expected_content.pop("data")
expected_content["status"] = "absent"
expected_content["origin"] = "http://example.org/"
expected_content["reason"] = "Content too large"
expected_content = SkippedContent.from_dict(expected_content)
# when
content = converters.content_for_storage(
- obj, max_content_size=len(data) - 1, origin_url=expected_content.origin,
+ obj,
+ max_content_size=len(data) - 1,
+ origin_url=expected_content.origin,
)
# then
assert content == expected_content
def test_prepare_contents(tmpdir):
contents = []
data_fine = b"tmp file fine"
max_size = len(data_fine)
for data in [b"tmp file with too much data", data_fine]:
obj = from_disk.Content.from_bytes(data=data, mode=0o100644).get_data()
del obj["perms"]
contents.append(obj)
actual_contents, actual_skipped_contents = converters.prepare_contents(
contents, max_content_size=max_size, origin_url="some-origin"
)
assert len(actual_contents) == 1
assert len(actual_skipped_contents) == 1
actual_content = actual_contents[0]
assert "reason" not in actual_content
assert actual_content["status"] == "visible"
actual_skipped_content = actual_skipped_contents[0]
assert actual_skipped_content["reason"] == "Content too large"
assert actual_skipped_content["status"] == "absent"
assert actual_skipped_content["origin"] == "some-origin"
diff --git a/swh/loader/core/tests/test_loader.py b/swh/loader/core/tests/test_loader.py
index 2ac2636..3a195c4 100644
--- a/swh/loader/core/tests/test_loader.py
+++ b/swh/loader/core/tests/test_loader.py
@@ -1,237 +1,236 @@
# Copyright (C) 2018-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import hashlib
import logging
from swh.loader.core.loader import BaseLoader, DVCSLoader
from swh.loader.exception import NotFound
from swh.loader.tests import assert_last_visit_matches
from swh.model.hashutil import hash_to_bytes
from swh.model.model import Origin, OriginVisit, Snapshot
ORIGIN = Origin(url="some-url")
class DummyLoader:
"""Base Loader to overload and simplify the base class (technical: to avoid repetition
- in other *Loader classes)"""
+ in other *Loader classes)"""
def cleanup(self):
pass
def prepare(self, *args, **kwargs):
pass
def fetch_data(self):
pass
def get_snapshot_id(self):
return None
def prepare_origin_visit(self, *args, **kwargs):
self.origin = ORIGIN
self.origin_url = ORIGIN.url
self.visit_date = datetime.datetime.now(tz=datetime.timezone.utc)
self.visit_type = "git"
self.storage.origin_add([ORIGIN])
visit = OriginVisit(
- origin=self.origin_url, date=self.visit_date, type=self.visit_type,
+ origin=self.origin_url,
+ date=self.visit_date,
+ type=self.visit_type,
)
self.visit = self.storage.origin_visit_add([visit])[0]
class DummyDVCSLoader(DummyLoader, DVCSLoader):
- """DVCS Loader that does nothing in regards to DAG objects.
-
- """
+ """DVCS Loader that does nothing in regards to DAG objects."""
def get_contents(self):
return []
def get_directories(self):
return []
def get_revisions(self):
return []
def get_releases(self):
return []
def get_snapshot(self):
return Snapshot(branches={})
def eventful(self):
return False
class DummyBaseLoader(DummyLoader, BaseLoader):
- """Buffered loader will send new data when threshold is reached
-
- """
+ """Buffered loader will send new data when threshold is reached"""
def store_data(self):
pass
def test_base_loader(swh_storage):
loader = DummyBaseLoader(swh_storage)
result = loader.load()
assert result == {"status": "eventful"}
def test_base_loader_with_config(swh_storage):
loader = DummyBaseLoader(swh_storage, "logger-name")
result = loader.load()
assert result == {"status": "eventful"}
def test_dvcs_loader(swh_storage):
loader = DummyDVCSLoader(swh_storage)
result = loader.load()
assert result == {"status": "eventful"}
def test_dvcs_loader_with_config(swh_storage):
loader = DummyDVCSLoader(swh_storage, "another-logger")
result = loader.load()
assert result == {"status": "eventful"}
def test_loader_logger_default_name(swh_storage):
loader = DummyBaseLoader(swh_storage)
assert isinstance(loader.log, logging.Logger)
assert loader.log.name == "swh.loader.core.tests.test_loader.DummyBaseLoader"
loader = DummyDVCSLoader(swh_storage)
assert isinstance(loader.log, logging.Logger)
assert loader.log.name == "swh.loader.core.tests.test_loader.DummyDVCSLoader"
def test_loader_logger_with_name(swh_storage):
loader = DummyBaseLoader(swh_storage, "some.logger.name")
assert isinstance(loader.log, logging.Logger)
assert loader.log.name == "some.logger.name"
def test_loader_save_data_path(swh_storage, tmp_path):
loader = DummyBaseLoader(swh_storage, "some.logger.name.1", save_data_path=tmp_path)
url = "http://bitbucket.org/something"
loader.origin = Origin(url=url)
loader.visit_date = datetime.datetime(year=2019, month=10, day=1)
hash_url = hashlib.sha1(url.encode("utf-8")).hexdigest()
expected_save_path = "%s/sha1:%s/%s/2019" % (str(tmp_path), hash_url[0:2], hash_url)
save_path = loader.get_save_data_path()
assert save_path == expected_save_path
def _check_load_failure(caplog, loader, exc_class, exc_text, status="partial"):
"""Check whether a failed load properly logged its exception, and that the
snapshot didn't get referenced in storage"""
assert isinstance(loader, DVCSLoader) # was implicit so far
for record in caplog.records:
if record.levelname != "ERROR":
continue
assert "Loading failure" in record.message
assert record.exc_info
exc = record.exc_info[1]
assert isinstance(exc, exc_class)
assert exc_text in exc.args[0]
# Check that the get_snapshot operation would have succeeded
assert loader.get_snapshot() is not None
# And confirm that the visit doesn't reference a snapshot
visit = assert_last_visit_matches(loader.storage, ORIGIN.url, status)
if status != "partial":
assert visit.snapshot is None
# But that the snapshot didn't get loaded
assert loader.loaded_snapshot_id is None
class DummyDVCSLoaderExc(DummyDVCSLoader):
"""A loader which raises an exception when loading some contents"""
def get_contents(self):
raise RuntimeError("Failed to get contents!")
def test_dvcs_loader_exc_partial_visit(swh_storage, caplog):
logger_name = "dvcsloaderexc"
caplog.set_level(logging.ERROR, logger=logger_name)
loader = DummyDVCSLoaderExc(swh_storage, logging_class=logger_name)
# fake the loading ending up in a snapshot
loader.loaded_snapshot_id = hash_to_bytes(
"9e4dd2b40d1b46b70917c0949aa2195c823a648e"
)
result = loader.load()
# loading failed
assert result == {"status": "failed"}
# still resulted in a partial visit with a snapshot (somehow)
_check_load_failure(
- caplog, loader, RuntimeError, "Failed to get contents!",
+ caplog,
+ loader,
+ RuntimeError,
+ "Failed to get contents!",
)
class BrokenStorageProxy:
def __init__(self, storage):
self.storage = storage
def __getattr__(self, attr):
return getattr(self.storage, attr)
def snapshot_add(self, snapshots):
raise RuntimeError("Failed to add snapshot!")
class DummyDVCSLoaderStorageExc(DummyDVCSLoader):
"""A loader which raises an exception when loading some contents"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.storage = BrokenStorageProxy(self.storage)
def test_dvcs_loader_storage_exc_failed_visit(swh_storage, caplog):
logger_name = "dvcsloaderexc"
caplog.set_level(logging.ERROR, logger=logger_name)
loader = DummyDVCSLoaderStorageExc(swh_storage, logging_class=logger_name)
result = loader.load()
assert result == {"status": "failed"}
_check_load_failure(
caplog, loader, RuntimeError, "Failed to add snapshot!", status="failed"
)
class DummyDVCSLoaderNotFound(DummyDVCSLoader, BaseLoader):
- """A loader which raises a not_found exception during the prepare method call
-
- """
+ """A loader which raises a not_found exception during the prepare method call"""
def prepare(*args, **kwargs):
raise NotFound("Unknown origin!")
def load_status(self):
return {
"status": "uneventful",
}
def test_loader_not_found(swh_storage, caplog):
loader = DummyDVCSLoaderNotFound(swh_storage)
result = loader.load()
assert result == {"status": "uneventful"}
_check_load_failure(caplog, loader, NotFound, "Unknown origin!", status="not_found")
diff --git a/swh/loader/core/tests/test_utils.py b/swh/loader/core/tests/test_utils.py
index 628243b..28d6c21 100644
--- a/swh/loader/core/tests/test_utils.py
+++ b/swh/loader/core/tests/test_utils.py
@@ -1,171 +1,187 @@
# Copyright (C) 2019 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
from datetime import datetime
import os
import signal
from time import sleep
from unittest.mock import patch
import pytest
from swh.loader.core.utils import (
CloneFailure,
CloneTimeout,
clean_dangling_folders,
clone_with_timeout,
parse_visit_date,
)
def prepare_arborescence_from(tmpdir, folder_names):
"""Prepare arborescence tree with folders
Args:
tmpdir (Either[LocalPath, str]): Root temporary directory
folder_names (List[str]): List of folder names
Returns:
Tuple (root path, list of created folder paths)
"""
dangling_folders = []
for dname in folder_names:
d = str(tmpdir / dname)
os.mkdir(d)
dangling_folders.append(d)
return str(tmpdir), dangling_folders
def assert_dirs(actual_dirs, expected_dirs):
- """Assert that the directory actual and expected match
-
- """
+ """Assert that the directory actual and expected match"""
for d in actual_dirs:
assert d in expected_dirs
assert len(actual_dirs) == len(expected_dirs)
def test_clean_dangling_folders_0(tmpdir):
"""Folder does not exist, do nothing"""
r = clean_dangling_folders("/path/does/not/exist", "unused-pattern")
assert r is None
@patch("swh.loader.core.utils.psutil.pid_exists", return_value=False)
def test_clean_dangling_folders_1(mock_pid_exists, tmpdir):
- """Folder which matches pattern with dead pid are cleaned up
-
- """
+ """Folder which matches pattern with dead pid are cleaned up"""
rootpath, dangling = prepare_arborescence_from(
- tmpdir, ["something", "swh.loader.svn-4321.noisynoise",]
+ tmpdir,
+ [
+ "something",
+ "swh.loader.svn-4321.noisynoise",
+ ],
)
clean_dangling_folders(rootpath, "swh.loader.svn")
actual_dirs = os.listdir(rootpath)
mock_pid_exists.assert_called_once_with(4321)
assert_dirs(actual_dirs, ["something"])
@patch("swh.loader.core.utils.psutil.pid_exists", return_value=True)
def test_clean_dangling_folders_2(mock_pid_exists, tmpdir):
- """Folder which matches pattern with live pid are skipped
-
- """
+ """Folder which matches pattern with live pid are skipped"""
rootpath, dangling = prepare_arborescence_from(
- tmpdir, ["something", "swh.loader.hg-1234.noisynoise",]
+ tmpdir,
+ [
+ "something",
+ "swh.loader.hg-1234.noisynoise",
+ ],
)
clean_dangling_folders(rootpath, "swh.loader.hg")
actual_dirs = os.listdir(rootpath)
mock_pid_exists.assert_called_once_with(1234)
- assert_dirs(actual_dirs, ["something", "swh.loader.hg-1234.noisynoise",])
+ assert_dirs(
+ actual_dirs,
+ [
+ "something",
+ "swh.loader.hg-1234.noisynoise",
+ ],
+ )
@patch("swh.loader.core.utils.psutil.pid_exists", return_value=False)
@patch(
"swh.loader.core.utils.shutil.rmtree",
side_effect=ValueError("Could not remove for reasons"),
)
def test_clean_dangling_folders_3(mock_rmtree, mock_pid_exists, tmpdir):
- """Error in trying to clean dangling folders are skipped
-
- """
+ """Error in trying to clean dangling folders are skipped"""
path1 = "thingy"
path2 = "swh.loader.git-1468.noisy"
- rootpath, dangling = prepare_arborescence_from(tmpdir, [path1, path2,])
+ rootpath, dangling = prepare_arborescence_from(
+ tmpdir,
+ [
+ path1,
+ path2,
+ ],
+ )
clean_dangling_folders(rootpath, "swh.loader.git")
actual_dirs = os.listdir(rootpath)
mock_pid_exists.assert_called_once_with(1468)
mock_rmtree.assert_called_once_with(os.path.join(rootpath, path2))
assert_dirs(actual_dirs, [path2, path1])
def test_clone_with_timeout_no_error_no_timeout():
def succeed():
"""This does nothing to simulate a successful clone"""
clone_with_timeout("foo", "bar", succeed, timeout=0.5)
def test_clone_with_timeout_no_error_timeout():
def slow():
"""This lasts for more than the timeout"""
sleep(1)
with pytest.raises(CloneTimeout):
clone_with_timeout("foo", "bar", slow, timeout=0.5)
def test_clone_with_timeout_error():
def raise_something():
raise RuntimeError("panic!")
with pytest.raises(CloneFailure):
clone_with_timeout("foo", "bar", raise_something, timeout=0.5)
def test_clone_with_timeout_sigkill():
"""This also tests that the traceback is useful"""
src = "https://www.mercurial-scm.org/repo/hello"
dest = "/dev/null"
timeout = 0.5
sleepy_time = 100 * timeout
assert sleepy_time > timeout
def ignores_sigterm(*args, **kwargs):
# ignore SIGTERM to force sigkill
signal.signal(signal.SIGTERM, lambda signum, frame: None)
sleep(sleepy_time) # we make sure we exceed the timeout
with pytest.raises(CloneTimeout) as e:
clone_with_timeout(src, dest, ignores_sigterm, timeout)
killed = True
assert e.value.args == (src, timeout, killed)
VISIT_DATE_STR = "2021-02-17 15:50:04.518963"
VISIT_DATE = datetime(2021, 2, 17, 15, 50, 4, 518963)
@pytest.mark.parametrize(
"input_visit_date,expected_date",
- [(None, None), (VISIT_DATE, VISIT_DATE), (VISIT_DATE_STR, VISIT_DATE),],
+ [
+ (None, None),
+ (VISIT_DATE, VISIT_DATE),
+ (VISIT_DATE_STR, VISIT_DATE),
+ ],
)
def test_utils_parse_visit_date(input_visit_date, expected_date):
assert parse_visit_date(input_visit_date) == expected_date
def test_utils_parse_visit_date_now():
actual_date = parse_visit_date("now")
assert isinstance(actual_date, datetime)
def test_utils_parse_visit_date_fails():
with pytest.raises(ValueError, match="invalid"):
parse_visit_date(10) # not a string nor a date
diff --git a/swh/loader/core/utils.py b/swh/loader/core/utils.py
index 84be8ff..0e9b388 100644
--- a/swh/loader/core/utils.py
+++ b/swh/loader/core/utils.py
@@ -1,127 +1,127 @@
# Copyright (C) 2018-2022 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
from datetime import datetime, timezone
import io
import os
import shutil
import signal
import time
import traceback
from typing import Callable, Optional, Union
from billiard import Process, Queue # type: ignore
from dateutil.parser import parse
import psutil
def clean_dangling_folders(dirpath: str, pattern_check: str, log=None) -> None:
"""Clean up potential dangling temporary working folder rooted at `dirpath`. Those
folders must match a dedicated pattern and not belonging to a live pid.
Args:
dirpath: Path to check for dangling files
pattern_check: A dedicated pattern to check on first-level directories (e.g.
`swh.loader.mercurial.`, `swh.loader.svn.`)
log (Logger): Optional logger
"""
if not os.path.exists(dirpath):
return
for filename in os.listdir(dirpath):
path_to_cleanup = os.path.join(dirpath, filename)
try:
# pattern: `swh.loader.{loader-type}-pid.{noise}`
if (
pattern_check not in filename or "-" not in filename
): # silently ignore unknown patterns
continue
_, pid_ = filename.split("-")
pid = int(pid_.split(".")[0])
if psutil.pid_exists(pid):
if log:
log.debug("PID %s is live, skipping", pid)
continue
# could be removed concurrently, so check before removal
if os.path.exists(path_to_cleanup):
shutil.rmtree(path_to_cleanup)
except Exception as e:
if log:
log.warn("Fail to clean dangling path %s: %s", path_to_cleanup, e)
class CloneTimeout(Exception):
pass
class CloneFailure(Exception):
pass
def _clone_task(clone_func: Callable[[], None], errors: Queue) -> None:
try:
clone_func()
except Exception as e:
exc_buffer = io.StringIO()
traceback.print_exc(file=exc_buffer)
errors.put_nowait(exc_buffer.getvalue())
raise e
def clone_with_timeout(
src: str, dest: str, clone_func: Callable[[], None], timeout: float
) -> None:
"""Clone a repository with timeout.
Args:
src: clone source
dest: clone destination
clone_func: callable that does the actual cloning
timeout: timeout in seconds
"""
errors: Queue = Queue()
process = Process(target=_clone_task, args=(clone_func, errors))
process.start()
process.join(timeout)
if process.is_alive():
process.terminate()
# Give it literally a second (in successive steps of 0.1 second),
# then kill it.
# Can't use `process.join(1)` here, billiard appears to be bugged
# https://github.com/celery/billiard/issues/270
killed = False
for _ in range(10):
time.sleep(0.1)
if not process.is_alive():
break
else:
killed = True
os.kill(process.pid, signal.SIGKILL)
raise CloneTimeout(src, timeout, killed)
if not errors.empty():
raise CloneFailure(src, dest, errors.get())
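# Illustrative sketch, not part of this module: bounding an actual clone call
# with clone_with_timeout. The git subprocess helper and the 600s timeout are
# assumptions for the example; any zero-argument callable can be passed.
def _example_bounded_clone(src: str, dest: str, timeout: float = 600) -> None:
    import subprocess

    def run_git_clone() -> None:
        subprocess.run(["git", "clone", src, dest], check=True)

    try:
        clone_with_timeout(src, dest, run_git_clone, timeout)
    except CloneTimeout:
        # exception args are (src, timeout, killed)
        raise
    except CloneFailure:
        # exception args are (src, dest, traceback of the failed clone)
        raise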
def parse_visit_date(visit_date: Optional[Union[datetime, str]]) -> Optional[datetime]:
"""Convert visit date from either None, a string or a datetime to either None or
- datetime.
+ datetime.
"""
if visit_date is None:
return None
if isinstance(visit_date, datetime):
return visit_date
if visit_date == "now":
return datetime.now(tz=timezone.utc)
if isinstance(visit_date, str):
return parse(visit_date)
raise ValueError(f"invalid visit date {visit_date!r}")
diff --git a/swh/loader/exception.py b/swh/loader/exception.py
index fd28021..6a77fc9 100644
--- a/swh/loader/exception.py
+++ b/swh/loader/exception.py
@@ -1,13 +1,13 @@
# Copyright (C) 2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
class NotFound(ValueError):
"""An exception raised when some information to retrieve is not found (e.g origin,
- artifact, ...)
+ artifact, ...)
"""
pass
diff --git a/swh/loader/package/archive/loader.py b/swh/loader/package/archive/loader.py
index 7e80cae..0c2d888 100644
--- a/swh/loader/package/archive/loader.py
+++ b/swh/loader/package/archive/loader.py
@@ -1,166 +1,164 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import hashlib
import logging
from os import path
import string
from typing import Any, Dict, Iterator, Mapping, Optional, Sequence, Tuple, Union
import attr
import iso8601
from swh.loader.package.loader import BasePackageInfo, PackageLoader, PartialExtID
from swh.loader.package.utils import EMPTY_AUTHOR, release_name
from swh.model.model import ObjectType, Release, Sha1Git, TimestampWithTimezone
from swh.storage.interface import StorageInterface
logger = logging.getLogger(__name__)
@attr.s
class ArchivePackageInfo(BasePackageInfo):
raw_info = attr.ib(type=Dict[str, Any])
length = attr.ib(type=int)
"""Size of the archive file"""
time = attr.ib(type=Union[str, datetime.datetime])
"""Timestamp of the archive file on the server"""
# default format for gnu
MANIFEST_FORMAT = string.Template("$time $length $version $url")
def extid(self, manifest_format: Optional[string.Template] = None) -> PartialExtID:
"""Returns a unique intrinsic identifier of this package info
``manifest_format`` allows overriding the class' default MANIFEST_FORMAT"""
manifest_format = manifest_format or self.MANIFEST_FORMAT
# TODO: use parsed attributes instead of self.raw_info
manifest = manifest_format.substitute(
{k: str(v) for (k, v) in self.raw_info.items()}
)
return (
self.EXTID_TYPE,
self.EXTID_VERSION,
hashlib.sha256(manifest.encode()).digest(),
)
@classmethod
def from_metadata(cls, a_metadata: Dict[str, Any]) -> "ArchivePackageInfo":
url = a_metadata["url"]
filename = a_metadata.get("filename")
return cls(
url=url,
filename=filename if filename else path.split(url)[-1],
raw_info=a_metadata,
length=a_metadata["length"],
time=a_metadata["time"],
version=a_metadata["version"],
)
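# Illustrative sketch, not part of this module: how the default MANIFEST_FORMAT
# above turns artifact metadata into a stable extid. The metadata dict is a
# made-up example; the sha256 of the substituted manifest is the extid payload.
def _example_archive_extid() -> PartialExtID:
    a_metadata = {
        "time": 944729610,
        "url": "https://ftp.example.org/pkg-0.1.0.tar.gz",
        "length": 221837,
        "version": "0.1.0",
    }
    p_info = ArchivePackageInfo.from_metadata(a_metadata)
    # the manifest "944729610 221837 0.1.0 <url>" is sha256-hashed into the extid
    return p_info.extid()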
class ArchiveLoader(PackageLoader[ArchivePackageInfo]):
- """Load archive origin's artifact files into swh archive
-
- """
+ """Load archive origin's artifact files into swh archive"""
visit_type = "tar"
def __init__(
self,
storage: StorageInterface,
url: str,
artifacts: Sequence[Dict[str, Any]],
extid_manifest_format: Optional[str] = None,
max_content_size: Optional[int] = None,
snapshot_append: bool = False,
):
f"""Loader constructor.
For now, this is the lister's task output.
Args:
url: Origin url
artifacts: List of artifact information with keys:
- **time**: last modification time as either isoformat date
string or timestamp
- **url**: the artifact url to retrieve
- **filename**: optionally, the file's name
- **version**: artifact's version
- **length**: artifact's length
extid_manifest_format: template string used to format a manifest,
which is hashed to get the extid of a package.
Defaults to {ArchivePackageInfo.MANIFEST_FORMAT!r}
snapshot_append: if :const:`True`, append latest snapshot content to
the new snapshot created by the loader
"""
super().__init__(storage=storage, url=url, max_content_size=max_content_size)
self.artifacts = artifacts # assume order is enforced in the lister
self.extid_manifest_format = (
None
if extid_manifest_format is None
else string.Template(extid_manifest_format)
)
self.snapshot_append = snapshot_append
def get_versions(self) -> Sequence[str]:
versions = []
for archive in self.artifacts:
v = archive.get("version")
if v:
versions.append(v)
return versions
def get_default_version(self) -> str:
# It's the most recent, so for this loader, it's the last one
return self.artifacts[-1]["version"]
def get_package_info(
self, version: str
) -> Iterator[Tuple[str, ArchivePackageInfo]]:
for a_metadata in self.artifacts:
p_info = ArchivePackageInfo.from_metadata(a_metadata)
if version == p_info.version:
# FIXME: this code assumes we have only 1 artifact per
# versioned package
yield release_name(version), p_info
def new_packageinfo_to_extid(
self, p_info: ArchivePackageInfo
) -> Optional[PartialExtID]:
return p_info.extid(manifest_format=self.extid_manifest_format)
def build_release(
self, p_info: ArchivePackageInfo, uncompressed_path: str, directory: Sha1Git
) -> Optional[Release]:
time = p_info.time # assume it's a timestamp
if isinstance(time, str): # otherwise, assume it's a parsable date
parsed_time = iso8601.parse_date(time)
else:
parsed_time = time
normalized_time = TimestampWithTimezone.from_datetime(parsed_time)
msg = f"Synthetic release for archive at {p_info.url}\n"
return Release(
name=p_info.version.encode(),
message=msg.encode(),
date=normalized_time,
author=EMPTY_AUTHOR,
target=directory,
target_type=ObjectType.DIRECTORY,
synthetic=True,
)
def extra_branches(self) -> Dict[bytes, Mapping[str, Any]]:
if not self.snapshot_append:
return {}
last_snapshot = self.last_snapshot()
return last_snapshot.to_dict()["branches"] if last_snapshot else {}
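# Illustrative sketch, not part of this module: instantiating the loader with
# the artifact keys documented in the constructor. The storage object, origin
# url and artifact values are placeholders for the example.
def _example_archive_loader(storage: StorageInterface) -> Dict[str, Any]:
    loader = ArchiveLoader(
        storage,
        url="https://ftp.example.org/pub/mypackage/",
        artifacts=[
            {
                "time": 944729610,
                "url": "https://ftp.example.org/mypackage-0.1.0.tar.gz",
                "length": 221837,
                "filename": "mypackage-0.1.0.tar.gz",
                "version": "0.1.0",
            }
        ],
        snapshot_append=True,  # keep branches from the previous snapshot, if any
    )
    return loader.load()  # {"status": ..., "snapshot_id": ...}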
diff --git a/swh/loader/package/archive/tests/test_archive.py b/swh/loader/package/archive/tests/test_archive.py
index 4529f90..a590c1d 100644
--- a/swh/loader/package/archive/tests/test_archive.py
+++ b/swh/loader/package/archive/tests/test_archive.py
@@ -1,489 +1,488 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import hashlib
from io import BytesIO
from pathlib import Path
import string
import attr
import pytest
from requests.exceptions import ContentDecodingError
from swh.loader.package.archive.loader import ArchiveLoader, ArchivePackageInfo
from swh.loader.tests import assert_last_visit_matches, check_snapshot, get_stats
from swh.model.hashutil import hash_to_bytes, hash_to_hex
from swh.model.model import (
ObjectType,
Person,
Release,
Snapshot,
SnapshotBranch,
TargetType,
TimestampWithTimezone,
)
URL = "https://ftp.gnu.org/gnu/8sync/"
GNU_ARTIFACTS = [
{
"time": 944729610,
"url": "https://ftp.gnu.org/gnu/8sync/8sync-0.1.0.tar.gz",
"length": 221837,
"filename": "8sync-0.1.0.tar.gz",
"version": "0.1.0",
},
{
"time": 1480991830,
"url": "https://ftp.gnu.org/gnu/8sync/8sync-0.2.0.tar.gz",
"length": 238466,
"filename": "8sync-0.2.0.tar.gz",
"version": "0.2.0",
},
]
_expected_new_contents_first_visit = [
"e9258d81faf5881a2f96a77ba609396f82cb97ad",
"1170cf105b04b7e2822a0e09d2acf71da7b9a130",
"fbd27c3f41f2668624ffc80b7ba5db9b92ff27ac",
"0057bec9b5422aff9256af240b177ac0e3ac2608",
"2b8d0d0b43a1078fc708930c8ddc2956a86c566e",
"27de3b3bc6545d2a797aeeb4657c0e215a0c2e55",
"2e6db43f5cd764e677f416ff0d0c78c7a82ef19b",
"ae9be03bd2a06ed8f4f118d3fe76330bb1d77f62",
"edeb33282b2bffa0e608e9d2fd960fd08093c0ea",
"d64e64d4c73679323f8d4cde2643331ba6c20af9",
"7a756602914be889c0a2d3952c710144b3e64cb0",
"84fb589b554fcb7f32b806951dcf19518d67b08f",
"8624bcdae55baeef00cd11d5dfcfa60f68710a02",
"e08441aeab02704cfbd435d6445f7c072f8f524e",
"f67935bc3a83a67259cda4b2d43373bd56703844",
"809788434b433eb2e3cfabd5d591c9a659d5e3d8",
"7d7c6c8c5ebaeff879f61f37083a3854184f6c41",
"b99fec102eb24bffd53ab61fc30d59e810f116a2",
"7d149b28eaa228b3871c91f0d5a95a2fa7cb0c68",
"f0c97052e567948adf03e641301e9983c478ccff",
"7fb724242e2b62b85ca64190c31dcae5303e19b3",
"4f9709e64a9134fe8aefb36fd827b84d8b617ab5",
"7350628ccf194c2c3afba4ac588c33e3f3ac778d",
"0bb892d9391aa706dc2c3b1906567df43cbe06a2",
"49d4c0ce1a16601f1e265d446b6c5ea6b512f27c",
"6b5cc594ac466351450f7f64a0b79fdaf4435ad3",
"3046e5d1f70297e2a507b98224b6222c9688d610",
"1572607d456d7f633bc6065a2b3048496d679a31",
]
_expected_new_directories_first_visit = [
"daabc65ec75d487b1335ffc101c0ac11c803f8fc",
"263be23b4a8101d3ad0d9831319a3e0f2b065f36",
"7f6e63ba6eb3e2236f65892cd822041f1a01dd5c",
"4db0a3ecbc976083e2dac01a62f93729698429a3",
"dfef1c80e1098dd5deda664bb44a9ab1f738af13",
"eca971d346ea54d95a6e19d5051f900237fafdaa",
"3aebc29ed1fccc4a6f2f2010fb8e57882406b528",
]
_expected_new_releases_first_visit = {
"c92b2ad9e70ef1dce455e8fe1d8e41b92512cc08": (
"3aebc29ed1fccc4a6f2f2010fb8e57882406b528"
)
}
def test_archive_visit_with_no_artifact_found(swh_storage, requests_mock_datadir):
url = URL
unknown_artifact_url = "https://ftp.g.o/unknown/8sync-0.1.0.tar.gz"
loader = ArchiveLoader(
swh_storage,
url,
artifacts=[
{
"time": 944729610,
"url": unknown_artifact_url, # unknown artifact
"length": 221837,
"filename": "8sync-0.1.0.tar.gz",
"version": "0.1.0",
}
],
)
actual_load_status = loader.load()
assert actual_load_status["status"] == "uneventful"
assert actual_load_status["snapshot_id"] is not None
stats = get_stats(swh_storage)
assert {
"content": 0,
"directory": 0,
"origin": 1,
"origin_visit": 1,
"release": 0,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
assert_last_visit_matches(swh_storage, url, status="partial", type="tar")
def test_archive_visit_with_release_artifact_no_prior_visit(
swh_storage, requests_mock_datadir
):
- """With no prior visit, load a gnu project ends up with 1 snapshot
-
- """
+ """With no prior visit, load a gnu project ends up with 1 snapshot"""
loader = ArchiveLoader(swh_storage, URL, artifacts=GNU_ARTIFACTS[:1])
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
expected_snapshot_first_visit_id = hash_to_bytes(
"9efecc835e8f99254934f256b5301b94f348fd17"
)
assert actual_load_status["snapshot_id"] == hash_to_hex(
expected_snapshot_first_visit_id
)
assert_last_visit_matches(swh_storage, URL, status="full", type="tar")
stats = get_stats(swh_storage)
assert {
"content": len(_expected_new_contents_first_visit),
"directory": len(_expected_new_directories_first_visit),
"origin": 1,
"origin_visit": 1,
"release": len(_expected_new_releases_first_visit),
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
release_id = hash_to_bytes(list(_expected_new_releases_first_visit)[0])
expected_snapshot = Snapshot(
id=expected_snapshot_first_visit_id,
branches={
b"HEAD": SnapshotBranch(
- target_type=TargetType.ALIAS, target=b"releases/0.1.0",
+ target_type=TargetType.ALIAS,
+ target=b"releases/0.1.0",
),
b"releases/0.1.0": SnapshotBranch(
- target_type=TargetType.RELEASE, target=release_id,
+ target_type=TargetType.RELEASE,
+ target=release_id,
),
},
)
check_snapshot(expected_snapshot, swh_storage)
assert swh_storage.release_get([release_id])[0] == Release(
id=release_id,
name=b"0.1.0",
message=(
b"Synthetic release for archive at "
b"https://ftp.gnu.org/gnu/8sync/8sync-0.1.0.tar.gz\n"
),
target=hash_to_bytes("3aebc29ed1fccc4a6f2f2010fb8e57882406b528"),
target_type=ObjectType.DIRECTORY,
synthetic=True,
author=Person.from_fullname(b""),
date=TimestampWithTimezone.from_datetime(
datetime.datetime(1999, 12, 9, 8, 53, 30, tzinfo=datetime.timezone.utc)
),
)
expected_contents = map(hash_to_bytes, _expected_new_contents_first_visit)
assert list(swh_storage.content_missing_per_sha1(expected_contents)) == []
expected_dirs = map(hash_to_bytes, _expected_new_directories_first_visit)
assert list(swh_storage.directory_missing(expected_dirs)) == []
expected_rels = map(hash_to_bytes, _expected_new_releases_first_visit)
assert list(swh_storage.release_missing(expected_rels)) == []
def test_archive_2_visits_without_change(swh_storage, requests_mock_datadir):
- """With no prior visit, load a gnu project ends up with 1 snapshot
-
- """
+ """With no prior visit, load a gnu project ends up with 1 snapshot"""
url = URL
loader = ArchiveLoader(swh_storage, url, artifacts=GNU_ARTIFACTS[:1])
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(swh_storage, url, status="full", type="tar")
actual_load_status2 = loader.load()
assert actual_load_status2["status"] == "uneventful"
assert actual_load_status2["snapshot_id"] is not None
assert actual_load_status["snapshot_id"] == actual_load_status2["snapshot_id"]
assert_last_visit_matches(swh_storage, url, status="full", type="tar")
urls = [
m.url
for m in requests_mock_datadir.request_history
if m.url.startswith("https://ftp.gnu.org")
]
assert len(urls) == 1
def test_archive_2_visits_with_new_artifact(swh_storage, requests_mock_datadir):
- """With no prior visit, load a gnu project ends up with 1 snapshot
-
- """
+ """With no prior visit, load a gnu project ends up with 1 snapshot"""
url = URL
artifact1 = GNU_ARTIFACTS[0]
loader = ArchiveLoader(swh_storage, url, [artifact1])
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(swh_storage, url, status="full", type="tar")
stats = get_stats(swh_storage)
assert {
"content": len(_expected_new_contents_first_visit),
"directory": len(_expected_new_directories_first_visit),
"origin": 1,
"origin_visit": 1,
"release": len(_expected_new_releases_first_visit),
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
urls = [
m.url
for m in requests_mock_datadir.request_history
if m.url.startswith("https://ftp.gnu.org")
]
assert len(urls) == 1
artifact2 = GNU_ARTIFACTS[1]
loader2 = ArchiveLoader(swh_storage, url, [artifact1, artifact2])
stats2 = get_stats(swh_storage)
assert stats == stats2 # ensure we share the storage
actual_load_status2 = loader2.load()
assert actual_load_status2["status"] == "eventful"
assert actual_load_status2["snapshot_id"] is not None
stats2 = get_stats(swh_storage)
assert {
"content": len(_expected_new_contents_first_visit) + 14,
"directory": len(_expected_new_directories_first_visit) + 8,
"origin": 1,
"origin_visit": 1 + 1,
"release": len(_expected_new_releases_first_visit) + 1,
"revision": 0,
"skipped_content": 0,
"snapshot": 1 + 1,
} == stats2
assert_last_visit_matches(swh_storage, url, status="full", type="tar")
urls = [
m.url
for m in requests_mock_datadir.request_history
if m.url.startswith("https://ftp.gnu.org")
]
# 1 artifact (2nd time no modification) + 1 new artifact
assert len(urls) == 2
def test_archive_2_visits_without_change_not_gnu(swh_storage, requests_mock_datadir):
- """Load a project archive (not gnu) ends up with 1 snapshot
-
- """
+ """Load a project archive (not gnu) ends up with 1 snapshot"""
url = "https://something.else.org/8sync/"
artifacts = [ # this is not a gnu artifact
{
"time": "1999-12-09T09:53:30+00:00", # it's also not a timestamp
"sha256": "d5d1051e59b2be6f065a9fc6aedd3a391e44d0274b78b9bb4e2b57a09134dbe4", # noqa
# keep a gnu artifact reference to avoid adding other test files
"url": "https://ftp.gnu.org/gnu/8sync/8sync-0.2.0.tar.gz",
"length": 238466,
"filename": "8sync-0.2.0.tar.gz",
"version": "0.2.0",
}
]
# Here the loader defines a custom extid manifest format used to check
# artifact existence in the snapshot, instead of the archive loader's default
loader = ArchiveLoader(
swh_storage,
url,
artifacts=artifacts,
extid_manifest_format="$sha256 $length $url",
)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(swh_storage, url, status="full", type="tar")
actual_load_status2 = loader.load()
assert actual_load_status2["status"] == "uneventful"
assert actual_load_status2["snapshot_id"] == actual_load_status["snapshot_id"]
assert_last_visit_matches(swh_storage, url, status="full", type="tar")
urls = [
m.url
for m in requests_mock_datadir.request_history
if m.url.startswith("https://ftp.gnu.org")
]
assert len(urls) == 1
def test_archive_extid():
- """Compute primary key should return the right identity
-
- """
+ """Compute primary key should return the right identity"""
@attr.s
class TestPackageInfo(ArchivePackageInfo):
a = attr.ib()
b = attr.ib()
metadata = GNU_ARTIFACTS[0]
p_info = TestPackageInfo(
- raw_info={**metadata, "a": 1, "b": 2}, a=1, b=2, **metadata,
+ raw_info={**metadata, "a": 1, "b": 2},
+ a=1,
+ b=2,
+ **metadata,
)
for manifest_format, expected_manifest in [
(string.Template("$a $b"), b"1 2"),
(string.Template(""), b""),
(None, "{time} {length} {version} {url}".format(**metadata).encode()),
]:
actual_id = p_info.extid(manifest_format=manifest_format)
assert actual_id == (
"package-manifest-sha256",
0,
hashlib.sha256(expected_manifest).digest(),
)
with pytest.raises(KeyError):
p_info.extid(manifest_format=string.Template("$a $unknown_key"))
def test_archive_snapshot_append(swh_storage, requests_mock_datadir):
# first loading with a first artifact
artifact1 = GNU_ARTIFACTS[0]
loader = ArchiveLoader(swh_storage, URL, [artifact1], snapshot_append=True)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(swh_storage, URL, status="full", type="tar")
# check expected snapshot
snapshot = loader.last_snapshot()
assert len(snapshot.branches) == 2
branch_artifact1_name = f"releases/{artifact1['version']}".encode()
assert b"HEAD" in snapshot.branches
assert branch_artifact1_name in snapshot.branches
assert snapshot.branches[b"HEAD"].target == branch_artifact1_name
# second loading with a second artifact
artifact2 = GNU_ARTIFACTS[1]
loader = ArchiveLoader(swh_storage, URL, [artifact2], snapshot_append=True)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(swh_storage, URL, status="full", type="tar")
# check expected snapshot, should contain a new branch and the
# branch for the first artifact
snapshot = loader.last_snapshot()
assert len(snapshot.branches) == 3
branch_artifact2_name = f"releases/{artifact2['version']}".encode()
assert b"HEAD" in snapshot.branches
assert branch_artifact2_name in snapshot.branches
assert branch_artifact1_name in snapshot.branches
assert snapshot.branches[b"HEAD"].target == branch_artifact2_name
def test_archive_snapshot_append_branch_override(swh_storage, requests_mock_datadir):
# first loading for a first artifact
artifact1 = GNU_ARTIFACTS[0]
loader = ArchiveLoader(swh_storage, URL, [artifact1], snapshot_append=True)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(swh_storage, URL, status="full", type="tar")
# check expected snapshot
snapshot = loader.last_snapshot()
assert len(snapshot.branches) == 2
branch_artifact1_name = f"releases/{artifact1['version']}".encode()
assert branch_artifact1_name in snapshot.branches
branch_target_first_visit = snapshot.branches[branch_artifact1_name].target
# second loading for a second artifact with same version as the first one
# but with different tarball content
artifact2 = dict(GNU_ARTIFACTS[0])
artifact2["url"] = GNU_ARTIFACTS[1]["url"]
artifact2["time"] = GNU_ARTIFACTS[1]["time"]
artifact2["length"] = GNU_ARTIFACTS[1]["length"]
loader = ArchiveLoader(swh_storage, URL, [artifact2], snapshot_append=True)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(swh_storage, URL, status="full", type="tar")
# check expected snapshot, should contain the same branch as previously
# but with different target
snapshot = loader.last_snapshot()
assert len(snapshot.branches) == 2
assert branch_artifact1_name in snapshot.branches
branch_target_second_visit = snapshot.branches[branch_artifact1_name].target
assert branch_target_first_visit != branch_target_second_visit
@pytest.fixture
def not_gzipped_tarball_bytes(datadir):
return Path(datadir, "not_gzipped_tarball.tar.gz").read_bytes()
def test_archive_not_gzipped_tarball(
swh_storage, requests_mock, not_gzipped_tarball_bytes
):
"""Check that a tarball erroneously marked as gzip compressed can still
be downloaded and processed.
"""
filename = "not_gzipped_tarball.tar.gz"
url = f"https://example.org/ftp/{filename}"
requests_mock.get(
url,
[
- {"exc": ContentDecodingError,},
- {"body": BytesIO(not_gzipped_tarball_bytes),},
+ {
+ "exc": ContentDecodingError,
+ },
+ {
+ "body": BytesIO(not_gzipped_tarball_bytes),
+ },
],
)
loader = ArchiveLoader(
swh_storage,
url,
artifacts=[
{
"time": 944729610,
"url": url,
"length": 221837,
"filename": filename,
"version": "0.1.0",
}
],
)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
snapshot = loader.last_snapshot()
assert len(snapshot.branches) == 2
assert b"releases/0.1.0" in snapshot.branches
diff --git a/swh/loader/package/cran/loader.py b/swh/loader/package/cran/loader.py
index 1bef983..0239ee2 100644
--- a/swh/loader/package/cran/loader.py
+++ b/swh/loader/package/cran/loader.py
@@ -1,181 +1,179 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
from datetime import timezone
import logging
import os
from os import path
import re
import string
from typing import Any, Dict, Iterator, List, Optional, Tuple
import attr
import dateutil.parser
from debian.deb822 import Deb822
from swh.loader.package.loader import BasePackageInfo, PackageLoader
from swh.loader.package.utils import release_name
from swh.model.model import ObjectType, Person, Release, Sha1Git, TimestampWithTimezone
from swh.storage.interface import StorageInterface
logger = logging.getLogger(__name__)
DATE_PATTERN = re.compile(r"^(?P<year>\d{4})-(?P<month>\d{2})$")
@attr.s
class CRANPackageInfo(BasePackageInfo):
raw_info = attr.ib(type=Dict[str, Any])
name = attr.ib(type=str)
EXTID_TYPE = "cran-sha256"
MANIFEST_FORMAT = string.Template("$version $url")
@classmethod
def from_metadata(cls, a_metadata: Dict[str, Any]) -> "CRANPackageInfo":
url = a_metadata["url"]
return CRANPackageInfo(
url=url,
filename=path.basename(url),
raw_info=a_metadata,
name=a_metadata["package"],
version=a_metadata["version"],
)
class CRANLoader(PackageLoader[CRANPackageInfo]):
visit_type = "cran"
def __init__(
self,
storage: StorageInterface,
url: str,
artifacts: List[Dict],
max_content_size: Optional[int] = None,
):
"""Loader constructor.
Args:
url: Origin url to retrieve cran artifact(s) from
artifacts: List of associated artifacts for the origin url
"""
super().__init__(storage=storage, url=url, max_content_size=max_content_size)
# make explicit what we consider the artifact identity
self.artifacts = artifacts
def get_versions(self) -> List[str]:
versions = []
for artifact in self.artifacts:
versions.append(artifact["version"])
return versions
def get_default_version(self) -> str:
return self.artifacts[-1]["version"]
def get_package_info(self, version: str) -> Iterator[Tuple[str, CRANPackageInfo]]:
for a_metadata in self.artifacts:
p_info = CRANPackageInfo.from_metadata(a_metadata)
if version == p_info.version:
yield release_name(version), p_info
def build_release(
self, p_info: CRANPackageInfo, uncompressed_path: str, directory: Sha1Git
) -> Optional[Release]:
# a_metadata is empty
metadata = extract_intrinsic_metadata(uncompressed_path)
date = parse_date(metadata.get("Date"))
author = Person.from_fullname(metadata.get("Maintainer", "").encode())
msg = (
f"Synthetic release for CRAN source package {p_info.name} "
f"version {p_info.version}\n"
)
return Release(
name=p_info.version.encode(),
message=msg.encode(),
date=date,
author=author,
target_type=ObjectType.DIRECTORY,
target=directory,
synthetic=True,
)
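# Illustrative sketch, not part of this module: instantiating CRANLoader with
# the artifact keys used by CRANPackageInfo.from_metadata ("url", "version",
# "package"). The storage object and urls are placeholders for the example.
def _example_cran_loader(storage: StorageInterface) -> Dict[str, Any]:
    loader = CRANLoader(
        storage,
        url="https://cran.example.org/package=mypackage",
        artifacts=[
            {
                "url": "https://cran.example.org/src/contrib/mypackage_1.0.tar.gz",
                "version": "1.0",
                "package": "mypackage",
            }
        ],
    )
    return loader.load()  # {"status": ..., "snapshot_id": ...}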
def parse_debian_control(filepath: str) -> Dict[str, Any]:
"""Parse debian control at filepath"""
metadata: Dict = {}
logger.debug("Debian control file %s", filepath)
for paragraph in Deb822.iter_paragraphs(open(filepath, "rb")):
logger.debug("paragraph: %s", paragraph)
metadata.update(**paragraph)
logger.debug("metadata parsed: %s", metadata)
return metadata
def extract_intrinsic_metadata(dir_path: str) -> Dict[str, Any]:
"""Given an uncompressed path holding the DESCRIPTION file, returns a
DESCRIPTION parsed structure as a dict.
CRAN origins describe their intrinsic metadata within a DESCRIPTION file
at the root of the tarball tree. This DESCRIPTION uses a simple file format
called DCF, the Debian control format.
The release artifact contains a single folder at its root. For example:
$ tar tvf zprint-0.0.6.tar.gz
drwxr-xr-x root/root 0 2018-08-22 11:01 zprint-0.0.6/
...
Args:
dir_path (str): Path to the uncompressed directory
representing a release artifact from CRAN.
Returns:
the DESCRIPTION parsed structure as a dict (or empty dict if missing)
"""
# Retrieve the root folder of the archive
if not os.path.exists(dir_path):
return {}
lst = os.listdir(dir_path)
if len(lst) != 1:
return {}
project_dirname = lst[0]
description_path = os.path.join(dir_path, project_dirname, "DESCRIPTION")
if not os.path.exists(description_path):
return {}
return parse_debian_control(description_path)
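# Illustrative sketch, not part of this module: typical flow from a downloaded
# CRAN tarball to its parsed DESCRIPTION metadata. The paths are placeholders;
# `uncompress` comes from swh.core.tarball.
def _example_cran_metadata(tarball_path: str, tmp_dir: str) -> Dict[str, Any]:
    from swh.core.tarball import uncompress

    # yields tmp_dir/<package>-<version>/DESCRIPTION
    uncompress(tarball_path, dest=tmp_dir)
    metadata = extract_intrinsic_metadata(tmp_dir)
    # e.g. metadata.get("Package"), metadata.get("Version"), metadata.get("Date")
    return metadata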
def parse_date(date: Optional[str]) -> Optional[TimestampWithTimezone]:
- """Parse a date into a datetime
-
- """
+ """Parse a date into a datetime"""
assert not date or isinstance(date, str)
dt: Optional[datetime.datetime] = None
if not date:
return None
try:
specific_date = DATE_PATTERN.match(date)
if specific_date:
year = int(specific_date.group("year"))
month = int(specific_date.group("month"))
dt = datetime.datetime(year, month, 1)
else:
dt = dateutil.parser.parse(date)
if not dt.tzinfo:
# up for discussion the timezone needs to be set or
# normalize_timestamp is not happy: ValueError: normalize_timestamp
# received datetime without timezone: 2001-06-08 00:00:00
dt = dt.replace(tzinfo=timezone.utc)
except Exception as e:
logger.warning("Fail to parse date %s. Reason: %s", date, e)
if dt:
return TimestampWithTimezone.from_datetime(dt)
else:
return None
diff --git a/swh/loader/package/cran/tests/test_cran.py b/swh/loader/package/cran/tests/test_cran.py
index 526ecdc..b1518e6 100644
--- a/swh/loader/package/cran/tests/test_cran.py
+++ b/swh/loader/package/cran/tests/test_cran.py
@@ -1,420 +1,426 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
from datetime import datetime, timezone
import os
from os import path
from unittest.mock import patch
from dateutil.tz import tzlocal
import pytest
from swh.core.tarball import uncompress
from swh.loader.package.cran.loader import (
CRANLoader,
extract_intrinsic_metadata,
parse_date,
parse_debian_control,
)
from swh.loader.tests import assert_last_visit_matches, check_snapshot, get_stats
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
ObjectType,
Person,
Release,
Snapshot,
SnapshotBranch,
TargetType,
TimestampWithTimezone,
)
RELEASE_ID = hash_to_bytes("daaf3cffedac946060de53648994631d0b3c63bc")
SNAPSHOT = Snapshot(
id=hash_to_bytes("c0ccd6452cbe9cd4f0a523b23f09c411bd92ef4e"),
branches={
b"HEAD": SnapshotBranch(
target=b"releases/2.22-6", target_type=TargetType.ALIAS
),
b"releases/2.22-6": SnapshotBranch(
- target=RELEASE_ID, target_type=TargetType.RELEASE,
+ target=RELEASE_ID,
+ target_type=TargetType.RELEASE,
),
},
)
def test_cran_parse_date():
data = [
# parsable, some have debatable results though
("2001-June-08", datetime(2001, 6, 8, 0, 0, tzinfo=timezone.utc)),
(
"Tue Dec 27 15:06:08 PST 2011",
datetime(2011, 12, 27, 15, 6, 8, tzinfo=timezone.utc),
),
("8-14-2013", datetime(2013, 8, 14, 0, 0, tzinfo=timezone.utc)),
("2011-01", datetime(2011, 1, 1, 0, 0, tzinfo=timezone.utc)),
("201109", datetime(2009, 11, 20, 0, 0, tzinfo=timezone.utc)),
("04-12-2014", datetime(2014, 4, 12, 0, 0, tzinfo=timezone.utc)),
(
"2018-08-24, 10:40:10",
datetime(2018, 8, 24, 10, 40, 10, tzinfo=timezone.utc),
),
("2013-October-16", datetime(2013, 10, 16, 0, 0, tzinfo=timezone.utc)),
("Aug 23, 2013", datetime(2013, 8, 23, 0, 0, tzinfo=timezone.utc)),
("27-11-2014", datetime(2014, 11, 27, 0, 0, tzinfo=timezone.utc)),
("2019-09-26,", datetime(2019, 9, 26, 0, 0, tzinfo=timezone.utc)),
("9/25/2014", datetime(2014, 9, 25, 0, 0, tzinfo=timezone.utc)),
(
"Fri Jun 27 17:23:53 2014",
datetime(2014, 6, 27, 17, 23, 53, tzinfo=timezone.utc),
),
("28-04-2014", datetime(2014, 4, 28, 0, 0, tzinfo=timezone.utc)),
("04-14-2014", datetime(2014, 4, 14, 0, 0, tzinfo=timezone.utc)),
(
"2019-05-08 14:17:31 UTC",
datetime(2019, 5, 8, 14, 17, 31, tzinfo=timezone.utc),
),
(
"Wed May 21 13:50:39 CEST 2014",
datetime(2014, 5, 21, 13, 50, 39, tzinfo=tzlocal()),
),
(
"2018-04-10 00:01:04 KST",
datetime(2018, 4, 10, 0, 1, 4, tzinfo=timezone.utc),
),
("2019-08-25 10:45", datetime(2019, 8, 25, 10, 45, tzinfo=timezone.utc)),
("March 9, 2015", datetime(2015, 3, 9, 0, 0, tzinfo=timezone.utc)),
("Aug. 18, 2012", datetime(2012, 8, 18, 0, 0, tzinfo=timezone.utc)),
("2014-Dec-17", datetime(2014, 12, 17, 0, 0, tzinfo=timezone.utc)),
("March 01, 2013", datetime(2013, 3, 1, 0, 0, tzinfo=timezone.utc)),
("2017-04-08.", datetime(2017, 4, 8, 0, 0, tzinfo=timezone.utc)),
("2014-Apr-22", datetime(2014, 4, 22, 0, 0, tzinfo=timezone.utc)),
(
"Mon Jan 12 19:54:04 2015",
datetime(2015, 1, 12, 19, 54, 4, tzinfo=timezone.utc),
),
("May 22, 2014", datetime(2014, 5, 22, 0, 0, tzinfo=timezone.utc)),
(
"2014-08-12 09:55:10 EDT",
datetime(2014, 8, 12, 9, 55, 10, tzinfo=timezone.utc),
),
# unparsable
("Fabruary 21, 2012", None),
('2019-05-28"', None),
("2017-03-01 today", None),
("2016-11-0110.1093/icesjms/fsw182", None),
("2019-07-010", None),
("2015-02.23", None),
("20013-12-30", None),
("2016-08-017", None),
("2019-02-07l", None),
("2018-05-010", None),
("2019-09-27 KST", None),
("$Date$", None),
("2019-09-27 KST", None),
("2019-06-22 $Date$", None),
("$Date: 2013-01-18 12:49:03 -0600 (Fri, 18 Jan 2013) $", None),
("2015-7-013", None),
("2018-05-023", None),
("Check NEWS file for changes: news(package='simSummary')", None),
]
for date, expected_date in data:
actual_tstz = parse_date(date)
if expected_date is None:
assert actual_tstz is None, date
else:
expected_tstz = TimestampWithTimezone.from_datetime(expected_date)
assert actual_tstz == expected_tstz, date
@pytest.mark.fs
def test_cran_extract_intrinsic_metadata(tmp_path, datadir):
"""Parsing existing archive's PKG-INFO should yield results"""
uncompressed_archive_path = str(tmp_path)
# sample url
# https://cran.r-project.org/src_contrib_1.4.0_Recommended_KernSmooth_2.22-6.tar.gz # noqa
archive_path = path.join(
datadir,
"https_cran.r-project.org",
"src_contrib_1.4.0_Recommended_KernSmooth_2.22-6.tar.gz",
)
uncompress(archive_path, dest=uncompressed_archive_path)
actual_metadata = extract_intrinsic_metadata(uncompressed_archive_path)
expected_metadata = {
"Package": "KernSmooth",
"Priority": "recommended",
"Version": "2.22-6",
"Date": "2001-June-08",
"Title": "Functions for kernel smoothing for Wand & Jones (1995)",
"Author": "S original by Matt Wand.\n\tR port by Brian Ripley <ripley@stats.ox.ac.uk>.", # noqa
"Maintainer": "Brian Ripley <ripley@stats.ox.ac.uk>",
"Description": 'functions for kernel smoothing (and density estimation)\n corresponding to the book: \n Wand, M.P. and Jones, M.C. (1995) "Kernel Smoothing".', # noqa
"License": "Unlimited use and distribution (see LICENCE).",
"URL": "http://www.biostat.harvard.edu/~mwand",
}
assert actual_metadata == expected_metadata
@pytest.mark.fs
def test_cran_extract_intrinsic_metadata_failures(tmp_path):
"""Parsing inexistent path/archive/PKG-INFO yield None"""
# inexistent first level path
assert extract_intrinsic_metadata("/something-inexistent") == {}
# inexistent second level path (as expected by pypi archives)
assert extract_intrinsic_metadata(tmp_path) == {}
# inexistent PKG-INFO within second level path
existing_path_no_pkginfo = str(tmp_path / "something")
os.mkdir(existing_path_no_pkginfo)
assert extract_intrinsic_metadata(tmp_path) == {}
def test_cran_one_visit(swh_storage, requests_mock_datadir):
version = "2.22-6"
base_url = "https://cran.r-project.org"
origin_url = f"{base_url}/Packages/Recommended_KernSmooth/index.html"
artifact_url = (
f"{base_url}/src_contrib_1.4.0_Recommended_KernSmooth_{version}.tar.gz" # noqa
)
loader = CRANLoader(
swh_storage,
origin_url,
artifacts=[
{
"url": artifact_url,
"version": version,
"package": "Recommended_KernSmooth",
}
],
)
actual_load_status = loader.load()
assert actual_load_status == {
"status": "eventful",
"snapshot_id": SNAPSHOT.id.hex(),
}
assert_last_visit_matches(
swh_storage, origin_url, status="full", type="cran", snapshot=SNAPSHOT.id
)
check_snapshot(SNAPSHOT, swh_storage)
assert swh_storage.release_get([RELEASE_ID])[0] == Release(
id=RELEASE_ID,
name=b"2.22-6",
message=(
b"Synthetic release for CRAN source package "
b"Recommended_KernSmooth version 2.22-6\n"
),
target=hash_to_bytes("ff64177fea3f4a5136b9caf7581a4f7d4cf65296"),
target_type=ObjectType.DIRECTORY,
synthetic=True,
author=Person(
fullname=b"Brian Ripley <ripley@stats.ox.ac.uk>",
name=b"Brian Ripley",
email=b"ripley@stats.ox.ac.uk",
),
date=TimestampWithTimezone.from_datetime(
datetime(2001, 6, 8, 0, 0, tzinfo=timezone.utc)
),
)
visit_stats = get_stats(swh_storage)
assert {
"content": 33,
"directory": 7,
"origin": 1,
"origin_visit": 1,
"release": 1,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == visit_stats
urls = [
m.url
for m in requests_mock_datadir.request_history
if m.url.startswith(base_url)
]
# the artifact url was fetched exactly once
assert len(urls) == 1
def test_cran_2_visits_same_origin(swh_storage, requests_mock_datadir):
"""Multiple visits on the same origin, only 1 archive fetch"""
version = "2.22-6"
base_url = "https://cran.r-project.org"
origin_url = f"{base_url}/Packages/Recommended_KernSmooth/index.html"
artifact_url = (
f"{base_url}/src_contrib_1.4.0_Recommended_KernSmooth_{version}.tar.gz" # noqa
)
loader = CRANLoader(
swh_storage,
origin_url,
artifacts=[
{
"url": artifact_url,
"version": version,
"package": "Recommended_KernSmooth",
}
],
)
# first visit
actual_load_status = loader.load()
assert actual_load_status == {
"status": "eventful",
"snapshot_id": SNAPSHOT.id.hex(),
}
check_snapshot(SNAPSHOT, swh_storage)
assert_last_visit_matches(
swh_storage, origin_url, status="full", type="cran", snapshot=SNAPSHOT.id
)
visit_stats = get_stats(swh_storage)
assert {
"content": 33,
"directory": 7,
"origin": 1,
"origin_visit": 1,
"release": 1,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == visit_stats
# second visit
actual_load_status2 = loader.load()
assert actual_load_status2 == {
"status": "uneventful",
"snapshot_id": SNAPSHOT.id.hex(),
}
assert_last_visit_matches(
- swh_storage, origin_url, status="full", type="cran", snapshot=SNAPSHOT.id,
+ swh_storage,
+ origin_url,
+ status="full",
+ type="cran",
+ snapshot=SNAPSHOT.id,
)
visit_stats2 = get_stats(swh_storage)
visit_stats["origin_visit"] += 1
assert visit_stats2 == visit_stats, "same stats as 1st visit, +1 visit"
urls = [
m.url
for m in requests_mock_datadir.request_history
if m.url.startswith(base_url)
]
assert len(urls) == 1, "visited one time artifact url (across 2 visits)"
def test_cran_parse_debian_control(datadir):
description_file = os.path.join(datadir, "description", "acepack")
actual_metadata = parse_debian_control(description_file)
assert actual_metadata == {
"Package": "acepack",
"Maintainer": "Shawn Garbett",
"Version": "1.4.1",
"Author": "Phil Spector, Jerome Friedman, Robert Tibshirani...",
"Description": "Two nonparametric methods for multiple regression...",
"Title": "ACE & AVAS 4 Selecting Multiple Regression Transformations",
"License": "MIT + file LICENSE",
"Suggests": "testthat",
"Packaged": "2016-10-28 15:38:59 UTC; garbetsp",
"Repository": "CRAN",
"Date/Publication": "2016-10-29 00:11:52",
"NeedsCompilation": "yes",
}
def test_cran_parse_debian_control_unicode_issue(datadir):
# iso-8859-1 caused failure, now fixed
description_file = os.path.join(datadir, "description", "KnownBR")
actual_metadata = parse_debian_control(description_file)
assert actual_metadata == {
"Package": "KnowBR",
"Version": "2.0",
"Title": """Discriminating Well Surveyed Spatial Units from Exhaustive
Biodiversity Databases""",
"Author": "Cástor Guisande González and Jorge M. Lobo",
"Maintainer": "Cástor Guisande González <castor@email.es>",
"Description": "It uses species accumulation curves and diverse estimators...",
"License": "GPL (>= 2)",
"Encoding": "latin1",
"Depends": "R (>= 3.0), fossil, mgcv, plotrix, sp, vegan",
"Suggests": "raster, rgbif",
"NeedsCompilation": "no",
"Packaged": "2019-01-30 13:27:29 UTC; castor",
"Repository": "CRAN",
"Date/Publication": "2019-01-31 20:53:50 UTC",
}
@pytest.mark.parametrize(
"method_name",
- ["build_extrinsic_snapshot_metadata", "build_extrinsic_origin_metadata",],
+ [
+ "build_extrinsic_snapshot_metadata",
+ "build_extrinsic_origin_metadata",
+ ],
)
def test_cran_fail_to_build_or_load_extrinsic_metadata(
method_name, swh_storage, requests_mock_datadir
):
- """problem during loading: {visit: failed, status: failed, no snapshot}
-
- """
+ """problem during loading: {visit: failed, status: failed, no snapshot}"""
version = "2.22-6"
base_url = "https://cran.r-project.org"
origin_url = f"{base_url}/Packages/Recommended_KernSmooth/index.html"
artifact_url = (
f"{base_url}/src_contrib_1.4.0_Recommended_KernSmooth_{version}.tar.gz" # noqa
)
full_method_name = f"swh.loader.package.cran.loader.CRANLoader.{method_name}"
with patch(
full_method_name,
side_effect=ValueError("Fake to fail to build or load extrinsic metadata"),
):
loader = CRANLoader(
swh_storage,
origin_url,
artifacts=[
{
"url": artifact_url,
"version": version,
"package": "Recommended_KernSmooth",
}
],
)
actual_load_status = loader.load()
assert actual_load_status == {
"status": "failed",
"snapshot_id": SNAPSHOT.id.hex(),
}
visit_stats = get_stats(swh_storage)
assert {
"content": 33,
"directory": 7,
"origin": 1,
"origin_visit": 1,
"release": 1,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == visit_stats
assert_last_visit_matches(
swh_storage, origin_url, status="partial", type="cran", snapshot=SNAPSHOT.id
)
diff --git a/swh/loader/package/cran/tests/test_tasks.py b/swh/loader/package/cran/tests/test_tasks.py
index f944f0c..ae8a604 100644
--- a/swh/loader/package/cran/tests/test_tasks.py
+++ b/swh/loader/package/cran/tests/test_tasks.py
@@ -1,23 +1,24 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
def test_tasks_cran_loader(
mocker, swh_scheduler_celery_app, swh_scheduler_celery_worker, swh_config
):
mock_load = mocker.patch("swh.loader.package.cran.loader.CRANLoader.load")
mock_load.return_value = {"status": "eventful"}
res = swh_scheduler_celery_app.send_task(
"swh.loader.package.cran.tasks.LoadCRAN",
kwargs=dict(
- url="some-url", artifacts=[{"version": "1.2.3", "url": "artifact-url"}],
+ url="some-url",
+ artifacts=[{"version": "1.2.3", "url": "artifact-url"}],
),
)
assert res
res.wait()
assert res.successful()
assert mock_load.called
assert res.result == {"status": "eventful"}
diff --git a/swh/loader/package/debian/loader.py b/swh/loader/package/debian/loader.py
index b5ec9dc..284a473 100644
--- a/swh/loader/package/debian/loader.py
+++ b/swh/loader/package/debian/loader.py
@@ -1,462 +1,467 @@
# Copyright (C) 2017-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import logging
from os import path
import re
import subprocess
from typing import Any, Dict, Iterator, List, Mapping, Optional, Sequence, Tuple
import attr
from dateutil.parser import parse as parse_date
from debian.changelog import Changelog
from debian.deb822 import Dsc
from swh.loader.package.loader import BasePackageInfo, PackageLoader, PartialExtID
from swh.loader.package.utils import download, release_name
from swh.model.hashutil import hash_to_bytes
from swh.model.model import ObjectType, Person, Release, Sha1Git, TimestampWithTimezone
from swh.storage.interface import StorageInterface
logger = logging.getLogger(__name__)
UPLOADERS_SPLIT = re.compile(r"(?<=\>)\s*,\s*")
EXTID_TYPE = "dsc-sha256"
EXTID_VERSION = 1
class DscCountError(ValueError):
"""Raised when an unexpected number of .dsc files is seen"""
pass
@attr.s
class DebianFileMetadata:
name = attr.ib(type=str)
"""Filename"""
size = attr.ib(type=int)
uri = attr.ib(type=str)
"""URL of this specific file"""
# all checksums are not always available, make them optional
sha256 = attr.ib(type=str, default="")
md5sum = attr.ib(type=str, default="")
sha1 = attr.ib(type=str, default="")
# Some of the DSC files imported in swh apparently had a Checksums-SHA512
# field which got recorded in the archive. Current versions of dpkg-source
# don't seem to generate them, but keep the field available for
# future-proofing.
sha512 = attr.ib(type=str, default="")
@attr.s
class DebianPackageChangelog:
person = attr.ib(type=Dict[str, str])
"""A dict with fields like, model.Person, except they are str instead
of bytes, and 'email' is optional."""
date = attr.ib(type=str)
"""Date of the changelog entry."""
history = attr.ib(type=List[Tuple[str, str]])
"""List of tuples (package_name, version)"""
@attr.s
class DebianPackageInfo(BasePackageInfo):
raw_info = attr.ib(type=Dict[str, Any])
files = attr.ib(type=Dict[str, DebianFileMetadata])
"""Metadata of the files (.deb, .dsc, ...) of the package."""
name = attr.ib(type=str)
intrinsic_version = attr.ib(type=str)
"""eg. ``0.7.2-3``, while :attr:`version` would be ``stretch/contrib/0.7.2-3``"""
@classmethod
def from_metadata(
cls, a_metadata: Dict[str, Any], url: str, version: str
) -> "DebianPackageInfo":
intrinsic_version = a_metadata["version"]
assert "/" in version and "/" not in intrinsic_version, (
version,
intrinsic_version,
)
return cls(
url=url,
filename=None,
version=version,
raw_info=a_metadata,
files={
file_name: DebianFileMetadata(**file_metadata)
for (file_name, file_metadata) in a_metadata.get("files", {}).items()
},
name=a_metadata["name"],
intrinsic_version=intrinsic_version,
)
def extid(self) -> Optional[PartialExtID]:
dsc_files = [
file for (name, file) in self.files.items() if name.endswith(".dsc")
]
if len(dsc_files) != 1:
raise DscCountError(
f"Expected exactly one .dsc file for package {self.name}, "
f"got {len(dsc_files)}"
)
return (EXTID_TYPE, EXTID_VERSION, hash_to_bytes(dsc_files[0].sha256))
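# A minimal sketch of the partial ExtID produced above, assuming the cicero
# fixtures (URL, PACKAGE_FILES) defined in the tests further down in this diff;
# the helper name is hypothetical.
def _sketch_debian_extid() -> None:
    from swh.loader.package.debian.tests.test_debian import PACKAGE_FILES, URL

    p_info = DebianPackageInfo.from_metadata(
        PACKAGE_FILES, url=URL, version="stretch/contrib/0.7.2-3"
    )
    # The ExtID is simply the raw sha256 of the package's single .dsc file.
    assert p_info.extid() == (
        EXTID_TYPE,
        EXTID_VERSION,
        hash_to_bytes(PACKAGE_FILES["files"]["cicero_0.7.2-3.dsc"]["sha256"]),
    )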
@attr.s
class IntrinsicPackageMetadata:
"""Metadata extracted from a package's .dsc file."""
name = attr.ib(type=str)
version = attr.ib(type=str)
changelog = attr.ib(type=DebianPackageChangelog)
maintainers = attr.ib(type=List[Dict[str, str]])
"""A list of dicts with fields like, model.Person, except they are str instead
of bytes, and 'email' is optional."""
class DebianLoader(PackageLoader[DebianPackageInfo]):
- """Load debian origins into swh archive.
-
- """
+ """Load debian origins into swh archive."""
visit_type = "deb"
def __init__(
self,
storage: StorageInterface,
url: str,
packages: Mapping[str, Any],
max_content_size: Optional[int] = None,
):
"""Debian Loader implementation.
Args:
url: Origin url (e.g. deb://Debian/packages/cicero)
date: Ignored
packages: versioned packages and associated artifacts, example::
{
'stretch/contrib/0.7.2-3': {
'name': 'cicero',
'version': '0.7.2-3'
'files': {
'cicero_0.7.2-3.diff.gz': {
'md5sum': 'a93661b6a48db48d59ba7d26796fc9ce',
'name': 'cicero_0.7.2-3.diff.gz',
'sha256': 'f039c9642fe15c75bed5254315e2a29f...',
'size': 3964,
'uri': 'http://d.d.o/cicero_0.7.2-3.diff.gz',
},
'cicero_0.7.2-3.dsc': {
'md5sum': 'd5dac83eb9cfc9bb52a15eb618b4670a',
'name': 'cicero_0.7.2-3.dsc',
'sha256': '35b7f1048010c67adfd8d70e4961aefb...',
'size': 1864,
'uri': 'http://d.d.o/cicero_0.7.2-3.dsc',
},
'cicero_0.7.2.orig.tar.gz': {
'md5sum': '4353dede07c5728319ba7f5595a7230a',
'name': 'cicero_0.7.2.orig.tar.gz',
'sha256': '63f40f2436ea9f67b44e2d4bd669dbab...',
'size': 96527,
'uri': 'http://d.d.o/cicero_0.7.2.orig.tar.gz',
}
},
},
# ...
}
"""
super().__init__(storage=storage, url=url, max_content_size=max_content_size)
self.packages = packages
def get_versions(self) -> Sequence[str]:
"""Returns the keys of the packages input (e.g.
- stretch/contrib/0.7.2-3, etc...)
+ stretch/contrib/0.7.2-3, etc...)
"""
return list(self.packages.keys())
def get_package_info(self, version: str) -> Iterator[Tuple[str, DebianPackageInfo]]:
meta = self.packages[version]
p_info = DebianPackageInfo.from_metadata(meta, url=self.url, version=version)
yield release_name(version), p_info
def download_package(
self, p_info: DebianPackageInfo, tmpdir: str
) -> List[Tuple[str, Mapping]]:
"""Contrary to other package loaders (1 package, 1 artifact),
`p_info.files` represents the package's datafiles set to fetch:
- <package-version>.orig.tar.gz
- <package-version>.dsc
- <package-version>.diff.gz
This is delegated to the `download_package` function.
"""
all_hashes = download_package(p_info, tmpdir)
logger.debug("all_hashes: %s", all_hashes)
res = []
for hashes in all_hashes.values():
res.append((tmpdir, hashes))
logger.debug("res: %s", res)
return res
def uncompress(
self, dl_artifacts: List[Tuple[str, Mapping[str, Any]]], dest: str
) -> str:
logger.debug("dl_artifacts: %s", dl_artifacts)
return extract_package(dl_artifacts, dest=dest)
def build_release(
- self, p_info: DebianPackageInfo, uncompressed_path: str, directory: Sha1Git,
+ self,
+ p_info: DebianPackageInfo,
+ uncompressed_path: str,
+ directory: Sha1Git,
) -> Optional[Release]:
dsc_url, dsc_name = dsc_information(p_info)
if not dsc_name:
raise ValueError("dsc name for url %s should not be None" % dsc_url)
dsc_path = path.join(path.dirname(uncompressed_path), dsc_name)
intrinsic_metadata = get_intrinsic_package_metadata(
p_info, dsc_path, uncompressed_path
)
logger.debug("intrinsic_metadata: %s", intrinsic_metadata)
logger.debug("p_info: %s", p_info)
msg = (
f"Synthetic release for Debian source package {p_info.name} "
f"version {p_info.intrinsic_version}\n"
)
author = prepare_person(intrinsic_metadata.changelog.person)
date = TimestampWithTimezone.from_iso8601(intrinsic_metadata.changelog.date)
# inspired from swh.loader.debian.converters.package_metadata_to_revision
return Release(
name=p_info.intrinsic_version.encode(),
message=msg.encode(),
author=author,
date=date,
target=directory,
target_type=ObjectType.DIRECTORY,
synthetic=True,
)
def uid_to_person(uid: str) -> Dict[str, str]:
"""Convert an uid to a person suitable for insertion.
Args:
uid: an uid of the form "Name <email@ddress>"
Returns:
a dictionary with the following keys:
- name: the name associated to the uid
- email: the mail associated to the uid
- fullname: the actual uid input
"""
person = Person.from_fullname(uid.encode("utf-8"))
return {k: v.decode("utf-8") for k, v in person.to_dict().items() if v is not None}
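# A minimal usage sketch for uid_to_person (hypothetical uid), mirroring
# test_debian_uid_to_person further down in this diff:
def _sketch_uid_to_person() -> None:
    assert uid_to_person("Jane Doe <jane@example.org>") == {
        "name": "Jane Doe",
        "email": "jane@example.org",
        "fullname": "Jane Doe <jane@example.org>",
    }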
def prepare_person(person: Mapping[str, str]) -> Person:
"""Prepare person for swh serialization...
Args:
A person dict
Returns:
A person ready for storage
"""
return Person.from_dict(
{key: value.encode("utf-8") for (key, value) in person.items()}
)
def download_package(p_info: DebianPackageInfo, tmpdir: Any) -> Mapping[str, Any]:
"""Fetch a source package in a temporary directory and check the checksums
for all files.
Args:
p_info: Information on a package
tmpdir: Where to download and extract the files to ingest
Returns:
Dict of swh hashes per filename key
"""
all_hashes = {}
for filename, fileinfo in p_info.files.items():
uri = fileinfo.uri
logger.debug("fileinfo: %s", fileinfo)
extrinsic_hashes = {"md5": fileinfo.md5sum}
if fileinfo.sha256:
extrinsic_hashes["sha256"] = fileinfo.sha256
if fileinfo.sha1:
extrinsic_hashes["sha1"] = fileinfo.sha1
logger.debug("extrinsic_hashes(%s): %s", filename, extrinsic_hashes)
_, hashes = download(
uri, dest=tmpdir, filename=filename, hashes=extrinsic_hashes
)
all_hashes[filename] = hashes
logger.debug("all_hashes: %s", all_hashes)
return all_hashes
def dsc_information(p_info: DebianPackageInfo) -> Tuple[Optional[str], Optional[str]]:
"""Retrieve dsc information from a package.
Args:
p_info: Package metadata information
Returns:
Tuple of the dsc file's uri and the dsc file's name
"""
dsc_name = None
dsc_url = None
for filename, fileinfo in p_info.files.items():
if filename.endswith(".dsc"):
if dsc_name:
raise DscCountError(
"Package %s_%s references several dsc files."
% (p_info.name, p_info.intrinsic_version)
)
dsc_url = fileinfo.uri
dsc_name = filename
return dsc_url, dsc_name
def extract_package(dl_artifacts: List[Tuple[str, Mapping]], dest: str) -> str:
"""Extract a Debian source package to a given directory.
Note that after extraction the target directory will be the root of the
extracted package, rather than containing it.
Args:
dl_artifacts: list of (path, hashes) tuples for the downloaded package artifacts
dest: directory where the package files are stored
Returns:
Package extraction directory
"""
a_path = dl_artifacts[0][0]
logger.debug("dl_artifacts: %s", dl_artifacts)
for _, hashes in dl_artifacts:
logger.debug("hashes: %s", hashes)
filename = hashes["filename"]
if filename.endswith(".dsc"):
dsc_name = filename
break
dsc_path = path.join(a_path, dsc_name)
destdir = path.join(dest, "extracted")
logfile = path.join(dest, "extract.log")
logger.debug(
"extract Debian source package %s in %s" % (dsc_path, destdir),
- extra={"swh_type": "deb_extract", "swh_dsc": dsc_path, "swh_destdir": destdir,},
+ extra={
+ "swh_type": "deb_extract",
+ "swh_dsc": dsc_path,
+ "swh_destdir": destdir,
+ },
)
cmd = [
"dpkg-source",
"--no-copy",
"--no-check",
"--ignore-bad-version",
"-x",
dsc_path,
destdir,
]
try:
with open(logfile, "w") as stdout:
subprocess.check_call(cmd, stdout=stdout, stderr=subprocess.STDOUT)
except subprocess.CalledProcessError as e:
logdata = open(logfile, "r").read()
raise ValueError(
"dpkg-source exited with code %s: %s" % (e.returncode, logdata)
) from None
return destdir
def get_intrinsic_package_metadata(
p_info: DebianPackageInfo, dsc_path: str, extracted_path: str
) -> IntrinsicPackageMetadata:
"""Get the package metadata from the source package at dsc_path,
extracted in extracted_path.
Args:
p_info: the package information
dsc_path: path to the package's dsc file
extracted_path: the path where the package got extracted
Returns:
an IntrinsicPackageMetadata object holding the package name, version,
maintainers and changelog, including the history as a list of
(package_name, package_version) tuples parsed from the changelog
"""
with open(dsc_path, "rb") as dsc:
parsed_dsc = Dsc(dsc)
# Parse the changelog to retrieve the rest of the package information
changelog_path = path.join(extracted_path, "debian/changelog")
with open(changelog_path, "rb") as changelog_file:
try:
parsed_changelog = Changelog(changelog_file)
except UnicodeDecodeError:
logger.warning(
"Unknown encoding for changelog %s,"
" falling back to iso" % changelog_path,
extra={
"swh_type": "deb_changelog_encoding",
"swh_name": p_info.name,
"swh_version": str(p_info.version),
"swh_changelog": changelog_path,
},
)
# need to reset as Changelog scrolls to the end of the file
changelog_file.seek(0)
parsed_changelog = Changelog(changelog_file, encoding="iso-8859-15")
history: List[Tuple[str, str]] = []
for block in parsed_changelog:
assert block.package is not None
history.append((block.package, str(block.version)))
changelog = DebianPackageChangelog(
person=uid_to_person(parsed_changelog.author),
date=parse_date(parsed_changelog.date).isoformat(),
history=history[1:],
)
maintainers = [
uid_to_person(parsed_dsc["Maintainer"]),
]
maintainers.extend(
uid_to_person(person)
for person in UPLOADERS_SPLIT.split(parsed_dsc.get("Uploaders", ""))
)
return IntrinsicPackageMetadata(
name=p_info.name,
version=str(p_info.intrinsic_version),
changelog=changelog,
maintainers=maintainers,
)
diff --git a/swh/loader/package/debian/tests/test_debian.py b/swh/loader/package/debian/tests/test_debian.py
index 3042870..0864ac2 100644
--- a/swh/loader/package/debian/tests/test_debian.py
+++ b/swh/loader/package/debian/tests/test_debian.py
@@ -1,542 +1,551 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
from copy import deepcopy
import datetime
import hashlib
import logging
from os import path
import pytest
import requests
from swh.loader.package.debian.loader import (
DebianLoader,
DebianPackageChangelog,
DebianPackageInfo,
IntrinsicPackageMetadata,
download_package,
dsc_information,
extract_package,
get_intrinsic_package_metadata,
prepare_person,
uid_to_person,
)
from swh.loader.tests import assert_last_visit_matches, check_snapshot, get_stats
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
ObjectType,
Person,
Release,
Snapshot,
SnapshotBranch,
TargetType,
TimestampWithTimezone,
)
logger = logging.getLogger(__name__)
URL = "deb://Debian/packages/cicero"
PACKAGE_FILES = {
"name": "cicero",
"version": "0.7.2-3",
"files": {
"cicero_0.7.2-3.diff.gz": {
"md5sum": "a93661b6a48db48d59ba7d26796fc9ce",
"name": "cicero_0.7.2-3.diff.gz",
"sha256": "f039c9642fe15c75bed5254315e2a29f9f2700da0e29d9b0729b3ffc46c8971c", # noqa
"size": 3964,
"uri": "http://deb.debian.org/debian/pool/contrib/c/cicero/cicero_0.7.2-3.diff.gz", # noqa
},
"cicero_0.7.2-3.dsc": {
"md5sum": "d5dac83eb9cfc9bb52a15eb618b4670a",
"name": "cicero_0.7.2-3.dsc",
"sha256": "35b7f1048010c67adfd8d70e4961aefd8800eb9a83a4d1cc68088da0009d9a03", # noqa
"size": 1864,
"uri": "http://deb.debian.org/debian/pool/contrib/c/cicero/cicero_0.7.2-3.dsc", # noqa
}, # noqa
"cicero_0.7.2.orig.tar.gz": {
"md5sum": "4353dede07c5728319ba7f5595a7230a",
"name": "cicero_0.7.2.orig.tar.gz",
"sha256": "63f40f2436ea9f67b44e2d4bd669dbabe90e2635a204526c20e0b3c8ee957786", # noqa
"size": 96527,
"uri": "http://deb.debian.org/debian/pool/contrib/c/cicero/cicero_0.7.2.orig.tar.gz", # noqa
},
},
}
PACKAGE_FILES2 = {
"name": "cicero",
"version": "0.7.2-4",
"files": {
"cicero_0.7.2-4.diff.gz": {
"md5sum": "1e7e6fc4a59d57c98082a3af78145734",
"name": "cicero_0.7.2-4.diff.gz",
"sha256": "2e6fa296ee7005473ff58d0971f4fd325617b445671480e9f2cfb738d5dbcd01", # noqa
"size": 4038,
"uri": "http://deb.debian.org/debian/pool/contrib/c/cicero/cicero_0.7.2-4.diff.gz", # noqa
},
"cicero_0.7.2-4.dsc": {
"md5sum": "1a6c8855a73b4282bb31d15518f18cde",
"name": "cicero_0.7.2-4.dsc",
"sha256": "913ee52f7093913420de5cbe95d63cfa817f1a1daf997961149501894e754f8b", # noqa
"size": 1881,
"uri": "http://deb.debian.org/debian/pool/contrib/c/cicero/cicero_0.7.2-4.dsc", # noqa
}, # noqa
"cicero_0.7.2.orig.tar.gz": {
"md5sum": "4353dede07c5728319ba7f5595a7230a",
"name": "cicero_0.7.2.orig.tar.gz",
"sha256": "63f40f2436ea9f67b44e2d4bd669dbabe90e2635a204526c20e0b3c8ee957786", # noqa
"size": 96527,
"uri": "http://deb.debian.org/debian/pool/contrib/c/cicero/cicero_0.7.2.orig.tar.gz", # noqa
},
},
}
PACKAGE_PER_VERSION = {
"stretch/contrib/0.7.2-3": PACKAGE_FILES,
}
PACKAGES_PER_VERSION = {
"stretch/contrib/0.7.2-3": PACKAGE_FILES,
"buster/contrib/0.7.2-4": PACKAGE_FILES2,
}
def test_debian_first_visit(swh_storage, requests_mock_datadir):
- """With no prior visit, load a gnu project ends up with 1 snapshot
-
- """
- loader = DebianLoader(swh_storage, URL, packages=PACKAGE_PER_VERSION,)
+ """With no prior visit, load a gnu project ends up with 1 snapshot"""
+ loader = DebianLoader(
+ swh_storage,
+ URL,
+ packages=PACKAGE_PER_VERSION,
+ )
actual_load_status = loader.load()
expected_snapshot_id = "f9e4d0d200433dc998ad2ca40ee1244785fe6ed1"
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id,
}
assert_last_visit_matches(
swh_storage,
URL,
status="full",
type="deb",
snapshot=hash_to_bytes(expected_snapshot_id),
)
release_id = hash_to_bytes("de96ae3d3e136f5c1709117059e2a2c05b8ee5ae")
expected_snapshot = Snapshot(
id=hash_to_bytes(expected_snapshot_id),
branches={
b"releases/stretch/contrib/0.7.2-3": SnapshotBranch(
- target_type=TargetType.RELEASE, target=release_id,
+ target_type=TargetType.RELEASE,
+ target=release_id,
)
},
) # different than the previous loader as no release is done
check_snapshot(expected_snapshot, swh_storage)
assert swh_storage.release_get([release_id])[0] == Release(
id=release_id,
name=b"0.7.2-3",
message=b"Synthetic release for Debian source package cicero version 0.7.2-3\n",
target=hash_to_bytes("798df511408c53bf842a8e54d4d335537836bdc3"),
target_type=ObjectType.DIRECTORY,
synthetic=True,
author=Person(
fullname=b"Samuel Thibault <sthibault@debian.org>",
name=b"Samuel Thibault",
email=b"sthibault@debian.org",
),
date=TimestampWithTimezone.from_datetime(
datetime.datetime(
2014,
10,
19,
16,
52,
35,
tzinfo=datetime.timezone(datetime.timedelta(seconds=7200)),
)
),
)
stats = get_stats(swh_storage)
assert {
"content": 42,
"directory": 2,
"origin": 1,
"origin_visit": 1,
"release": 1, # all artifacts under 1 release
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_debian_first_visit_then_another_visit(swh_storage, requests_mock_datadir):
- """With no prior visit, load a debian project ends up with 1 snapshot
-
- """
- loader = DebianLoader(swh_storage, URL, packages=PACKAGE_PER_VERSION,)
+ """With no prior visit, load a debian project ends up with 1 snapshot"""
+ loader = DebianLoader(
+ swh_storage,
+ URL,
+ packages=PACKAGE_PER_VERSION,
+ )
actual_load_status = loader.load()
expected_snapshot_id = "f9e4d0d200433dc998ad2ca40ee1244785fe6ed1"
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id,
}
assert_last_visit_matches(
swh_storage,
URL,
status="full",
type="deb",
snapshot=hash_to_bytes(expected_snapshot_id),
)
expected_snapshot = Snapshot(
id=hash_to_bytes(expected_snapshot_id),
branches={
b"releases/stretch/contrib/0.7.2-3": SnapshotBranch(
target_type=TargetType.RELEASE,
target=hash_to_bytes("de96ae3d3e136f5c1709117059e2a2c05b8ee5ae"),
)
},
) # different than the previous loader as no release is done
check_snapshot(expected_snapshot, swh_storage)
stats = get_stats(swh_storage)
assert {
"content": 42,
"directory": 2,
"origin": 1,
"origin_visit": 1,
"release": 1, # all artifacts under 1 release
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
# No change in between load
actual_load_status2 = loader.load()
assert actual_load_status2["status"] == "uneventful"
assert_last_visit_matches(
swh_storage,
URL,
status="full",
type="deb",
snapshot=hash_to_bytes(expected_snapshot_id),
)
stats2 = get_stats(swh_storage)
assert {
"content": 42 + 0,
"directory": 2 + 0,
"origin": 1,
"origin_visit": 1 + 1, # a new visit occurred
"release": 1,
"revision": 0,
"skipped_content": 0,
"snapshot": 1, # same snapshot across 2 visits
} == stats2
urls = [
m.url
for m in requests_mock_datadir.request_history
if m.url.startswith("http://deb.debian.org")
]
# each package artifact was downloaded only once across the 2 visits
assert len(urls) == len(set(urls))
def test_debian_uid_to_person():
uid = "Someone Name <someone@orga.org>"
actual_person = uid_to_person(uid)
assert actual_person == {
"name": "Someone Name",
"email": "someone@orga.org",
"fullname": uid,
}
def test_debian_prepare_person():
actual_author = prepare_person(
{
"name": "Someone Name",
"email": "someone@orga.org",
"fullname": "Someone Name <someone@orga.org>",
}
)
assert actual_author == Person(
name=b"Someone Name",
email=b"someone@orga.org",
fullname=b"Someone Name <someone@orga.org>",
)
def test_debian_download_package(datadir, tmpdir, requests_mock_datadir):
tmpdir = str(tmpdir) # py3.5 workaround (LocalPath issue)
p_info = DebianPackageInfo.from_metadata(
PACKAGE_FILES, url=URL, version="stretch/contrib/0.7.2-3"
)
all_hashes = download_package(p_info, tmpdir)
assert all_hashes == {
"cicero_0.7.2-3.diff.gz": {
"checksums": {
"md5": "a93661b6a48db48d59ba7d26796fc9ce",
"sha1": "0815282053f21601b0ec4adf7a8fe47eace3c0bc",
"sha256": "f039c9642fe15c75bed5254315e2a29f9f2700da0e29d9b0729b3ffc46c8971c", # noqa
},
"filename": "cicero_0.7.2-3.diff.gz",
"length": 3964,
"url": (
"http://deb.debian.org/debian/pool/contrib/c/cicero/"
"cicero_0.7.2-3.diff.gz"
),
},
"cicero_0.7.2-3.dsc": {
"checksums": {
"md5": "d5dac83eb9cfc9bb52a15eb618b4670a",
"sha1": "abbec4e8efbbc80278236e1dd136831eac08accd",
"sha256": "35b7f1048010c67adfd8d70e4961aefd8800eb9a83a4d1cc68088da0009d9a03", # noqa
},
"filename": "cicero_0.7.2-3.dsc",
"length": 1864,
"url": (
"http://deb.debian.org/debian/pool/contrib/c/cicero/cicero_0.7.2-3.dsc"
),
},
"cicero_0.7.2.orig.tar.gz": {
"checksums": {
"md5": "4353dede07c5728319ba7f5595a7230a",
"sha1": "a286efd63fe2c9c9f7bb30255c3d6fcdcf390b43",
"sha256": "63f40f2436ea9f67b44e2d4bd669dbabe90e2635a204526c20e0b3c8ee957786", # noqa
},
"filename": "cicero_0.7.2.orig.tar.gz",
"length": 96527,
"url": (
"http://deb.debian.org/debian/pool/contrib/c/cicero/"
"cicero_0.7.2.orig.tar.gz"
),
},
}
def test_debian_dsc_information_ok():
fname = "cicero_0.7.2-3.dsc"
p_info = DebianPackageInfo.from_metadata(
PACKAGE_FILES, url=URL, version="stretch/contrib/0.7.2-3"
)
dsc_url, dsc_name = dsc_information(p_info)
assert dsc_url == PACKAGE_FILES["files"][fname]["uri"]
assert dsc_name == PACKAGE_FILES["files"][fname]["name"]
def test_debian_dsc_information_not_found():
fname = "cicero_0.7.2-3.dsc"
p_info = DebianPackageInfo.from_metadata(
PACKAGE_FILES, url=URL, version="stretch/contrib/0.7.2-3"
)
p_info.files.pop(fname)
dsc_url, dsc_name = dsc_information(p_info)
assert dsc_url is None
assert dsc_name is None
def test_debian_dsc_information_missing_md5sum():
package_files = deepcopy(PACKAGE_FILES)
for package_metadata in package_files["files"].values():
del package_metadata["md5sum"]
p_info = DebianPackageInfo.from_metadata(
package_files, url=URL, version="stretch/contrib/0.7.2-3"
)
for debian_file_metadata in p_info.files.values():
assert not debian_file_metadata.md5sum
def test_debian_dsc_information_extra_sha1(requests_mock_datadir):
package_files = deepcopy(PACKAGE_FILES)
for package_metadata in package_files["files"].values():
file_bytes = requests.get(package_metadata["uri"]).content
package_metadata["sha1"] = hashlib.sha1(file_bytes).hexdigest()
p_info = DebianPackageInfo.from_metadata(
package_files, url=URL, version="stretch/contrib/0.7.2-3"
)
for debian_file_metadata in p_info.files.values():
assert debian_file_metadata.sha1
def test_debian_dsc_information_too_many_dsc_entries():
# craft an extra dsc file
fname = "cicero_0.7.2-3.dsc"
p_info = DebianPackageInfo.from_metadata(
PACKAGE_FILES, url=URL, version="stretch/contrib/0.7.2-3"
)
data = p_info.files[fname]
fname2 = fname.replace("cicero", "ciceroo")
p_info.files[fname2] = data
with pytest.raises(
ValueError,
match="Package %s_%s references several dsc"
% (PACKAGE_FILES["name"], PACKAGE_FILES["version"]),
):
dsc_information(p_info)
def test_debian_get_intrinsic_package_metadata(
requests_mock_datadir, datadir, tmp_path
):
tmp_path = str(tmp_path) # py3.5 compat.
p_info = DebianPackageInfo.from_metadata(
PACKAGE_FILES, url=URL, version="stretch/contrib/0.7.2-3"
)
logger.debug("p_info: %s", p_info)
# download the packages
all_hashes = download_package(p_info, tmp_path)
# Retrieve information from package
_, dsc_name = dsc_information(p_info)
dl_artifacts = [(tmp_path, hashes) for hashes in all_hashes.values()]
# Extract information from package
extracted_path = extract_package(dl_artifacts, tmp_path)
# Retrieve information on package
dsc_path = path.join(path.dirname(extracted_path), dsc_name)
actual_package_info = get_intrinsic_package_metadata(
p_info, dsc_path, extracted_path
)
logger.debug("actual_package_info: %s", actual_package_info)
assert actual_package_info == IntrinsicPackageMetadata(
changelog=DebianPackageChangelog(
date="2014-10-19T16:52:35+02:00",
history=[
("cicero", "0.7.2-2"),
("cicero", "0.7.2-1"),
("cicero", "0.7-1"),
],
person={
"email": "sthibault@debian.org",
"fullname": "Samuel Thibault <sthibault@debian.org>",
"name": "Samuel Thibault",
},
),
maintainers=[
{
"email": "debian-accessibility@lists.debian.org",
"fullname": "Debian Accessibility Team "
"<debian-accessibility@lists.debian.org>",
"name": "Debian Accessibility Team",
},
{
"email": "sthibault@debian.org",
"fullname": "Samuel Thibault <sthibault@debian.org>",
"name": "Samuel Thibault",
},
],
name="cicero",
version="0.7.2-3",
)
def test_debian_multiple_packages(swh_storage, requests_mock_datadir):
- loader = DebianLoader(swh_storage, URL, packages=PACKAGES_PER_VERSION,)
+ loader = DebianLoader(
+ swh_storage,
+ URL,
+ packages=PACKAGES_PER_VERSION,
+ )
actual_load_status = loader.load()
expected_snapshot_id = "474c0e3d5796d15363031c333533527d659c559e"
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id,
}
assert_last_visit_matches(
swh_storage,
URL,
status="full",
type="deb",
snapshot=hash_to_bytes(expected_snapshot_id),
)
expected_snapshot = Snapshot(
id=hash_to_bytes(expected_snapshot_id),
branches={
b"releases/stretch/contrib/0.7.2-3": SnapshotBranch(
target_type=TargetType.RELEASE,
target=hash_to_bytes("de96ae3d3e136f5c1709117059e2a2c05b8ee5ae"),
),
b"releases/buster/contrib/0.7.2-4": SnapshotBranch(
target_type=TargetType.RELEASE,
target=hash_to_bytes("11824484c585319302ea4fde4917faf78dfb1973"),
),
},
)
check_snapshot(expected_snapshot, swh_storage)
def test_debian_loader_only_md5_sum_in_dsc(swh_storage, requests_mock_datadir):
packages_per_version = deepcopy(PACKAGES_PER_VERSION)
for package_files in packages_per_version.values():
for package_data in package_files["files"].values():
del package_data["sha256"]
loader = DebianLoader(swh_storage, URL, packages=packages_per_version)
actual_load_status = loader.load()
expected_snapshot_id = "474c0e3d5796d15363031c333533527d659c559e"
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id,
}
assert_last_visit_matches(
swh_storage,
URL,
status="full",
type="deb",
snapshot=hash_to_bytes(expected_snapshot_id),
)
expected_snapshot = Snapshot(
id=hash_to_bytes(expected_snapshot_id),
branches={
b"releases/stretch/contrib/0.7.2-3": SnapshotBranch(
target_type=TargetType.RELEASE,
target=hash_to_bytes("de96ae3d3e136f5c1709117059e2a2c05b8ee5ae"),
),
b"releases/buster/contrib/0.7.2-4": SnapshotBranch(
target_type=TargetType.RELEASE,
target=hash_to_bytes("11824484c585319302ea4fde4917faf78dfb1973"),
),
},
)
check_snapshot(expected_snapshot, swh_storage)
diff --git a/swh/loader/package/deposit/loader.py b/swh/loader/package/deposit/loader.py
index 794f33d..c679291 100644
--- a/swh/loader/package/deposit/loader.py
+++ b/swh/loader/package/deposit/loader.py
@@ -1,381 +1,376 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
from datetime import timezone
import json
import logging
from typing import Any, Dict, Iterator, List, Mapping, Optional, Sequence, Tuple, Union
import attr
import requests
from swh.core.config import load_from_envvar
from swh.loader.core.loader import DEFAULT_CONFIG
from swh.loader.package.loader import (
BasePackageInfo,
PackageLoader,
RawExtrinsicMetadataCore,
)
from swh.loader.package.utils import cached_method, download
from swh.model.hashutil import hash_to_bytes, hash_to_hex
from swh.model.model import (
MetadataAuthority,
MetadataAuthorityType,
MetadataFetcher,
ObjectType,
Person,
Release,
Sha1Git,
TimestampWithTimezone,
)
from swh.storage.algos.snapshot import snapshot_get_all_branches
from swh.storage.interface import StorageInterface
logger = logging.getLogger(__name__)
def now() -> datetime.datetime:
return datetime.datetime.now(tz=timezone.utc)
@attr.s
class DepositPackageInfo(BasePackageInfo):
filename = attr.ib(type=str) # instead of Optional[str]
author_date = attr.ib(type=datetime.datetime)
"""codemeta:dateCreated if any, deposit completed_date otherwise"""
commit_date = attr.ib(type=datetime.datetime)
"""codemeta:datePublished if any, deposit completed_date otherwise"""
client = attr.ib(type=str)
id = attr.ib(type=int)
"""Internal ID of the deposit in the deposit DB"""
collection = attr.ib(type=str)
"""The collection in the deposit; see SWORD specification."""
author = attr.ib(type=Person)
committer = attr.ib(type=Person)
release_notes = attr.ib(type=Optional[str])
@classmethod
def from_metadata(
cls, metadata: Dict[str, Any], url: str, filename: str, version: str
) -> "DepositPackageInfo":
# Note:
# `date` and `committer_date` are always transmitted by the deposit read api
# which computes the values itself. The loader needs to use those to create the
# release.
raw_metadata: str = metadata["raw_metadata"]
depo = metadata["deposit"]
return cls(
url=url,
filename=filename,
version=version,
author_date=depo["author_date"],
commit_date=depo["committer_date"],
client=depo["client"],
id=depo["id"],
collection=depo["collection"],
author=parse_author(depo["author"]),
committer=parse_author(depo["committer"]),
release_notes=depo["release_notes"],
directory_extrinsic_metadata=[
RawExtrinsicMetadataCore(
discovery_date=now(),
metadata=raw_metadata.encode(),
format="sword-v2-atom-codemeta-v2",
)
],
)
def extid(self) -> None:
# For now, we don't try to deduplicate deposits. There is little point anyway,
# as it only happens when the exact same tarball was deposited twice.
return None
class DepositLoader(PackageLoader[DepositPackageInfo]):
- """Load a deposited artifact into swh archive.
-
- """
+ """Load a deposited artifact into swh archive."""
visit_type = "deposit"
def __init__(
self,
storage: StorageInterface,
url: str,
deposit_id: str,
deposit_client: "ApiClient",
max_content_size: Optional[int] = None,
default_filename: str = "archive.tar",
):
"""Constructor
Args:
url: Origin url to associate the artifacts/metadata to
deposit_id: Deposit identity
deposit_client: Deposit api client
"""
super().__init__(storage=storage, url=url, max_content_size=max_content_size)
self.deposit_id = deposit_id
self.client = deposit_client
self.default_filename = default_filename
@classmethod
def from_configfile(cls, **kwargs: Any):
"""Instantiate a loader from the configuration loaded from the
SWH_CONFIG_FILENAME envvar, with potential extra keyword arguments if their
value is not None.
Args:
kwargs: kwargs passed to the loader instantiation
"""
config = dict(load_from_envvar(DEFAULT_CONFIG))
config.update({k: v for k, v in kwargs.items() if v is not None})
deposit_client = ApiClient(**config.pop("deposit"))
return cls.from_config(deposit_client=deposit_client, **config)
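# A minimal sketch of the configuration expected behind SWH_CONFIG_FILENAME,
# mirroring the "deposit" block used in tests/conftest.py further down (the
# "memory" storage backend here is an assumption; any swh.storage configuration
# works):
#
#     {
#         "storage": {"cls": "memory"},
#         "deposit": {
#             "url": "https://deposit.softwareheritage.org/1/private",
#             "auth": {"username": "user", "password": "pass"},
#         },
#     }
#
# With such a file in place, instantiation matches test_deposit_from_configfile:
#
#     loader = DepositLoader.from_configfile(
#         url="some-url", deposit_id="666", default_filename="archive.zip"
#     )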
def get_versions(self) -> Sequence[str]:
# only 1 branch 'HEAD' with no alias since we only have 1 snapshot
# branch
return ["HEAD"]
def get_metadata_authority(self) -> MetadataAuthority:
provider = self.metadata()["provider"]
assert provider["provider_type"] == MetadataAuthorityType.DEPOSIT_CLIENT.value
return MetadataAuthority(
type=MetadataAuthorityType.DEPOSIT_CLIENT,
url=provider["provider_url"],
metadata={
"name": provider["provider_name"],
**(provider["metadata"] or {}),
},
)
def get_metadata_fetcher(self) -> MetadataFetcher:
tool = self.metadata()["tool"]
return MetadataFetcher(
- name=tool["name"], version=tool["version"], metadata=tool["configuration"],
+ name=tool["name"],
+ version=tool["version"],
+ metadata=tool["configuration"],
)
def get_package_info(
self, version: str
) -> Iterator[Tuple[str, DepositPackageInfo]]:
p_info = DepositPackageInfo.from_metadata(
self.metadata(),
url=self.url,
filename=self.default_filename,
version=version,
)
yield "HEAD", p_info
def download_package(
self, p_info: DepositPackageInfo, tmpdir: str
) -> List[Tuple[str, Mapping]]:
- """Override to allow use of the dedicated deposit client
-
- """
+ """Override to allow use of the dedicated deposit client"""
return [self.client.archive_get(self.deposit_id, tmpdir, p_info.filename)]
def build_release(
- self, p_info: DepositPackageInfo, uncompressed_path: str, directory: Sha1Git,
+ self,
+ p_info: DepositPackageInfo,
+ uncompressed_path: str,
+ directory: Sha1Git,
) -> Optional[Release]:
message = (
f"{p_info.client}: Deposit {p_info.id} in collection {p_info.collection}"
)
if p_info.release_notes:
message += "\n\n" + p_info.release_notes
if not message.endswith("\n"):
message += "\n"
return Release(
name=p_info.version.encode(),
message=message.encode(),
author=p_info.author,
date=TimestampWithTimezone.from_dict(p_info.author_date),
target=directory,
target_type=ObjectType.DIRECTORY,
synthetic=True,
)
def get_extrinsic_origin_metadata(self) -> List[RawExtrinsicMetadataCore]:
metadata = self.metadata()
raw_metadata: str = metadata["raw_metadata"]
origin_metadata = json.dumps(
{
"metadata": [raw_metadata],
"provider": metadata["provider"],
"tool": metadata["tool"],
}
).encode()
return [
RawExtrinsicMetadataCore(
discovery_date=now(),
metadata=raw_metadata.encode(),
format="sword-v2-atom-codemeta-v2",
),
RawExtrinsicMetadataCore(
discovery_date=now(),
metadata=origin_metadata,
format="original-artifacts-json",
),
]
@cached_method
def metadata(self):
"""Returns metadata from the deposit server"""
return self.client.metadata_get(self.deposit_id)
def load(self) -> Dict:
# First, make sure the deposit is known on the deposit's RPC server
# prior to triggering the loading
try:
self.metadata()
except ValueError:
logger.error(f"Unknown deposit {self.deposit_id}, ignoring")
return {"status": "failed"}
# Then usual loading
return super().load()
def finalize_visit(
self, status_visit: str, errors: Optional[List[str]] = None, **kwargs
) -> Dict[str, Any]:
r = super().finalize_visit(status_visit=status_visit, **kwargs)
success = status_visit == "full"
# Update deposit status
try:
if not success:
self.client.status_update(
- self.deposit_id, status="failed", errors=errors,
+ self.deposit_id,
+ status="failed",
+ errors=errors,
)
return r
snapshot_id = hash_to_bytes(r["snapshot_id"])
snapshot = snapshot_get_all_branches(self.storage, snapshot_id)
if not snapshot:
return r
branches = snapshot.branches
logger.debug("branches: %s", branches)
if not branches:
return r
rel_id = branches[b"HEAD"].target
release = self.storage.release_get([rel_id])[0]
if not release:
return r
# update the deposit's status to success with its
# release-id and directory-id
self.client.status_update(
self.deposit_id,
status="done",
release_id=hash_to_hex(rel_id),
directory_id=hash_to_hex(release.target),
snapshot_id=r["snapshot_id"],
origin_url=self.url,
)
except Exception:
logger.exception("Problem when trying to update the deposit's status")
return {"status": "failed"}
return r
def parse_author(author) -> Person:
- """See prior fixme
-
- """
+ """See prior fixme"""
return Person(
fullname=author["fullname"].encode("utf-8"),
name=author["name"].encode("utf-8"),
email=author["email"].encode("utf-8"),
)
class ApiClient:
- """Private Deposit Api client
-
- """
+ """Private Deposit Api client"""
def __init__(self, url, auth: Optional[Mapping[str, str]]):
self.base_url = url.rstrip("/")
self.auth = None if not auth else (auth["username"], auth["password"])
def do(self, method: str, url: str, *args, **kwargs):
"""Internal method to deal with requests, possibly with basic http
authentication.
Args:
method (str): supported http methods as in get/post/put
Returns:
The request's execution output
"""
method_fn = getattr(requests, method)
if self.auth:
kwargs["auth"] = self.auth
return method_fn(url, *args, **kwargs)
def archive_get(
self, deposit_id: Union[int, str], tmpdir: str, filename: str
) -> Tuple[str, Dict]:
- """Retrieve deposit's archive artifact locally
-
- """
+ """Retrieve deposit's archive artifact locally"""
url = f"{self.base_url}/{deposit_id}/raw/"
return download(url, dest=tmpdir, filename=filename, auth=self.auth)
def metadata_url(self, deposit_id: Union[int, str]) -> str:
return f"{self.base_url}/{deposit_id}/meta/"
def metadata_get(self, deposit_id: Union[int, str]) -> Dict[str, Any]:
- """Retrieve deposit's metadata artifact as json
-
- """
+ """Retrieve deposit's metadata artifact as json"""
url = self.metadata_url(deposit_id)
r = self.do("get", url)
if r.ok:
return r.json()
msg = f"Problem when retrieving deposit metadata at {url}"
logger.error(msg)
raise ValueError(msg)
def status_update(
self,
deposit_id: Union[int, str],
status: str,
errors: Optional[List[str]] = None,
release_id: Optional[str] = None,
directory_id: Optional[str] = None,
snapshot_id: Optional[str] = None,
origin_url: Optional[str] = None,
):
"""Update deposit's information including status, and persistent
- identifiers result of the loading.
+ identifiers result of the loading.
"""
url = f"{self.base_url}/{deposit_id}/update/"
payload: Dict[str, Any] = {"status": status}
if release_id:
payload["release_id"] = release_id
if directory_id:
payload["directory_id"] = directory_id
if snapshot_id:
payload["snapshot_id"] = snapshot_id
if origin_url:
payload["origin_url"] = origin_url
if errors:
payload["status_detail"] = {"loading": errors}
self.do("put", url, json=payload)
diff --git a/swh/loader/package/deposit/tests/conftest.py b/swh/loader/package/deposit/tests/conftest.py
index a326aa1..6fa1d9b 100644
--- a/swh/loader/package/deposit/tests/conftest.py
+++ b/swh/loader/package/deposit/tests/conftest.py
@@ -1,30 +1,33 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import copy
from typing import Any, Dict
import pytest
from swh.loader.package.deposit.loader import ApiClient
@pytest.fixture
def swh_loader_config(swh_loader_config) -> Dict[str, Any]:
config = copy.deepcopy(swh_loader_config)
config.update(
{
"deposit": {
"url": "https://deposit.softwareheritage.org/1/private",
- "auth": {"username": "user", "password": "pass",},
+ "auth": {
+ "username": "user",
+ "password": "pass",
+ },
},
}
)
return config
@pytest.fixture
def deposit_client(swh_loader_config):
return ApiClient(**swh_loader_config["deposit"])
diff --git a/swh/loader/package/deposit/tests/test_deposit.py b/swh/loader/package/deposit/tests/test_deposit.py
index 64476a4..f1a0921 100644
--- a/swh/loader/package/deposit/tests/test_deposit.py
+++ b/swh/loader/package/deposit/tests/test_deposit.py
@@ -1,557 +1,565 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import json
import re
import pytest
from swh.core.pytest_plugin import requests_mock_datadir_factory
from swh.loader.package.deposit.loader import ApiClient, DepositLoader
from swh.loader.package.loader import now
from swh.loader.tests import assert_last_visit_matches, check_snapshot, get_stats
from swh.model.hashutil import hash_to_bytes, hash_to_hex
from swh.model.model import (
Origin,
Person,
RawExtrinsicMetadata,
Release,
Snapshot,
SnapshotBranch,
TargetType,
TimestampWithTimezone,
)
from swh.model.model import MetadataAuthority, MetadataAuthorityType, MetadataFetcher
from swh.model.model import ObjectType as ModelObjectType
from swh.model.swhids import CoreSWHID, ExtendedObjectType, ExtendedSWHID, ObjectType
DEPOSIT_URL = "https://deposit.softwareheritage.org/1/private"
@pytest.fixture
def requests_mock_datadir(requests_mock_datadir):
"""Enhance default mock data to mock put requests as the loader does some
- internal update queries there.
+ internal update queries there.
"""
requests_mock_datadir.put(re.compile("https"))
return requests_mock_datadir
def test_deposit_init_ok(swh_storage, deposit_client, swh_loader_config):
url = "some-url"
deposit_id = 999
loader = DepositLoader(
swh_storage, url, deposit_id, deposit_client, default_filename="archive.zip"
) # Something that does not exist
assert loader.url == url
assert loader.client is not None
assert loader.client.base_url == swh_loader_config["deposit"]["url"]
def test_deposit_from_configfile(swh_config):
- """Ensure the deposit instantiation is ok
-
- """
+ """Ensure the deposit instantiation is ok"""
loader = DepositLoader.from_configfile(
url="some-url", deposit_id="666", default_filename="archive.zip"
)
assert isinstance(loader.client, ApiClient)
def test_deposit_loading_unknown_deposit(
swh_storage, deposit_client, requests_mock_datadir
):
"""Loading an unknown deposit should fail
no origin, no visit, no snapshot
"""
# private api url form: 'https://deposit.s.o/1/private/hal/666/raw/'
url = "some-url"
unknown_deposit_id = 667
loader = DepositLoader(
swh_storage,
url,
unknown_deposit_id,
deposit_client,
default_filename="archive.zip",
) # does not exist
actual_load_status = loader.load()
assert actual_load_status == {"status": "failed"}
stats = get_stats(loader.storage)
assert {
"content": 0,
"directory": 0,
"origin": 0,
"origin_visit": 0,
"release": 0,
"revision": 0,
"skipped_content": 0,
"snapshot": 0,
} == stats
requests_mock_datadir_missing_one = requests_mock_datadir_factory(
- ignore_urls=[f"{DEPOSIT_URL}/666/raw/",]
+ ignore_urls=[
+ f"{DEPOSIT_URL}/666/raw/",
+ ]
)
def test_deposit_loading_failure_to_retrieve_1_artifact(
swh_storage, deposit_client, requests_mock_datadir_missing_one
):
- """Deposit with missing artifact ends up with an uneventful/partial visit
-
- """
+ """Deposit with missing artifact ends up with an uneventful/partial visit"""
# private api url form: 'https://deposit.s.o/1/private/hal/666/raw/'
url = "some-url-2"
deposit_id = 666
requests_mock_datadir_missing_one.put(re.compile("https"))
loader = DepositLoader(
swh_storage, url, deposit_id, deposit_client, default_filename="archive.zip"
)
actual_load_status = loader.load()
assert actual_load_status["status"] == "uneventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(loader.storage, url, status="partial", type="deposit")
stats = get_stats(loader.storage)
assert {
"content": 0,
"directory": 0,
"origin": 1,
"origin_visit": 1,
"release": 0,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
# Retrieve the information for deposit status update query to the deposit
urls = [
m
for m in requests_mock_datadir_missing_one.request_history
if m.url == f"{DEPOSIT_URL}/{deposit_id}/update/"
]
assert len(urls) == 1
update_query = urls[0]
body = update_query.json()
expected_body = {
"status": "failed",
"status_detail": {
"loading": [
"Failed to load branch HEAD for some-url-2: Fail to query "
"'https://deposit.softwareheritage.org/1/private/666/raw/'. Reason: 404"
]
},
}
assert body == expected_body
def test_deposit_loading_ok(swh_storage, deposit_client, requests_mock_datadir):
url = "https://hal-test.archives-ouvertes.fr/some-external-id"
deposit_id = 666
loader = DepositLoader(
swh_storage, url, deposit_id, deposit_client, default_filename="archive.zip"
)
actual_load_status = loader.load()
expected_snapshot_id = "338b45d87e02fb5cbf324694bc4a898623d6a30f"
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id,
}
assert_last_visit_matches(
loader.storage,
url,
status="full",
type="deposit",
snapshot=hash_to_bytes(expected_snapshot_id),
)
release_id_hex = "2566a64a27bc00362e265be9666d7606750530a1"
release_id = hash_to_bytes(release_id_hex)
expected_snapshot = Snapshot(
id=hash_to_bytes(expected_snapshot_id),
branches={
- b"HEAD": SnapshotBranch(target=release_id, target_type=TargetType.RELEASE,),
+ b"HEAD": SnapshotBranch(
+ target=release_id,
+ target_type=TargetType.RELEASE,
+ ),
},
)
check_snapshot(expected_snapshot, storage=loader.storage)
release = loader.storage.release_get([release_id])[0]
date = TimestampWithTimezone.from_datetime(
datetime.datetime(2017, 10, 7, 15, 17, 8, tzinfo=datetime.timezone.utc)
)
person = Person(
fullname=b"Software Heritage",
name=b"Software Heritage",
email=b"robot@softwareheritage.org",
)
assert release == Release(
id=release_id,
name=b"HEAD",
message=b"hal: Deposit 666 in collection hal\n",
author=person,
date=date,
target_type=ModelObjectType.DIRECTORY,
target=b"\xfd-\xf1-\xc5SL\x1d\xa1\xe9\x18\x0b\x91Q\x02\xfbo`\x1d\x19",
synthetic=True,
metadata=None,
)
# check metadata
- fetcher = MetadataFetcher(name="swh-deposit", version="0.0.1",)
+ fetcher = MetadataFetcher(
+ name="swh-deposit",
+ version="0.0.1",
+ )
authority = MetadataAuthority(
type=MetadataAuthorityType.DEPOSIT_CLIENT,
url="https://hal-test.archives-ouvertes.fr/",
)
# Check origin metadata
orig_meta = loader.storage.raw_extrinsic_metadata_get(
Origin(url).swhid(), authority
)
assert orig_meta.next_page_token is None
raw_meta = loader.client.metadata_get(deposit_id)
raw_metadata: str = raw_meta["raw_metadata"]
# 2 raw metadata xml + 1 json dict
assert len(orig_meta.results) == 2
orig_meta0 = orig_meta.results[0]
assert orig_meta0.authority == authority
assert orig_meta0.fetcher == fetcher
# Check directory metadata
assert release.target_type == ModelObjectType.DIRECTORY
directory_swhid = CoreSWHID(
object_type=ObjectType.DIRECTORY, object_id=release.target
)
actual_dir_meta = loader.storage.raw_extrinsic_metadata_get(
directory_swhid, authority
)
assert actual_dir_meta.next_page_token is None
assert len(actual_dir_meta.results) == 1
dir_meta = actual_dir_meta.results[0]
assert dir_meta.authority == authority
assert dir_meta.fetcher == fetcher
assert dir_meta.metadata.decode() == raw_metadata
# Retrieve the information for deposit status update query to the deposit
urls = [
m
for m in requests_mock_datadir.request_history
if m.url == f"{DEPOSIT_URL}/{deposit_id}/update/"
]
assert len(urls) == 1
update_query = urls[0]
body = update_query.json()
expected_body = {
"status": "done",
"release_id": release_id_hex,
"directory_id": hash_to_hex(release.target),
"snapshot_id": expected_snapshot_id,
"origin_url": url,
}
assert body == expected_body
stats = get_stats(loader.storage)
assert {
"content": 303,
"directory": 12,
"origin": 1,
"origin_visit": 1,
"release": 1,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_deposit_loading_ok_2(swh_storage, deposit_client, requests_mock_datadir):
- """Field dates should be se appropriately
-
- """
+ """Field dates should be se appropriately"""
external_id = "some-external-id"
url = f"https://hal-test.archives-ouvertes.fr/{external_id}"
deposit_id = 777
loader = DepositLoader(
swh_storage, url, deposit_id, deposit_client, default_filename="archive.zip"
)
actual_load_status = loader.load()
expected_snapshot_id = "3449b8ff31abeacefd33cca60e3074c1649dc3a1"
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id,
}
assert_last_visit_matches(
loader.storage,
url,
status="full",
type="deposit",
snapshot=hash_to_bytes(expected_snapshot_id),
)
release_id = "ba6c9a59ae3256e765d32b211cc183dc2380aed7"
expected_snapshot = Snapshot(
id=hash_to_bytes(expected_snapshot_id),
branches={
b"HEAD": SnapshotBranch(
target=hash_to_bytes(release_id), target_type=TargetType.RELEASE
)
},
)
check_snapshot(expected_snapshot, storage=loader.storage)
raw_meta = loader.client.metadata_get(deposit_id)
# Ensure the date fields are set appropriately in the release
# Retrieve the release
release = loader.storage.release_get([hash_to_bytes(release_id)])[0]
assert release
# swh-deposit uses the numeric 'offset_minutes' instead of the bytes offset
# attribute, because its dates are always well-formed, and it can only send
# JSON-serializable data.
release_date_dict = {
"timestamp": release.date.timestamp.to_dict(),
"offset": release.date.offset_minutes(),
}
assert release_date_dict == raw_meta["deposit"]["author_date"]
assert not release.metadata
provider = {
"provider_name": "hal",
"provider_type": "deposit_client",
"provider_url": "https://hal-test.archives-ouvertes.fr/",
"metadata": None,
}
tool = {
"name": "swh-deposit",
"version": "0.0.1",
"configuration": {"sword_version": "2"},
}
- fetcher = MetadataFetcher(name="swh-deposit", version="0.0.1",)
+ fetcher = MetadataFetcher(
+ name="swh-deposit",
+ version="0.0.1",
+ )
authority = MetadataAuthority(
type=MetadataAuthorityType.DEPOSIT_CLIENT,
url="https://hal-test.archives-ouvertes.fr/",
)
# Check the origin metadata swh side
origin_extrinsic_metadata = loader.storage.raw_extrinsic_metadata_get(
Origin(url).swhid(), authority
)
assert origin_extrinsic_metadata.next_page_token is None
raw_metadata: str = raw_meta["raw_metadata"]
# 1 raw metadata xml + 1 json dict
assert len(origin_extrinsic_metadata.results) == 2
origin_swhid = Origin(url).swhid()
expected_metadata = []
origin_meta = origin_extrinsic_metadata.results[0]
expected_metadata.append(
RawExtrinsicMetadata(
target=origin_swhid,
discovery_date=origin_meta.discovery_date,
metadata=raw_metadata.encode(),
format="sword-v2-atom-codemeta-v2",
authority=authority,
fetcher=fetcher,
)
)
origin_metadata = {
"metadata": [raw_metadata],
"provider": provider,
"tool": tool,
}
expected_metadata.append(
RawExtrinsicMetadata(
target=origin_swhid,
discovery_date=origin_extrinsic_metadata.results[-1].discovery_date,
metadata=json.dumps(origin_metadata).encode(),
format="original-artifacts-json",
authority=authority,
fetcher=fetcher,
)
)
assert sorted(origin_extrinsic_metadata.results) == sorted(expected_metadata)
# Check the release metadata swh side
assert release.target_type == ModelObjectType.DIRECTORY
directory_swhid = ExtendedSWHID(
object_type=ExtendedObjectType.DIRECTORY, object_id=release.target
)
actual_directory_metadata = loader.storage.raw_extrinsic_metadata_get(
directory_swhid, authority
)
assert actual_directory_metadata.next_page_token is None
assert len(actual_directory_metadata.results) == 1
release_swhid = CoreSWHID(
object_type=ObjectType.RELEASE, object_id=hash_to_bytes(release_id)
)
dir_metadata_template = RawExtrinsicMetadata(
target=directory_swhid,
format="sword-v2-atom-codemeta-v2",
authority=authority,
fetcher=fetcher,
origin=url,
release=release_swhid,
# to satisfy the constructor
discovery_date=now(),
metadata=b"",
)
expected_directory_metadata = []
dir_metadata = actual_directory_metadata.results[0]
expected_directory_metadata.append(
RawExtrinsicMetadata.from_dict(
{
**{
k: v
for (k, v) in dir_metadata_template.to_dict().items()
if k != "id"
},
"discovery_date": dir_metadata.discovery_date,
"metadata": raw_metadata.encode(),
}
)
)
assert sorted(actual_directory_metadata.results) == sorted(
expected_directory_metadata
)
# Retrieve the information for deposit status update query to the deposit
urls = [
m
for m in requests_mock_datadir.request_history
if m.url == f"{DEPOSIT_URL}/{deposit_id}/update/"
]
assert len(urls) == 1
update_query = urls[0]
body = update_query.json()
expected_body = {
"status": "done",
"release_id": release_id,
"directory_id": hash_to_hex(release.target),
"snapshot_id": expected_snapshot_id,
"origin_url": url,
}
assert body == expected_body
def test_deposit_loading_ok_3(swh_storage, deposit_client, requests_mock_datadir):
"""Deposit loading can happen on tarball artifacts as well
The latest deposit changes introduce the internal change.
"""
external_id = "hal-123456"
url = f"https://hal-test.archives-ouvertes.fr/{external_id}"
deposit_id = 888
loader = DepositLoader(swh_storage, url, deposit_id, deposit_client)
actual_load_status = loader.load()
expected_snapshot_id = "4677843de89e398f1d6bfedc9ca9b89c451c55c8"
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id,
}
assert_last_visit_matches(
loader.storage,
url,
status="full",
type="deposit",
snapshot=hash_to_bytes(expected_snapshot_id),
)
def test_deposit_loading_ok_release_notes(
swh_storage, deposit_client, requests_mock_datadir
):
url = "https://hal-test.archives-ouvertes.fr/some-external-id"
deposit_id = 999
loader = DepositLoader(
swh_storage, url, deposit_id, deposit_client, default_filename="archive.zip"
)
actual_load_status = loader.load()
expected_snapshot_id = "a307acffb7c29bebb3daf1bcb680bb3f452890a8"
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id,
}
assert_last_visit_matches(
loader.storage,
url,
status="full",
type="deposit",
snapshot=hash_to_bytes(expected_snapshot_id),
)
release_id_hex = "f5e8ec02ede57edbe061afa7fc2a07bb7d14a700"
release_id = hash_to_bytes(release_id_hex)
expected_snapshot = Snapshot(
id=hash_to_bytes(expected_snapshot_id),
branches={
- b"HEAD": SnapshotBranch(target=release_id, target_type=TargetType.RELEASE,),
+ b"HEAD": SnapshotBranch(
+ target=release_id,
+ target_type=TargetType.RELEASE,
+ ),
},
)
check_snapshot(expected_snapshot, storage=loader.storage)
release = loader.storage.release_get([release_id])[0]
date = TimestampWithTimezone.from_datetime(
datetime.datetime(2017, 10, 7, 15, 17, 8, tzinfo=datetime.timezone.utc)
)
person = Person(
fullname=b"Software Heritage",
name=b"Software Heritage",
email=b"robot@softwareheritage.org",
)
assert release == Release(
id=release_id,
name=b"HEAD",
message=(
b"hal: Deposit 999 in collection hal\n\nThis release adds this and that.\n"
),
author=person,
date=date,
target_type=ModelObjectType.DIRECTORY,
target=b"\xfd-\xf1-\xc5SL\x1d\xa1\xe9\x18\x0b\x91Q\x02\xfbo`\x1d\x19",
synthetic=True,
metadata=None,
)
diff --git a/swh/loader/package/deposit/tests/test_tasks.py b/swh/loader/package/deposit/tests/test_tasks.py
index 248b88b..cd63efd 100644
--- a/swh/loader/package/deposit/tests/test_tasks.py
+++ b/swh/loader/package/deposit/tests/test_tasks.py
@@ -1,24 +1,27 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
def test_tasks_deposit_loader(
mocker, swh_scheduler_celery_app, swh_scheduler_celery_worker, swh_config
):
mock_loader = mocker.patch(
"swh.loader.package.deposit.loader.DepositLoader.from_configfile"
)
mock_loader.return_value = mock_loader
mock_loader.load.return_value = {"status": "eventful"}
res = swh_scheduler_celery_app.send_task(
"swh.loader.package.deposit.tasks.LoadDeposit",
- kwargs=dict(url="some-url", deposit_id="some-d-id",),
+ kwargs=dict(
+ url="some-url",
+ deposit_id="some-d-id",
+ ),
)
assert res
res.wait()
assert res.successful()
assert mock_loader.called
assert res.result == {"status": "eventful"}
diff --git a/swh/loader/package/loader.py b/swh/loader/package/loader.py
index 2a73d56..fe4344e 100644
--- a/swh/loader/package/loader.py
+++ b/swh/loader/package/loader.py
@@ -1,1100 +1,1110 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import hashlib
from itertools import islice
import json
import logging
import os
import string
import sys
import tempfile
from typing import (
Any,
Dict,
Generic,
Iterable,
Iterator,
List,
Mapping,
Optional,
Sequence,
Set,
Tuple,
TypeVar,
)
import attr
from requests.exceptions import ContentDecodingError
import sentry_sdk
from swh.core.tarball import uncompress
from swh.loader.core.loader import BaseLoader
from swh.loader.exception import NotFound
from swh.loader.package.utils import download
from swh.model import from_disk
from swh.model.hashutil import hash_to_hex
from swh.model.model import (
ExtID,
MetadataAuthority,
MetadataAuthorityType,
MetadataFetcher,
)
from swh.model.model import (
Origin,
OriginVisit,
OriginVisitStatus,
RawExtrinsicMetadata,
Release,
Revision,
Sha1Git,
Snapshot,
)
from swh.model.model import ObjectType as ModelObjectType
from swh.model.swhids import CoreSWHID, ExtendedObjectType, ExtendedSWHID, ObjectType
from swh.storage.algos.snapshot import snapshot_get_latest
from swh.storage.interface import StorageInterface
from swh.storage.utils import now
logger = logging.getLogger(__name__)
SWH_METADATA_AUTHORITY = MetadataAuthority(
type=MetadataAuthorityType.REGISTRY,
url="https://softwareheritage.org/",
metadata={},
)
"""Metadata authority for extrinsic metadata generated by Software Heritage.
Used for metadata on "original artifacts", i.e. length, filename, and checksums
of downloaded archive files."""
PartialExtID = Tuple[str, int, bytes]
"""The ``extid_type`` and ``extid`` fields of an :class:`ExtID` object."""
@attr.s
class RawExtrinsicMetadataCore:
"""Contains the core of the metadata extracted by a loader, that will be
used to build a full RawExtrinsicMetadata object by adding object identifier,
context, and provenance information."""
format = attr.ib(type=str)
metadata = attr.ib(type=bytes)
discovery_date = attr.ib(type=Optional[datetime.datetime], default=None)
"""Defaults to the visit date."""
@attr.s
class BasePackageInfo:
"""Compute the primary key for a dict using the id_keys as primary key
composite.
Args:
d: A dict entry to compute the primary key on
id_keys: Sequence of keys to use as primary key
Returns:
The identity for that dict entry
"""
url = attr.ib(type=str)
filename = attr.ib(type=Optional[str])
version = attr.ib(type=str)
"""Version name/number."""
MANIFEST_FORMAT: Optional[string.Template] = None
"""If not None, used by the default extid() implementation to format a manifest,
before hashing it to produce an ExtID."""
EXTID_TYPE: str = "package-manifest-sha256"
EXTID_VERSION: int = 0
# The following attribute has kw_only=True in order to allow subclasses
# to add attributes. Without kw_only, attributes without default values cannot
# go after attributes with default values.
# See <https://github.com/python-attrs/attrs/issues/38>
directory_extrinsic_metadata = attr.ib(
- type=List[RawExtrinsicMetadataCore], default=[], kw_only=True,
+ type=List[RawExtrinsicMetadataCore],
+ default=[],
+ kw_only=True,
)
""":term:`extrinsic metadata` collected by the loader, that will be attached to the
loaded directory and added to the Metadata storage."""
# TODO: add support for metadata for releases and contents
def extid(self) -> Optional[PartialExtID]:
"""Returns a unique intrinsic identifier of this package info,
or None if this package info is not 'deduplicatable' (meaning that
we will always load it, instead of checking the ExtID storage
to see if we already did)"""
if self.MANIFEST_FORMAT is None:
return None
else:
manifest = self.MANIFEST_FORMAT.substitute(
{k: str(v) for (k, v) in attr.asdict(self).items()}
)
return (
self.EXTID_TYPE,
self.EXTID_VERSION,
hashlib.sha256(manifest.encode()).digest(),
)
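# Illustrative sketch of the default ExtID mechanism; the subclass, template
# and EXTID_TYPE below are hypothetical:
#
#     @attr.s
#     class ExamplePackageInfo(BasePackageInfo):
#         MANIFEST_FORMAT = string.Template("$url $version")
#         EXTID_TYPE = "example-manifest-sha256"
#
#     p_info = ExamplePackageInfo(
#         url="https://example.org/pkg-1.0.tar.gz", filename=None, version="1.0"
#     )
#     p_info.extid()
#     # -> ("example-manifest-sha256", 0,
#     #     sha256(b"https://example.org/pkg-1.0.tar.gz 1.0").digest())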
TPackageInfo = TypeVar("TPackageInfo", bound=BasePackageInfo)
class PackageLoader(BaseLoader, Generic[TPackageInfo]):
# Origin visit type (str) set by the loader
visit_type = ""
visit_date: datetime.datetime
def __init__(
self,
storage: StorageInterface,
url: str,
max_content_size: Optional[int] = None,
):
"""Loader's constructor. This raises exception if the minimal required
configuration is missing (cf. fn:`check` method).
Args:
storage: Storage instance
url: Origin url to load data from
"""
super().__init__(storage=storage, max_content_size=max_content_size)
self.url = url
self.visit_date = datetime.datetime.now(tz=datetime.timezone.utc)
def get_versions(self) -> Sequence[str]:
"""Return the list of all published package versions.
Raises:
:class:`swh.loader.exception.NotFound` error when failing to read the
published package versions.
Returns:
Sequence of published versions
"""
return []
def get_package_info(self, version: str) -> Iterator[Tuple[str, TPackageInfo]]:
"""Given a release version of a package, retrieve the associated
package information for such version.
Args:
version: Package version
Returns:
(branch name, package metadata)
"""
yield from {}
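# A subclass typically yields one (branch name, package info) pair per
# artifact of that version, e.g. (hypothetical names):
#
#     yield f"releases/{version}", ExamplePackageInfo(
#         url=artifact_url, filename=None, version=version
#     )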
def build_release(
self, p_info: TPackageInfo, uncompressed_path: str, directory: Sha1Git
) -> Optional[Release]:
"""Build the release from the archive metadata (extrinsic
artifact metadata) and the intrinsic metadata.
Args:
p_info: Package information
uncompressed_path: Artifact uncompressed path on disk
"""
raise NotImplementedError("build_release")
def get_default_version(self) -> str:
"""Retrieve the latest release version if any.
Returns:
Latest version
"""
return ""
def last_snapshot(self) -> Optional[Snapshot]:
- """Retrieve the last snapshot out of the last visit.
-
- """
+ """Retrieve the last snapshot out of the last visit."""
return snapshot_get_latest(self.storage, self.url)
def new_packageinfo_to_extid(self, p_info: TPackageInfo) -> Optional[PartialExtID]:
return p_info.extid()
def _get_known_extids(
self, packages_info: List[TPackageInfo]
) -> Dict[PartialExtID, List[CoreSWHID]]:
"""Compute the ExtIDs from new PackageInfo objects, searches which are already
loaded in the archive, and returns them if any."""
# Compute the ExtIDs of all the new packages, grouped by extid type
new_extids: Dict[Tuple[str, int], List[bytes]] = {}
for p_info in packages_info:
res = p_info.extid()
if res is not None:
(extid_type, extid_version, extid_extid) = res
new_extids.setdefault((extid_type, extid_version), []).append(
extid_extid
)
# For each extid type, call extid_get_from_extid() with all the extids of
# that type, and store them in the '(type, extid) -> target' map.
known_extids: Dict[PartialExtID, List[CoreSWHID]] = {}
for ((extid_type, extid_version), extids) in new_extids.items():
for extid in self.storage.extid_get_from_extid(
extid_type, extids, version=extid_version
):
if extid is not None:
key = (extid.extid_type, extid_version, extid.extid)
known_extids.setdefault(key, []).append(extid.target)
return known_extids
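# The returned mapping associates each partial ExtID with the SWHIDs it
# targets, e.g. (values indicative):
# {("maven-jar", 0, b"<sha256 digest>"): [CoreSWHID.from_string("swh:1:rel:...")]}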
def resolve_object_from_extids(
self,
known_extids: Dict[PartialExtID, List[CoreSWHID]],
p_info: TPackageInfo,
whitelist: Set[Sha1Git],
) -> Optional[CoreSWHID]:
"""Resolve the revision/release from known ExtIDs and a package info object.
If the artifact has already been downloaded, this will return the
existing release (or revision) targeting that uncompressed artifact directory.
Otherwise, this returns None.
Args:
known_extids: Dict built from a list of ExtID, with the target as value
p_info: Package information
whitelist: Any ExtID with target not in this set is filtered out
Returns:
None or release/revision SWHID
"""
new_extid = p_info.extid()
if new_extid is None:
return None
extid_targets = set()
for extid_target in known_extids.get(new_extid, []):
if extid_target.object_id not in whitelist:
# There is a known ExtID for this package, but its target is not
# in the snapshot.
# This can happen for three reasons:
#
# 1. a loader crashed after writing the ExtID, but before writing
# the snapshot
# 2. some other loader loaded the same artifact, but produced
# a different revision, causing an additional ExtID object
# to be written. We will probably find this loader's ExtID
# in a future iteration of this loop.
# Note that for now, this is impossible, as each loader has a
# completely different extid_type, but this is an implementation
# detail of each loader.
# 3. we took a snapshot, then the package disappeared,
# then we took another snapshot, and the package reappeared
#
# In case of 1, we must actually load the package now,
# so let's do it.
# TODO: detect when we are in case 3 using release_missing
# or revision_missing instead of the snapshot.
continue
elif extid_target.object_type in (ObjectType.RELEASE, ObjectType.REVISION):
extid_targets.add(extid_target)
else:
# Note that this case should never be reached unless there is a
# collision between a revision hash and some non-revision object's
# hash, but better safe than sorry.
logger.warning(
"%s is in the whitelist, but is not a revision/release.",
hash_to_hex(extid_target.object_id),
)
if extid_targets:
# This is a known package version, as we have an extid to reference it.
# Let's return one of them.
# If there is a release extid, return it.
release_extid_targets = {
extid_target
for extid_target in extid_targets
if extid_target.object_type == ObjectType.RELEASE
}
# Exclude missing targets
missing_releases = {
CoreSWHID(object_type=ObjectType.RELEASE, object_id=id_)
for id_ in self.storage.release_missing(
[swhid.object_id for swhid in release_extid_targets]
)
}
if missing_releases:
logger.error(
"Found ExtIDs pointing to missing releases: %s", missing_releases
)
release_extid_targets -= missing_releases
extid_target2 = self.select_extid_target(p_info, release_extid_targets)
if extid_target2:
return extid_target2
# If there is no release extid (i.e. if the package was only loaded with
# older versions of this loader, which produced revision objects instead
# of releases), return a revision extid when possible.
revision_extid_targets = {
extid_target
for extid_target in extid_targets
if extid_target.object_type == ObjectType.REVISION
}
if revision_extid_targets:
assert len(extid_targets) == 1, extid_targets
extid_target = list(extid_targets)[0]
return extid_target
# No target found (this is probably a new package version)
return None
def select_extid_target(
self, p_info: TPackageInfo, extid_targets: Set[CoreSWHID]
) -> Optional[CoreSWHID]:
"""Given a list of release extid targets, choses one appropriate for the
given package info.
Package loaders shyould implement this if their ExtIDs may map to multiple
releases, so they can fetch releases from the storage and inspect their fields
to select the right one for this ``p_info``.
"""
if extid_targets:
# The base package loader does not have the domain-specific knowledge
# to select the right release -> crash if there is more than one.
assert len(extid_targets) == 1, extid_targets
return list(extid_targets)[0]
return None
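# Hypothetical override for a loader whose ExtIDs can map to several releases,
# picking the release whose name matches the package version:
#
#     def select_extid_target(self, p_info, extid_targets):
#         targets = list(extid_targets)
#         releases = self.storage.release_get([t.object_id for t in targets])
#         for target, release in zip(targets, releases):
#             if release is not None and release.name == p_info.version.encode():
#                 return target
#         return None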
def download_package(
self, p_info: TPackageInfo, tmpdir: str
) -> List[Tuple[str, Mapping]]:
"""Download artifacts for a specific package. All downloads happen in
in the tmpdir folder.
Default implementation expects the artifacts package info to be
about one artifact per package.
Note that most implementation have 1 artifact per package. But some
implementation have multiple artifacts per package (debian), some have
none, the package is the artifact (gnu).
Args:
p_info: Information on the package artifacts to
download (url, filename, etc...)
tmpdir: Location to retrieve such artifacts
Returns:
List of (path, computed hashes)
"""
try:
return [download(p_info.url, dest=tmpdir, filename=p_info.filename)]
except ContentDecodingError:
# package might be erroneously marked as gzip compressed while it is not,
# try to download its raw bytes again without attempting to uncompress
# the input stream
return [
download(
p_info.url,
dest=tmpdir,
filename=p_info.filename,
extra_request_headers={"Accept-Encoding": "identity"},
)
]
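# In this default implementation, each entry is the (path, info) pair returned
# by swh.loader.package.utils.download(): the local file path plus a mapping
# describing the downloaded artifact (notably its checksums).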
def uncompress(
self, dl_artifacts: List[Tuple[str, Mapping[str, Any]]], dest: str
) -> str:
"""Uncompress the artifact(s) in the destination folder dest.
Optionally, this could need to use the p_info dict for some more
information (debian).
"""
uncompressed_path = os.path.join(dest, "src")
for a_path, _ in dl_artifacts:
uncompress(a_path, dest=uncompressed_path)
return uncompressed_path
def extra_branches(self) -> Dict[bytes, Mapping[str, Any]]:
"""Return an extra dict of branches that are used to update the set of
branches.
"""
return {}
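# A hypothetical example, following the branch format expected by
# _load_snapshot():
#
#     return {
#         b"evaluation": {
#             "target_type": "revision",
#             "target": bytes.fromhex("00" * 20),  # dummy revision id
#         }
#     }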
def finalize_visit(
self,
*,
snapshot: Optional[Snapshot],
visit: OriginVisit,
status_visit: str,
status_load: str,
failed_branches: List[str],
errors: Optional[List[str]] = None,
) -> Dict[str, Any]:
"""Finalize the visit:
- flush eventual unflushed data to storage
- update origin visit's status
- return the task's status
"""
self.storage.flush()
snapshot_id: Optional[bytes] = None
if snapshot and snapshot.id: # to avoid setting snapshot_id to b""
snapshot_id = snapshot.id
assert visit.visit
visit_status = OriginVisitStatus(
origin=self.url,
visit=visit.visit,
type=self.visit_type,
date=now(),
status=status_visit,
snapshot=snapshot_id,
)
self.storage.origin_visit_status_add([visit_status])
result: Dict[str, Any] = {
"status": status_load,
}
if snapshot_id:
result["snapshot_id"] = hash_to_hex(snapshot_id)
if failed_branches:
logger.warning("%d failed branches", len(failed_branches))
for i, urls in enumerate(islice(failed_branches, 50)):
prefix_url = "Failed branches: " if i == 0 else ""
logger.warning("%s%s", prefix_url, urls)
return result
def load(self) -> Dict:
"""Load for a specific origin the associated contents.
1. Get the list of versions in an origin.
2. Get the snapshot from the previous run of the loader,
and filter out versions that were already loaded, if their
:term:`extids <extid>` match
Then, for each remaining version in the origin
3. Fetch the files for one package version. By default, this can be
implemented as a simple HTTP request. Loaders with more specific
requirements can override this, e.g.: the PyPI loader checks the
integrity of the downloaded files; the Debian loader has to download
and check several files for one package version.
4. Extract the downloaded files. By default, this would be a universal
archive/tarball extraction.
Loaders for specific formats can override this method (for instance,
the Debian loader uses dpkg-source -x).
5. Convert the extracted directory to a set of Software Heritage
objects, using swh.model.from_disk.
6. Extract the metadata from the unpacked directories. This would only
be applicable for "smart" loaders like npm (parsing the
package.json), PyPI (parsing the PKG-INFO file) or Debian (parsing
debian/changelog and debian/control).
On "minimal-metadata" sources such as the GNU archive, the lister
should provide the minimal set of metadata needed to populate the
revision/release objects (authors, dates) as an argument to the
task.
7. Generate the revision/release objects for the given version, from
the data generated at steps 3 and 4.
end for each
8. Generate and load the snapshot for the visit.
Using the revisions/releases collected at step 7 and the branch
information from step 2., generate a snapshot and load it into the
Software Heritage archive
"""
status_load = "uneventful" # either: eventful, uneventful, failed
status_visit = "full" # see swh.model.model.OriginVisitStatus
snapshot = None
failed_branches: List[str] = []
# Prepare origin and origin_visit
origin = Origin(url=self.url)
try:
self.storage.origin_add([origin])
visit = list(
self.storage.origin_visit_add(
[
OriginVisit(
- origin=self.url, date=self.visit_date, type=self.visit_type,
+ origin=self.url,
+ date=self.visit_date,
+ type=self.visit_type,
)
]
)
)[0]
except Exception as e:
logger.exception("Failed to initialize origin_visit for %s", self.url)
sentry_sdk.capture_exception(e)
return {"status": "failed"}
# Get the previous snapshot for this origin. It is then used to see which
# of the package's versions are already loaded in the archive.
try:
last_snapshot = self.last_snapshot()
logger.debug("last snapshot: %s", last_snapshot)
except Exception as e:
logger.exception("Failed to get previous state for %s", self.url)
sentry_sdk.capture_exception(e)
return self.finalize_visit(
snapshot=snapshot,
visit=visit,
failed_branches=failed_branches,
status_visit="failed",
status_load="failed",
errors=[str(e)],
)
load_exceptions: List[Exception] = []
# Get the list of all version names
try:
versions = self.get_versions()
except NotFound as e:
return self.finalize_visit(
snapshot=snapshot,
visit=visit,
failed_branches=failed_branches,
status_visit="not_found",
status_load="failed",
errors=[str(e)],
)
except Exception as e:
return self.finalize_visit(
snapshot=snapshot,
visit=visit,
failed_branches=failed_branches,
status_visit="failed",
status_load="failed",
errors=[str(e)],
)
# Get the metadata of each version's package
packages_info: List[Tuple[str, TPackageInfo]] = [
(branch_name, p_info)
for version in versions
for (branch_name, p_info) in self.get_package_info(version)
]
# Compute the ExtID of each of these packages
known_extids = self._get_known_extids([p_info for (_, p_info) in packages_info])
if last_snapshot is None:
last_snapshot_targets: Set[Sha1Git] = set()
else:
last_snapshot_targets = {
branch.target for branch in last_snapshot.branches.values()
}
new_extids: Set[ExtID] = set()
tmp_releases: Dict[str, List[Tuple[str, Sha1Git]]] = {
version: [] for version in versions
}
errors = []
for (branch_name, p_info) in packages_info:
logger.debug("package_info: %s", p_info)
# Check if the package was already loaded, using its ExtID
swhid = self.resolve_object_from_extids(
known_extids, p_info, last_snapshot_targets
)
if swhid is not None and swhid.object_type == ObjectType.REVISION:
# This package was already loaded, but by an older version
# of this loader, which produced revisions instead of releases.
# Let's fetch the revision's data, and "upgrade" it into a release.
(rev,) = self.storage.revision_get([swhid.object_id])
if not rev:
logger.error(
"Failed to upgrade branch %s from revision to "
"release, %s is missing from the storage. "
"Falling back to re-loading from the origin.",
branch_name,
swhid,
)
else:
rev = None
if swhid is None or (swhid.object_type == ObjectType.REVISION and not rev):
# No matching revision or release found in the last snapshot, load it.
release_id = None
try:
res = self._load_release(p_info, origin)
if res:
(release_id, directory_id) = res
assert release_id
assert directory_id
self._load_extrinsic_directory_metadata(
p_info, release_id, directory_id
)
self.storage.flush()
status_load = "eventful"
except Exception as e:
self.storage.clear_buffers()
load_exceptions.append(e)
sentry_sdk.capture_exception(e)
error = f"Failed to load branch {branch_name} for {self.url}"
logger.exception(error)
failed_branches.append(branch_name)
errors.append(f"{error}: {e}")
continue
if release_id is None:
continue
add_extid = True
elif swhid.object_type == ObjectType.REVISION:
# If 'rev' was None, the previous block would have run.
assert rev is not None
rel = rev2rel(rev, p_info.version)
self.storage.release_add([rel])
logger.debug("Upgraded %s to %s", swhid, rel.swhid())
release_id = rel.id
# Create a new extid for this package, so the next run of this loader
# will be able to find the new release, and use it (instead of the
# old revision)
add_extid = True
elif swhid.object_type == ObjectType.RELEASE:
# This package was already loaded, nothing to do.
release_id = swhid.object_id
add_extid = False
else:
assert False, f"Unexpected object type: {swhid}"
assert release_id is not None
if add_extid:
partial_extid = p_info.extid()
if partial_extid is not None:
(extid_type, extid_version, extid) = partial_extid
release_swhid = CoreSWHID(
object_type=ObjectType.RELEASE, object_id=release_id
)
new_extids.add(
ExtID(
extid_type=extid_type,
extid_version=extid_version,
extid=extid,
target=release_swhid,
)
)
tmp_releases[p_info.version].append((branch_name, release_id))
if load_exceptions:
status_visit = "partial"
if not tmp_releases:
# We could not load any releases; fail completely
return self.finalize_visit(
snapshot=snapshot,
visit=visit,
failed_branches=failed_branches,
status_visit="failed",
status_load="failed",
errors=errors,
)
try:
# Retrieve the default release version (the "latest" one)
default_version = self.get_default_version()
logger.debug("default version: %s", default_version)
# Retrieve extra branches
extra_branches = self.extra_branches()
logger.debug("extra branches: %s", extra_branches)
snapshot = self._load_snapshot(
default_version, tmp_releases, extra_branches
)
self.storage.flush()
except Exception as e:
error = f"Failed to build snapshot for origin {self.url}"
logger.exception(error)
errors.append(f"{error}: {e}")
sentry_sdk.capture_exception(e)
status_visit = "failed"
status_load = "failed"
if snapshot:
try:
metadata_objects = self.build_extrinsic_snapshot_metadata(snapshot.id)
self._load_metadata_objects(metadata_objects)
except Exception as e:
error = f"Failed to load extrinsic snapshot metadata for {self.url}"
logger.exception(error)
errors.append(f"{error}: {e}")
sentry_sdk.capture_exception(e)
status_visit = "partial"
status_load = "failed"
try:
metadata_objects = self.build_extrinsic_origin_metadata()
self._load_metadata_objects(metadata_objects)
except Exception as e:
error = f"Failed to load extrinsic origin metadata for {self.url}"
logger.exception(error)
errors.append(f"{error}: {e}")
sentry_sdk.capture_exception(e)
status_visit = "partial"
status_load = "failed"
if status_load != "failed":
self._load_extids(new_extids)
return self.finalize_visit(
snapshot=snapshot,
visit=visit,
failed_branches=failed_branches,
status_visit=status_visit,
status_load=status_load,
errors=errors,
)
def _load_directory(
self, dl_artifacts: List[Tuple[str, Mapping[str, Any]]], tmpdir: str
) -> Tuple[str, from_disk.Directory]:
uncompressed_path = self.uncompress(dl_artifacts, dest=tmpdir)
logger.debug("uncompressed_path: %s", uncompressed_path)
directory = from_disk.Directory.from_disk(
path=uncompressed_path.encode("utf-8"),
max_content_length=self.max_content_size,
)
contents, skipped_contents, directories = from_disk.iter_directory(directory)
logger.debug("Number of skipped contents: %s", len(skipped_contents))
self.storage.skipped_content_add(skipped_contents)
logger.debug("Number of contents: %s", len(contents))
self.storage.content_add(contents)
logger.debug("Number of directories: %s", len(directories))
self.storage.directory_add(directories)
return (uncompressed_path, directory)
def _load_release(
self, p_info: TPackageInfo, origin
) -> Optional[Tuple[Sha1Git, Sha1Git]]:
"""Does all the loading of a release itself:
* downloads a package and uncompresses it
* loads it from disk
* adds contents, directories, and release to self.storage
* returns (release_id, directory_id)
Raises:
an exception when unable to download or uncompress artifacts
"""
with tempfile.TemporaryDirectory() as tmpdir:
dl_artifacts = self.download_package(p_info, tmpdir)
(uncompressed_path, directory) = self._load_directory(dl_artifacts, tmpdir)
# FIXME: This should be release. cf. D409
release = self.build_release(
p_info, uncompressed_path, directory=directory.hash
)
if not release:
# Some artifacts are missing intrinsic metadata
# skipping those
return None
metadata = [metadata for (filepath, metadata) in dl_artifacts]
assert release.target is not None, release
assert release.target_type == ModelObjectType.DIRECTORY, release
metadata_target = ExtendedSWHID(
object_type=ExtendedObjectType.DIRECTORY, object_id=release.target
)
original_artifact_metadata = RawExtrinsicMetadata(
target=metadata_target,
discovery_date=self.visit_date,
authority=SWH_METADATA_AUTHORITY,
fetcher=self.get_metadata_fetcher(),
format="original-artifacts-json",
metadata=json.dumps(metadata).encode(),
origin=self.url,
release=release.swhid(),
)
self._load_metadata_objects([original_artifact_metadata])
logger.debug("Release: %s", release)
self.storage.release_add([release])
assert directory.hash
return (release.id, directory.hash)
def _load_snapshot(
self,
default_version: str,
releases: Dict[str, List[Tuple[str, bytes]]],
extra_branches: Dict[bytes, Mapping[str, Any]],
) -> Optional[Snapshot]:
"""Build snapshot out of the current releases stored and extra branches.
- Then load it in the storage.
+ Then load it in the storage.
"""
logger.debug("releases: %s", releases)
# Build and load the snapshot
branches = {} # type: Dict[bytes, Mapping[str, Any]]
for version, branch_name_releases in releases.items():
if version == default_version and len(branch_name_releases) == 1:
# only 1 branch (no ambiguity), we can create an alias
# branch 'HEAD'
branch_name, _ = branch_name_releases[0]
# except for some corner case (deposit)
if branch_name != "HEAD":
branches[b"HEAD"] = {
"target_type": "alias",
"target": branch_name.encode("utf-8"),
}
for branch_name, target in branch_name_releases:
branches[branch_name.encode("utf-8")] = {
"target_type": "release",
"target": target,
}
# Deal with extra-branches
for name, branch_target in extra_branches.items():
if name in branches:
logger.error("Extra branch '%s' has been ignored", name)
else:
branches[name] = branch_target
snapshot_data = {"branches": branches}
logger.debug("snapshot: %s", snapshot_data)
snapshot = Snapshot.from_dict(snapshot_data)
logger.debug("snapshot: %s", snapshot)
self.storage.snapshot_add([snapshot])
return snapshot
def get_loader_name(self) -> str:
"""Returns a fully qualified name of this loader."""
return f"{self.__class__.__module__}.{self.__class__.__name__}"
def get_loader_version(self) -> str:
"""Returns the version of the current loader."""
module_name = self.__class__.__module__ or ""
module_name_parts = module_name.split(".")
# Iterate rootward through the package hierarchy until we find a parent of this
# loader's module with a __version__ attribute.
for prefix_size in range(len(module_name_parts), 0, -1):
package_name = ".".join(module_name_parts[0:prefix_size])
module = sys.modules[package_name]
if hasattr(module, "__version__"):
return module.__version__ # type: ignore
# If this loader's class has no parent package with a __version__,
# it should implement it itself.
raise NotImplementedError(
f"Could not dynamically find the version of {self.get_loader_name()}."
)
def get_metadata_fetcher(self) -> MetadataFetcher:
"""Returns a MetadataFetcher instance representing this package loader;
which is used to for adding provenance information to extracted
extrinsic metadata, if any."""
return MetadataFetcher(
- name=self.get_loader_name(), version=self.get_loader_version(), metadata={},
+ name=self.get_loader_name(),
+ version=self.get_loader_version(),
+ metadata={},
)
def get_metadata_authority(self) -> MetadataAuthority:
"""For package loaders that get extrinsic metadata, returns the authority
the metadata are coming from.
"""
raise NotImplementedError("get_metadata_authority")
def get_extrinsic_origin_metadata(self) -> List[RawExtrinsicMetadataCore]:
"""Returns metadata items, used by build_extrinsic_origin_metadata."""
return []
def build_extrinsic_origin_metadata(self) -> List[RawExtrinsicMetadata]:
"""Builds a list of full RawExtrinsicMetadata objects, using
metadata returned by get_extrinsic_origin_metadata."""
metadata_items = self.get_extrinsic_origin_metadata()
if not metadata_items:
# If this package loader doesn't write metadata, no need to require
# an implementation for get_metadata_authority.
return []
authority = self.get_metadata_authority()
fetcher = self.get_metadata_fetcher()
metadata_objects = []
for item in metadata_items:
metadata_objects.append(
RawExtrinsicMetadata(
target=Origin(self.url).swhid(),
discovery_date=item.discovery_date or self.visit_date,
authority=authority,
fetcher=fetcher,
format=item.format,
metadata=item.metadata,
)
)
return metadata_objects
def get_extrinsic_snapshot_metadata(self) -> List[RawExtrinsicMetadataCore]:
"""Returns metadata items, used by build_extrinsic_snapshot_metadata."""
return []
def build_extrinsic_snapshot_metadata(
self, snapshot_id: Sha1Git
) -> List[RawExtrinsicMetadata]:
"""Builds a list of full RawExtrinsicMetadata objects, using
metadata returned by get_extrinsic_snapshot_metadata."""
metadata_items = self.get_extrinsic_snapshot_metadata()
if not metadata_items:
# If this package loader doesn't write metadata, no need to require
# an implementation for get_metadata_authority.
return []
authority = self.get_metadata_authority()
fetcher = self.get_metadata_fetcher()
metadata_objects = []
for item in metadata_items:
metadata_objects.append(
RawExtrinsicMetadata(
target=ExtendedSWHID(
object_type=ExtendedObjectType.SNAPSHOT, object_id=snapshot_id
),
discovery_date=item.discovery_date or self.visit_date,
authority=authority,
fetcher=fetcher,
format=item.format,
metadata=item.metadata,
origin=self.url,
)
)
return metadata_objects
def build_extrinsic_directory_metadata(
- self, p_info: TPackageInfo, release_id: Sha1Git, directory_id: Sha1Git,
+ self,
+ p_info: TPackageInfo,
+ release_id: Sha1Git,
+ directory_id: Sha1Git,
) -> List[RawExtrinsicMetadata]:
if not p_info.directory_extrinsic_metadata:
# If this package loader doesn't write metadata, no need to require
# an implementation for get_metadata_authority.
return []
authority = self.get_metadata_authority()
fetcher = self.get_metadata_fetcher()
metadata_objects = []
for item in p_info.directory_extrinsic_metadata:
metadata_objects.append(
RawExtrinsicMetadata(
target=ExtendedSWHID(
object_type=ExtendedObjectType.DIRECTORY, object_id=directory_id
),
discovery_date=item.discovery_date or self.visit_date,
authority=authority,
fetcher=fetcher,
format=item.format,
metadata=item.metadata,
origin=self.url,
release=CoreSWHID(
object_type=ObjectType.RELEASE, object_id=release_id
),
)
)
return metadata_objects
def _load_extrinsic_directory_metadata(
- self, p_info: TPackageInfo, release_id: Sha1Git, directory_id: Sha1Git,
+ self,
+ p_info: TPackageInfo,
+ release_id: Sha1Git,
+ directory_id: Sha1Git,
) -> None:
metadata_objects = self.build_extrinsic_directory_metadata(
p_info, release_id, directory_id
)
self._load_metadata_objects(metadata_objects)
def _load_metadata_objects(
self, metadata_objects: List[RawExtrinsicMetadata]
) -> None:
if not metadata_objects:
# If this package loader doesn't write metadata, no need to require
# an implementation for get_metadata_authority.
return
self._create_authorities(mo.authority for mo in metadata_objects)
self._create_fetchers(mo.fetcher for mo in metadata_objects)
self.storage.raw_extrinsic_metadata_add(metadata_objects)
def _create_authorities(self, authorities: Iterable[MetadataAuthority]) -> None:
deduplicated_authorities = {
(authority.type, authority.url): authority for authority in authorities
}
if authorities:
self.storage.metadata_authority_add(list(deduplicated_authorities.values()))
def _create_fetchers(self, fetchers: Iterable[MetadataFetcher]) -> None:
deduplicated_fetchers = {
(fetcher.name, fetcher.version): fetcher for fetcher in fetchers
}
if fetchers:
self.storage.metadata_fetcher_add(list(deduplicated_fetchers.values()))
def _load_extids(self, extids: Set[ExtID]) -> None:
if not extids:
return
try:
self.storage.extid_add(list(extids))
except Exception as e:
logger.exception("Failed to load new ExtIDs for %s", self.url)
sentry_sdk.capture_exception(e)
# No big deal, it just means the next visit will load the same versions
# again.
def rev2rel(rev: Revision, version: str) -> Release:
"""Converts a revision to a release."""
message = rev.message
if message and not message.endswith(b"\n"):
message += b"\n"
return Release(
name=version.encode(),
message=message,
target=rev.directory,
target_type=ModelObjectType.DIRECTORY,
synthetic=rev.synthetic,
author=rev.author,
date=rev.date,
)
diff --git a/swh/loader/package/maven/loader.py b/swh/loader/package/maven/loader.py
index f2403d0..a2003ef 100644
--- a/swh/loader/package/maven/loader.py
+++ b/swh/loader/package/maven/loader.py
@@ -1,198 +1,204 @@
# Copyright (C) 2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
from datetime import datetime, timezone
import json
import logging
from os import path
import string
from typing import Iterator, List, Optional, Sequence, Tuple
import attr
import iso8601
import requests
from typing_extensions import TypedDict
from swh.loader.package.loader import (
BasePackageInfo,
PackageLoader,
RawExtrinsicMetadataCore,
)
from swh.loader.package.utils import EMPTY_AUTHOR, release_name
from swh.model.model import (
MetadataAuthority,
MetadataAuthorityType,
ObjectType,
RawExtrinsicMetadata,
Release,
Sha1Git,
TimestampWithTimezone,
)
from swh.storage.interface import StorageInterface
logger = logging.getLogger(__name__)
class ArtifactDict(TypedDict):
"""Data about a Maven artifact, passed by the Maven Lister."""
time: str
"""the time of the last update of jar file on the server as an iso8601 date string
"""
url: str
"""the artifact url to retrieve filename"""
filename: Optional[str]
"""optionally, the file's name"""
gid: str
"""artifact's groupId"""
aid: str
"""artifact's artifactId"""
version: str
"""artifact's version"""
base_url: str
"""root URL of the Maven instance"""
@attr.s
class MavenPackageInfo(BasePackageInfo):
time = attr.ib(type=datetime)
"""Timestamp of the last update of jar file on the server."""
gid = attr.ib(type=str)
"""Group ID of the maven artifact"""
aid = attr.ib(type=str)
"""Artifact ID of the maven artifact"""
version = attr.ib(type=str)
"""Version of the maven artifact"""
base_url = attr.ib(type=str)
"""Root URL of the Maven instance"""
# default format for maven artifacts
MANIFEST_FORMAT = string.Template("$gid $aid $version $url $time")
EXTID_TYPE = "maven-jar"
EXTID_VERSION = 0
@classmethod
- def from_metadata(cls, a_metadata: ArtifactDict) -> "MavenPackageInfo":
- url = a_metadata["url"]
+ def from_metadata(cls, url: str, a_metadata: ArtifactDict) -> "MavenPackageInfo":
time = iso8601.parse_date(a_metadata["time"]).astimezone(tz=timezone.utc)
return cls(
url=url,
filename=a_metadata.get("filename") or path.split(url)[-1],
time=time,
gid=a_metadata["gid"],
aid=a_metadata["aid"],
version=a_metadata["version"],
base_url=a_metadata["base_url"],
directory_extrinsic_metadata=[
RawExtrinsicMetadataCore(
- format="maven-json", metadata=json.dumps(a_metadata).encode(),
+ format="maven-json",
+ metadata=json.dumps(a_metadata).encode(),
),
],
)
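# Illustrative call with the origin url now passed separately from the lister
# metadata (all values below are hypothetical):
#
#     MavenPackageInfo.from_metadata(
#         "https://repo.example.org/a/b/1.0/b-1.0-sources.jar",
#         {
#             "time": "2021-07-12T19:06:59Z",
#             "gid": "a",
#             "aid": "b",
#             "version": "1.0",
#             "filename": None,
#             "base_url": "https://repo.example.org/",
#         },
#     )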
class MavenLoader(PackageLoader[MavenPackageInfo]):
- """Load source code jar origin's artifact files into swh archive
-
- """
+ """Load source code jar origin's artifact files into swh archive"""
visit_type = "maven"
def __init__(
self,
storage: StorageInterface,
url: str,
artifacts: Sequence[ArtifactDict],
max_content_size: Optional[int] = None,
):
"""Loader constructor.
For now, the artifacts list is the lister's task output.
There is one, and only one, artifact (jar or zip) per version, as guaranteed by
the Maven coordinates system.
Args:
url: Origin url
artifacts: List of single artifact information
"""
super().__init__(storage=storage, url=url, max_content_size=max_content_size)
self.artifacts = artifacts # assume order is enforced in the lister
self.version_artifact = {
jar["version"]: jar for jar in artifacts if jar["version"]
}
if artifacts:
base_urls = {jar["base_url"] for jar in artifacts}
try:
(self.base_url,) = base_urls
except ValueError:
raise ValueError(
"Artifacts originate from more than one Maven instance: "
+ ", ".join(base_urls)
) from None
else:
# There is no artifact, so self.metadata_authority won't be called,
# so self.base_url won't be accessed.
pass
def get_versions(self) -> Sequence[str]:
return list(self.version_artifact)
def get_default_version(self) -> str:
# Default version is the last item
return self.artifacts[-1]["version"]
def get_metadata_authority(self):
return MetadataAuthority(type=MetadataAuthorityType.FORGE, url=self.base_url)
def build_extrinsic_directory_metadata(
- self, p_info: MavenPackageInfo, release_id: Sha1Git, directory_id: Sha1Git,
+ self,
+ p_info: MavenPackageInfo,
+ release_id: Sha1Git,
+ directory_id: Sha1Git,
) -> List[RawExtrinsicMetadata]:
# Rebuild POM URL.
pom_url = path.dirname(p_info.url)
pom_url = f"{pom_url}/{p_info.aid}-{p_info.version}.pom"
r = requests.get(pom_url, allow_redirects=True)
if r.status_code == 200:
metadata_pom = r.content
else:
metadata_pom = b""
p_info.directory_extrinsic_metadata.append(
- RawExtrinsicMetadataCore(format="maven-pom", metadata=metadata_pom,)
+ RawExtrinsicMetadataCore(
+ format="maven-pom",
+ metadata=metadata_pom,
+ )
)
return super().build_extrinsic_directory_metadata(
- p_info=p_info, release_id=release_id, directory_id=directory_id,
+ p_info=p_info,
+ release_id=release_id,
+ directory_id=directory_id,
)
def get_package_info(self, version: str) -> Iterator[Tuple[str, MavenPackageInfo]]:
a_metadata = self.version_artifact[version]
yield release_name(a_metadata["version"]), MavenPackageInfo.from_metadata(
- a_metadata
+ self.url, a_metadata
)
def build_release(
self, p_info: MavenPackageInfo, uncompressed_path: str, directory: Sha1Git
) -> Optional[Release]:
msg = f"Synthetic release for archive at {p_info.url}\n".encode("utf-8")
normalized_time = TimestampWithTimezone.from_datetime(p_info.time)
return Release(
name=p_info.version.encode(),
message=msg,
date=normalized_time,
author=EMPTY_AUTHOR,
target=directory,
target_type=ObjectType.DIRECTORY,
synthetic=True,
)
diff --git a/swh/loader/package/maven/tests/test_maven.py b/swh/loader/package/maven/tests/test_maven.py
index 5958ad3..96b6ad6 100644
--- a/swh/loader/package/maven/tests/test_maven.py
+++ b/swh/loader/package/maven/tests/test_maven.py
@@ -1,615 +1,618 @@
-# Copyright (C) 2019-2021 The Software Heritage developers
+# Copyright (C) 2019-2022 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import hashlib
import json
from pathlib import Path
import pytest
from swh.loader.package import __version__
from swh.loader.package.maven.loader import MavenLoader, MavenPackageInfo
from swh.loader.package.utils import EMPTY_AUTHOR
from swh.loader.tests import assert_last_visit_matches, check_snapshot, get_stats
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
RawExtrinsicMetadata,
Release,
Snapshot,
SnapshotBranch,
TargetType,
TimestampWithTimezone,
)
from swh.model.model import MetadataAuthority, MetadataAuthorityType, MetadataFetcher
from swh.model.model import ObjectType as ModelObjectType
from swh.model.swhids import CoreSWHID, ExtendedObjectType, ExtendedSWHID, ObjectType
from swh.storage.algos.snapshot import snapshot_get_all_branches
-URL = "https://repo1.maven.org/maven2/"
+REPO_BASE_URL = "https://repo1.maven.org/maven2/"
+
+MVN_ARTIFACT_URLS = [
+ f"{REPO_BASE_URL}al/aldi/sprova4j/0.1.0/sprova4j-0.1.0-sources.jar",
+ f"{REPO_BASE_URL}al/aldi/sprova4j/0.1.1/sprova4j-0.1.1-sources.jar",
+]
+
MVN_ARTIFACTS = [
{
"time": "2021-07-12 19:06:59.335000",
- "url": "https://repo1.maven.org/maven2/al/aldi/sprova4j/0.1.0/"
- + "sprova4j-0.1.0-sources.jar",
"gid": "al.aldi",
"aid": "sprova4j",
"filename": "sprova4j-0.1.0-sources.jar",
"version": "0.1.0",
- "base_url": "https://repo1.maven.org/maven2/",
+ "base_url": REPO_BASE_URL,
},
{
"time": "2021-07-12 19:37:05.534000",
- "url": "https://repo1.maven.org/maven2/al/aldi/sprova4j/0.1.1/"
- + "sprova4j-0.1.1-sources.jar",
"gid": "al.aldi",
"aid": "sprova4j",
"filename": "sprova4j-0.1.1-sources.jar",
"version": "0.1.1",
- "base_url": "https://repo1.maven.org/maven2/",
+ "base_url": REPO_BASE_URL,
},
]
MVN_ARTIFACTS_POM = [
- "https://repo1.maven.org/maven2/al/aldi/sprova4j/0.1.0/sprova4j-0.1.0.pom",
- "https://repo1.maven.org/maven2/al/aldi/sprova4j/0.1.1/sprova4j-0.1.1.pom",
+ f"{REPO_BASE_URL}al/aldi/sprova4j/0.1.0/sprova4j-0.1.0.pom",
+ f"{REPO_BASE_URL}al/aldi/sprova4j/0.1.1/sprova4j-0.1.1.pom",
]
_expected_new_contents_first_visit = [
"cd807364cd7730022b3849f90ccf4bababbada84",
"79e33dd52ebdf615e6696ae69add91cb990d81e2",
"8002bd514156f05a0940ae14ef86eb0179cbd510",
"23479553a6ccec30d377dee0496123a65d23fd8c",
"07ffbebb933bc1660e448f07d8196c2b083797f9",
"abf021b581f80035b56153c9aa27195b8d7ebbb8",
"eec70ba80a6862ed2619727663b17eb0d9dfe131",
"81a493dacb44dedf623f29ecf62c0e035bf698de",
"bda85ed0bbecf8cddfea04234bee16f476f64fe4",
"1ec91d561f5bdf59acb417086e04c54ead94e94e",
"d517b423da707fa21378623f35facebff53cb59d",
"3f0f21a764972d79e583908991c893c999613354",
"a2dd4d7dfe6043baf9619081e4e29966989211af",
"f62685cf0c6825a4097c949280b584cf0e16d047",
"56afc1ea60cef6548ce0a34f44e91b0e4b063835",
"cf7c740926e7ebc9ac8978a5c4f0e1e7a0e9e3af",
"86ff828bea1c22ca3d50ed82569b9c59ce2c41a1",
"1d0fa04454d9fec31d8ee3f35b58158ca1e28b15",
"e90239a2c8d9ede61a29671a8b397a743e18fa34",
"ce8851005d084aea089bcd8cf01052f4b234a823",
"2c34ce622aa7fa68d104900840f66671718e6249",
"e6a6fec32dcb3bee93c34fc11b0174a6b0b0ec6d",
"405d3e1be4b658bf26de37f2c90c597b2796b9d7",
"d0d2f5848721e04300e537826ef7d2d6d9441df0",
"399c67e33e38c475fd724d283dd340f6a2e8dc91",
"dea10c1111cc61ac1809fb7e88857e3db054959f",
]
_expected_json_metadata = {
"time": "2021-07-12 19:06:59.335000",
- "url": (
- "https://repo1.maven.org/maven2/al/aldi/sprova4j/0.1.0/"
- "sprova4j-0.1.0-sources.jar"
- ),
"gid": "al.aldi",
"aid": "sprova4j",
"filename": "sprova4j-0.1.0-sources.jar",
"version": "0.1.0",
- "base_url": "https://repo1.maven.org/maven2/",
+ "base_url": REPO_BASE_URL,
}
_expected_pom_metadata = (
"""<?xml version="1.0" encoding="UTF-8"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 """
'http://maven.apache.org/xsd/maven-4.0.0.xsd" '
'xmlns="http://maven.apache.org/POM/4.0.0" '
"""xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<modelVersion>4.0.0</modelVersion>
<groupId>al.aldi</groupId>
<artifactId>sprova4j</artifactId>
<version>0.1.0</version>
<name>sprova4j</name>
<description>Java client for Sprova Test Management</description>
<url>https://github.com/aldialimucaj/sprova4j</url>
<inceptionYear>2018</inceptionYear>
<licenses>
<license>
<name>The Apache Software License, Version 2.0</name>
<url>http://www.apache.org/licenses/LICENSE-2.0.txt</url>
<distribution>repo</distribution>
</license>
</licenses>
<developers>
<developer>
<id>aldi</id>
<name>Aldi Alimucaj</name>
<email>aldi.alimucaj@gmail.com</email>
</developer>
</developers>
<scm>
<connection>scm:git:git://github.com/aldialimucaj/sprova4j.git</connection>
<developerConnection>scm:git:git://github.com/aldialimucaj/sprova4j.git</developerConnection>
<url>https://github.com/aldialimucaj/sprova4j</url>
</scm>
<dependencies>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.2.3</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.3</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.squareup.okhttp3</groupId>
<artifactId>okhttp</artifactId>
<version>3.10.0</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.squareup.okio</groupId>
<artifactId>okio</artifactId>
<version>1.0.0</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.glassfish</groupId>
<artifactId>javax.json</artifactId>
<version>1.1.2</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>javax.json</groupId>
<artifactId>javax.json-api</artifactId>
<version>1.1.2</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>2.0.1.Final</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.squareup.okhttp3</groupId>
<artifactId>mockwebserver</artifactId>
<version>3.10.0</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
"""
)
_expected_new_directories_first_visit = [
"6c9de41e4cebb91a8368da1d89ae9873bd540ec3",
"c1a2ee97fc47426d0179f94d223405336b5cd075",
"9e1bdca292765a9528af18743bd793b80362c768",
"193a7af634592ef27fb341762806f61e8fb8eab3",
"a297aa21e3dbf138b370be3aae7a852dd403bbbb",
"da84026119ae04022f007d5b3362e98d46d09045",
"75bb915942a9c441ca62aeffc3b634f1ec9ce5e2",
"0851d359283b2ad82b116c8d1b55ab14b1ec219c",
"2bcbb8b723a025ee9a36b719cea229ed38c37e46",
]
_expected_new_release_first_visit = "02e83c29ec094db581f939d2e238d0613a4f59ac"
REL_MSG = (
b"Synthetic release for archive at https://repo1.maven.org/maven2/al/aldi/"
b"sprova4j/0.1.0/sprova4j-0.1.0-sources.jar\n"
)
REVISION_DATE = TimestampWithTimezone.from_datetime(
datetime.datetime(2021, 7, 12, 19, 6, 59, 335000, tzinfo=datetime.timezone.utc)
)
@pytest.fixture
def data_jar_1(datadir):
content = Path(
datadir, "https_maven.org", "sprova4j-0.1.0-sources.jar"
).read_bytes()
return content
@pytest.fixture
def data_pom_1(datadir):
content = Path(datadir, "https_maven.org", "sprova4j-0.1.0.pom").read_bytes()
return content
@pytest.fixture
def data_jar_2(datadir):
content = Path(
datadir, "https_maven.org", "sprova4j-0.1.1-sources.jar"
).read_bytes()
return content
@pytest.fixture
def data_pom_2(datadir):
content = Path(datadir, "https_maven.org", "sprova4j-0.1.1.pom").read_bytes()
return content
def test_jar_visit_with_no_artifact_found(swh_storage, requests_mock_datadir):
unknown_artifact_url = "https://ftp.g.o/unknown/8sync-0.1.0.tar.gz"
loader = MavenLoader(
swh_storage,
unknown_artifact_url,
artifacts=[
{
"time": "2021-07-18 08:05:05.187000",
"url": unknown_artifact_url, # unknown artifact
"filename": "8sync-0.1.0.tar.gz",
"gid": "al/aldi",
"aid": "sprova4j",
"version": "0.1.0",
"base_url": "https://repo1.maven.org/maven2/",
}
],
)
actual_load_status = loader.load()
assert actual_load_status["status"] == "uneventful"
assert actual_load_status["snapshot_id"] is not None
expected_snapshot_id = "1a8893e6a86f444e8be8e7bda6cb34fb1735a00e"
assert actual_load_status["snapshot_id"] == expected_snapshot_id
stats = get_stats(swh_storage)
assert_last_visit_matches(
swh_storage, unknown_artifact_url, status="partial", type="maven"
)
assert {
"content": 0,
"directory": 0,
"origin": 1,
"origin_visit": 1,
"release": 0,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_jar_visit_inconsistent_base_url(
swh_storage, requests_mock, data_jar_1, data_pom_1
):
- """With no prior visit, loading a jar ends up with 1 snapshot
-
- """
+ """With no prior visit, loading a jar ends up with 1 snapshot"""
with pytest.raises(ValueError, match="more than one Maven instance"):
MavenLoader(
swh_storage,
- MVN_ARTIFACTS[0]["url"],
+ MVN_ARTIFACT_URLS[0],
artifacts=[
MVN_ARTIFACTS[0],
{**MVN_ARTIFACTS[1], "base_url": "http://maven.example/"},
],
)
def test_jar_visit_with_release_artifact_no_prior_visit(
swh_storage, requests_mock, data_jar_1, data_pom_1
):
- """With no prior visit, loading a jar ends up with 1 snapshot
-
- """
- requests_mock.get(MVN_ARTIFACTS[0]["url"], content=data_jar_1)
+ """With no prior visit, loading a jar ends up with 1 snapshot"""
+ requests_mock.get(MVN_ARTIFACT_URLS[0], content=data_jar_1)
requests_mock.get(MVN_ARTIFACTS_POM[0], content=data_pom_1)
loader = MavenLoader(
- swh_storage, MVN_ARTIFACTS[0]["url"], artifacts=[MVN_ARTIFACTS[0]]
+ swh_storage, MVN_ARTIFACT_URLS[0], artifacts=[MVN_ARTIFACTS[0]]
)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
expected_snapshot_first_visit_id = hash_to_bytes(
"c5195b8ebd148649bf094561877964b131ab27e0"
)
expected_snapshot = Snapshot(
id=expected_snapshot_first_visit_id,
branches={
b"HEAD": SnapshotBranch(
- target_type=TargetType.ALIAS, target=b"releases/0.1.0",
+ target_type=TargetType.ALIAS,
+ target=b"releases/0.1.0",
),
b"releases/0.1.0": SnapshotBranch(
target_type=TargetType.RELEASE,
target=hash_to_bytes(_expected_new_release_first_visit),
),
},
)
actual_snapshot = snapshot_get_all_branches(
swh_storage, hash_to_bytes(actual_load_status["snapshot_id"])
)
assert actual_snapshot == expected_snapshot
check_snapshot(expected_snapshot, swh_storage)
assert (
hash_to_bytes(actual_load_status["snapshot_id"])
== expected_snapshot_first_visit_id
)
stats = get_stats(swh_storage)
assert_last_visit_matches(
- swh_storage, MVN_ARTIFACTS[0]["url"], status="full", type="maven"
+ swh_storage, MVN_ARTIFACT_URLS[0], status="full", type="maven"
)
expected_contents = map(hash_to_bytes, _expected_new_contents_first_visit)
assert list(swh_storage.content_missing_per_sha1(expected_contents)) == []
expected_dirs = map(hash_to_bytes, _expected_new_directories_first_visit)
assert list(swh_storage.directory_missing(expected_dirs)) == []
expected_rels = map(hash_to_bytes, {_expected_new_release_first_visit})
assert list(swh_storage.release_missing(expected_rels)) == []
rel_id = actual_snapshot.branches[b"releases/0.1.0"].target
(rel,) = swh_storage.release_get([rel_id])
assert rel == Release(
id=hash_to_bytes(_expected_new_release_first_visit),
name=b"0.1.0",
message=REL_MSG,
author=EMPTY_AUTHOR,
date=REVISION_DATE,
target_type=ModelObjectType.DIRECTORY,
target=hash_to_bytes("6c9de41e4cebb91a8368da1d89ae9873bd540ec3"),
synthetic=True,
metadata=None,
)
assert {
"content": len(_expected_new_contents_first_visit),
"directory": len(_expected_new_directories_first_visit),
"origin": 1,
"origin_visit": 1,
"release": 1,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_jar_2_visits_without_change(
swh_storage, requests_mock_datadir, requests_mock, data_jar_2, data_pom_2
):
- """With no prior visit, load a gnu project ends up with 1 snapshot
-
- """
- requests_mock.get(MVN_ARTIFACTS[1]["url"], content=data_jar_2)
+ """With no prior visit, load a gnu project ends up with 1 snapshot"""
+ requests_mock.get(MVN_ARTIFACT_URLS[1], content=data_jar_2)
requests_mock.get(MVN_ARTIFACTS_POM[1], content=data_pom_2)
loader = MavenLoader(
- swh_storage, MVN_ARTIFACTS[1]["url"], artifacts=[MVN_ARTIFACTS[1]]
+ swh_storage, MVN_ARTIFACT_URLS[1], artifacts=[MVN_ARTIFACTS[1]]
)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
expected_snapshot_first_visit_id = hash_to_bytes(
"91dcacee7a6d2b54f9cab14bc14cb86d22d2ac2b"
)
assert (
hash_to_bytes(actual_load_status["snapshot_id"])
== expected_snapshot_first_visit_id
)
assert_last_visit_matches(
- swh_storage, MVN_ARTIFACTS[1]["url"], status="full", type="maven"
+ swh_storage, MVN_ARTIFACT_URLS[1], status="full", type="maven"
)
actual_load_status2 = loader.load()
assert actual_load_status2["status"] == "uneventful"
assert actual_load_status2["snapshot_id"] is not None
assert actual_load_status["snapshot_id"] == actual_load_status2["snapshot_id"]
assert_last_visit_matches(
- swh_storage, MVN_ARTIFACTS[1]["url"], status="full", type="maven"
+ swh_storage, MVN_ARTIFACT_URLS[1], status="full", type="maven"
)
# Make sure we have only one entry in history for the pom fetch, one for
# the actual download of jar, and that they're correct.
urls_history = [str(req.url) for req in list(requests_mock_datadir.request_history)]
assert urls_history == [
- MVN_ARTIFACTS[1]["url"],
+ MVN_ARTIFACT_URLS[1],
MVN_ARTIFACTS_POM[1],
]
-def test_metadatata(swh_storage, requests_mock, data_jar_1, data_pom_1):
+def test_metadata(swh_storage, requests_mock, data_jar_1, data_pom_1):
"""With no prior visit, loading a jar ends up with 1 snapshot.
Extrinsic metadata is the pom file associated to the source jar.
"""
- requests_mock.get(MVN_ARTIFACTS[0]["url"], content=data_jar_1)
+ requests_mock.get(MVN_ARTIFACT_URLS[0], content=data_jar_1)
requests_mock.get(MVN_ARTIFACTS_POM[0], content=data_pom_1)
loader = MavenLoader(
- swh_storage, MVN_ARTIFACTS[0]["url"], artifacts=[MVN_ARTIFACTS[0]]
+ swh_storage, MVN_ARTIFACT_URLS[0], artifacts=[MVN_ARTIFACTS[0]]
)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
expected_release_id = hash_to_bytes(_expected_new_release_first_visit)
release = swh_storage.release_get([expected_release_id])[0]
assert release is not None
release_swhid = CoreSWHID(
object_type=ObjectType.RELEASE, object_id=expected_release_id
)
directory_swhid = ExtendedSWHID(
object_type=ExtendedObjectType.DIRECTORY, object_id=release.target
)
metadata_authority = MetadataAuthority(
- type=MetadataAuthorityType.FORGE, url="https://repo1.maven.org/maven2/",
+ type=MetadataAuthorityType.FORGE,
+ url=REPO_BASE_URL,
)
expected_metadata = [
RawExtrinsicMetadata(
target=directory_swhid,
authority=metadata_authority,
fetcher=MetadataFetcher(
- name="swh.loader.package.maven.loader.MavenLoader", version=__version__,
+ name="swh.loader.package.maven.loader.MavenLoader",
+ version=__version__,
),
discovery_date=loader.visit_date,
format="maven-pom",
metadata=_expected_pom_metadata.encode(),
- origin=MVN_ARTIFACTS[0]["url"],
+ origin=MVN_ARTIFACT_URLS[0],
release=release_swhid,
),
RawExtrinsicMetadata(
target=directory_swhid,
authority=metadata_authority,
fetcher=MetadataFetcher(
- name="swh.loader.package.maven.loader.MavenLoader", version=__version__,
+ name="swh.loader.package.maven.loader.MavenLoader",
+ version=__version__,
),
discovery_date=loader.visit_date,
format="maven-json",
metadata=json.dumps(_expected_json_metadata).encode(),
- origin=MVN_ARTIFACTS[0]["url"],
+ origin=MVN_ARTIFACT_URLS[0],
release=release_swhid,
),
]
res = swh_storage.raw_extrinsic_metadata_get(directory_swhid, metadata_authority)
assert res.next_page_token is None
assert set(res.results) == set(expected_metadata)
-def test_metadatata_no_pom(swh_storage, requests_mock, data_jar_1):
+def test_metadata_no_pom(swh_storage, requests_mock, data_jar_1):
"""With no prior visit, loading a jar ends up with 1 snapshot.
Extrinsic metadata is None if the pom file cannot be retrieved.
"""
- requests_mock.get(MVN_ARTIFACTS[0]["url"], content=data_jar_1)
+ artifact_url = MVN_ARTIFACT_URLS[0]
+ requests_mock.get(artifact_url, content=data_jar_1)
requests_mock.get(MVN_ARTIFACTS_POM[0], status_code=404)
- loader = MavenLoader(
- swh_storage, MVN_ARTIFACTS[0]["url"], artifacts=[MVN_ARTIFACTS[0]]
- )
+ loader = MavenLoader(swh_storage, artifact_url, artifacts=[MVN_ARTIFACTS[0]])
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
expected_release_id = hash_to_bytes(_expected_new_release_first_visit)
release = swh_storage.release_get([expected_release_id])[0]
assert release is not None
release_swhid = CoreSWHID(
object_type=ObjectType.RELEASE, object_id=expected_release_id
)
directory_swhid = ExtendedSWHID(
object_type=ExtendedObjectType.DIRECTORY, object_id=release.target
)
metadata_authority = MetadataAuthority(
- type=MetadataAuthorityType.FORGE, url="https://repo1.maven.org/maven2/",
+ type=MetadataAuthorityType.FORGE,
+ url=REPO_BASE_URL,
)
expected_metadata = [
RawExtrinsicMetadata(
target=directory_swhid,
authority=metadata_authority,
fetcher=MetadataFetcher(
- name="swh.loader.package.maven.loader.MavenLoader", version=__version__,
+ name="swh.loader.package.maven.loader.MavenLoader",
+ version=__version__,
),
discovery_date=loader.visit_date,
format="maven-pom",
metadata=b"",
- origin=MVN_ARTIFACTS[0]["url"],
+ origin=artifact_url,
release=release_swhid,
),
RawExtrinsicMetadata(
target=directory_swhid,
authority=metadata_authority,
fetcher=MetadataFetcher(
- name="swh.loader.package.maven.loader.MavenLoader", version=__version__,
+ name="swh.loader.package.maven.loader.MavenLoader",
+ version=__version__,
),
discovery_date=loader.visit_date,
format="maven-json",
metadata=json.dumps(_expected_json_metadata).encode(),
- origin=MVN_ARTIFACTS[0]["url"],
+ origin=artifact_url,
release=release_swhid,
),
]
res = swh_storage.raw_extrinsic_metadata_get(directory_swhid, metadata_authority)
assert res.next_page_token is None
assert set(res.results) == set(expected_metadata)
def test_jar_extid():
- """Compute primary key should return the right identity
-
- """
+ """Compute primary key should return the right identity"""
metadata = MVN_ARTIFACTS[0]
+ url = MVN_ARTIFACT_URLS[0]
+ p_info = MavenPackageInfo(url=url, **metadata)
- p_info = MavenPackageInfo(**metadata)
-
- expected_manifest = "{gid} {aid} {version} {url} {time}".format(**metadata).encode()
+ expected_manifest = "{gid} {aid} {version} {url} {time}".format(
+ url=url, **metadata
+ ).encode()
actual_id = p_info.extid()
- assert actual_id == ("maven-jar", 0, hashlib.sha256(expected_manifest).digest(),)
+ assert actual_id == (
+ "maven-jar",
+ 0,
+ hashlib.sha256(expected_manifest).digest(),
+ )
def test_jar_snapshot_append(
swh_storage,
requests_mock_datadir,
requests_mock,
data_jar_1,
data_pom_1,
data_jar_2,
data_pom_2,
):
# first loading with a first artifact
artifact1 = MVN_ARTIFACTS[0]
- url1 = artifact1["url"]
+ url1 = MVN_ARTIFACT_URLS[0]
requests_mock.get(url1, content=data_jar_1)
requests_mock.get(MVN_ARTIFACTS_POM[0], content=data_pom_1)
loader = MavenLoader(swh_storage, url1, [artifact1])
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(swh_storage, url1, status="full", type="maven")
# check expected snapshot
snapshot = loader.last_snapshot()
assert len(snapshot.branches) == 2
branch_artifact1_name = f"releases/{artifact1['version']}".encode()
assert b"HEAD" in snapshot.branches
assert branch_artifact1_name in snapshot.branches
assert snapshot.branches[b"HEAD"].target == branch_artifact1_name
# second loading with a second artifact
artifact2 = MVN_ARTIFACTS[1]
- url2 = artifact2["url"]
+ url2 = MVN_ARTIFACT_URLS[1]
requests_mock.get(url2, content=data_jar_2)
requests_mock.get(MVN_ARTIFACTS_POM[1], content=data_pom_2)
loader = MavenLoader(swh_storage, url2, [artifact2])
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
assert_last_visit_matches(swh_storage, url2, status="full", type="maven")
# check expected snapshot, should contain a new branch and the
# branch for the first artifact
snapshot = loader.last_snapshot()
assert len(snapshot.branches) == 2
branch_artifact2_name = f"releases/{artifact2['version']}".encode()
assert b"HEAD" in snapshot.branches
assert branch_artifact2_name in snapshot.branches
assert branch_artifact1_name not in snapshot.branches
assert snapshot.branches[b"HEAD"].target == branch_artifact2_name
diff --git a/swh/loader/package/maven/tests/test_tasks.py b/swh/loader/package/maven/tests/test_tasks.py
index 2335af6..479dce0 100644
--- a/swh/loader/package/maven/tests/test_tasks.py
+++ b/swh/loader/package/maven/tests/test_tasks.py
@@ -1,51 +1,54 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
MVN_ARTIFACTS = [
{
"time": 1626109619335,
"url": "https://repo1.maven.org/maven2/al/aldi/sprova4j/0.1.0/"
+ "sprova4j-0.1.0.jar",
"gid": "al.aldi",
"aid": "sprova4j",
"filename": "sprova4j-0.1.0.jar",
"version": "0.1.0",
"base_url": "https://repo1.maven.org/maven2/",
},
]
def test_tasks_maven_loader(
mocker, swh_scheduler_celery_app, swh_scheduler_celery_worker, swh_config
):
mock_load = mocker.patch("swh.loader.package.maven.loader.MavenLoader.load")
mock_load.return_value = {"status": "eventful"}
res = swh_scheduler_celery_app.send_task(
"swh.loader.package.maven.tasks.LoadMaven",
- kwargs=dict(url=MVN_ARTIFACTS[0]["url"], artifacts=MVN_ARTIFACTS,),
+ kwargs=dict(
+ url=MVN_ARTIFACTS[0]["url"],
+ artifacts=MVN_ARTIFACTS,
+ ),
)
assert res
res.wait()
assert res.successful()
assert mock_load.called
assert res.result == {"status": "eventful"}
def test_tasks_maven_loader_snapshot_append(
mocker, swh_scheduler_celery_app, swh_scheduler_celery_worker, swh_config
):
mock_load = mocker.patch("swh.loader.package.maven.loader.MavenLoader.load")
mock_load.return_value = {"status": "eventful"}
res = swh_scheduler_celery_app.send_task(
"swh.loader.package.maven.tasks.LoadMaven",
kwargs=dict(url=MVN_ARTIFACTS[0]["url"], artifacts=[]),
)
assert res
res.wait()
assert res.successful()
assert mock_load.called
assert res.result == {"status": "eventful"}
diff --git a/swh/loader/package/nixguix/loader.py b/swh/loader/package/nixguix/loader.py
index 2497010..f2cc1d5 100644
--- a/swh/loader/package/nixguix/loader.py
+++ b/swh/loader/package/nixguix/loader.py
@@ -1,303 +1,308 @@
# Copyright (C) 2020-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import copy
import json
import logging
import re
from typing import Any, Dict, Iterator, List, Mapping, Optional, Set, Tuple
import attr
from swh.loader.package.loader import (
BasePackageInfo,
PackageLoader,
PartialExtID,
RawExtrinsicMetadataCore,
)
from swh.loader.package.utils import EMPTY_AUTHOR, api_info, cached_method
from swh.model import hashutil
from swh.model.model import (
MetadataAuthority,
MetadataAuthorityType,
ObjectType,
Release,
Sha1Git,
)
from swh.model.swhids import CoreSWHID
from swh.storage.interface import StorageInterface
logger = logging.getLogger(__name__)
EXTID_TYPE = "subresource-integrity"
"""The ExtID is an ASCII string, as defined by
https://w3c.github.io/webappsec-subresource-integrity/"""
EXTID_VERSION = 0
@attr.s
class NixGuixPackageInfo(BasePackageInfo):
raw_info = attr.ib(type=Dict[str, Any])
integrity = attr.ib(type=str)
"""Hash of the archive, formatted as in the Subresource Integrity
specification."""
@classmethod
def from_metadata(
cls, metadata: Dict[str, Any], version: str
) -> "NixGuixPackageInfo":
return cls(
url=metadata["url"],
filename=None,
version=version,
integrity=metadata["integrity"],
raw_info=metadata,
)
def extid(self) -> PartialExtID:
return (EXTID_TYPE, EXTID_VERSION, self.integrity.encode("ascii"))
class NixGuixLoader(PackageLoader[NixGuixPackageInfo]):
"""Load sources from a sources.json file. This loader is used to load
sources used by functional package managers (e.g. Nix and Guix).
"""
visit_type = "nixguix"
def __init__(
self,
storage: StorageInterface,
url: str,
unsupported_file_extensions: List[str] = [],
max_content_size: Optional[int] = None,
):
super().__init__(storage=storage, url=url, max_content_size=max_content_size)
self.provider_url = url
self.unsupported_file_extensions = unsupported_file_extensions
# Note: this could be renamed get_artifacts in the PackageLoader
# base class.
@cached_method
def raw_sources(self):
return retrieve_sources(self.url)
@cached_method
def supported_sources(self):
raw_sources = self.raw_sources()
return clean_sources(
parse_sources(raw_sources), self.unsupported_file_extensions
)
@cached_method
def integrity_by_url(self) -> Dict[str, str]:
sources = self.supported_sources()
return {s["urls"][0]: s["integrity"] for s in sources["sources"]}
def get_versions(self) -> List[str]:
"""The first mirror of the mirror list is used as branch name in the
snapshot.
"""
return list(self.integrity_by_url().keys())
def get_metadata_authority(self):
return MetadataAuthority(
- type=MetadataAuthorityType.FORGE, url=self.url, metadata={},
+ type=MetadataAuthorityType.FORGE,
+ url=self.url,
+ metadata={},
)
def get_extrinsic_snapshot_metadata(self):
return [
RawExtrinsicMetadataCore(
- format="nixguix-sources-json", metadata=self.raw_sources(),
+ format="nixguix-sources-json",
+ metadata=self.raw_sources(),
),
]
# Note: this could be renamed get_artifact_info in the PackageLoader
# base class.
def get_package_info(self, url) -> Iterator[Tuple[str, NixGuixPackageInfo]]:
# TODO: try all mirrors and not only the first one. A source
# can be fetched from several urls, called mirrors. We
# currently only use the first one, but if the first one
# fails, we should try the second one and so on.
integrity = self.integrity_by_url()[url]
p_info = NixGuixPackageInfo.from_metadata(
{"url": url, "integrity": integrity}, version=url
)
yield url, p_info
def select_extid_target(
self, p_info: NixGuixPackageInfo, extid_targets: Set[CoreSWHID]
) -> Optional[CoreSWHID]:
if extid_targets:
# The archive URL is part of the release name. As that URL is not
# intrinsic metadata, different releases may be created for the same
# SRI and thus share the same extid.
# Therefore, we need to pick the one with the right URL.
releases = self.storage.release_get(
[target.object_id for target in extid_targets]
)
extid_targets = {
release.swhid()
for release in releases
if release is not None and release.name == p_info.version.encode()
}
return super().select_extid_target(p_info, extid_targets)
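# For illustration only: the same SRI can legitimately map to several releases
# when one tarball is mirrored under different URLs, e.g. (hypothetical names)
#   Release(name=b"https://mirror-a.example/foo.tgz", ...)
#   Release(name=b"https://mirror-b.example/foo.tgz", ...)
# Since the release name is the archive URL (p_info.version here), the filter
# above keeps only the candidate whose name matches the URL being loaded.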
def extra_branches(self) -> Dict[bytes, Mapping[str, Any]]:
"""We add a branch to the snapshot called 'evaluation' pointing to the
revision used to generate the sources.json file. This revision
is specified in the sources.json file itself. For the nixpkgs
origin, this revision is coming from the
github.com/nixos/nixpkgs repository.
Note this repository is not loaded explicitly. So, this
pointer can target a nonexistent revision for a time. However,
the github and gnu loaders are supposed to load this revision
and should create the revision pointed to by this branch.
This branch can be used to identify the snapshot associated to
a Nix/Guix evaluation.
"""
# The revision used to create the sources.json file. For Nix,
# this revision belongs to the github.com/nixos/nixpkgs
# repository
revision = self.supported_sources()["revision"]
return {
b"evaluation": {
"target_type": "revision",
"target": hashutil.hash_to_bytes(revision),
}
}
def build_release(
self, p_info: NixGuixPackageInfo, uncompressed_path: str, directory: Sha1Git
) -> Optional[Release]:
return Release(
name=p_info.version.encode(),
message=None,
author=EMPTY_AUTHOR,
date=None,
target=directory,
target_type=ObjectType.DIRECTORY,
synthetic=True,
)
def retrieve_sources(url: str) -> bytes:
"""Retrieve sources. Potentially raise NotFound error."""
return api_info(url, allow_redirects=True)
def parse_sources(raw_sources: bytes) -> Dict[str, Any]:
return json.loads(raw_sources.decode("utf-8"))
-def make_pattern_unsupported_file_extension(unsupported_file_extensions: List[str],):
+def make_pattern_unsupported_file_extension(
+ unsupported_file_extensions: List[str],
+):
"""Make a regexp pattern for unsupported file extension out of a list
of unsupported archive extension list.
"""
return re.compile(
rf".*\.({'|'.join(map(re.escape, unsupported_file_extensions))})$", re.DOTALL
)
def clean_sources(
sources: Dict[str, Any], unsupported_file_extensions=[]
) -> Dict[str, Any]:
"""Validate and clean the sources structure. First, ensure all top level keys are
present. Then, walk the sources list and remove sources that do not contain required
keys.
Filter out source entries whose:
- required keys are missing
- source type is not supported
- urls attribute type is not a list
- extension is known not to be supported by the loader
Raises:
ValueError if:
- a required top level key is missing
- top-level version is not 1
Returns:
source Dict cleaned up
"""
pattern_unsupported_file = make_pattern_unsupported_file_extension(
unsupported_file_extensions
)
# Required top level keys
required_keys = ["version", "revision", "sources"]
missing_keys = []
for required_key in required_keys:
if required_key not in sources:
missing_keys.append(required_key)
if missing_keys != []:
raise ValueError(
f"sources structure invalid, missing: {','.join(missing_keys)}"
)
# Only the version 1 is currently supported
version = int(sources["version"])
if version != 1:
raise ValueError(
f"The sources structure version '{sources['version']}' is not supported"
)
# If a source doesn't contain required attributes, this source is
# skipped but others could still be archived.
verified_sources = []
for source in sources["sources"]:
valid = True
required_keys = ["urls", "integrity", "type"]
for required_key in required_keys:
if required_key not in source:
logger.info(
f"Skip source '{source}' because key '{required_key}' is missing",
)
valid = False
if valid and source["type"] != "url":
logger.info(
f"Skip source '{source}' because the type {source['type']} "
"is not supported",
)
valid = False
if valid and not isinstance(source["urls"], list):
logger.info(
f"Skip source {source} because the urls attribute is not a list"
)
valid = False
if valid and len(source["urls"]) > 0: # Filter out unsupported archives
supported_sources: List[str] = []
for source_url in source["urls"]:
if pattern_unsupported_file.match(source_url):
logger.info(f"Skip unsupported artifact url {source_url}")
continue
supported_sources.append(source_url)
if len(supported_sources) == 0:
logger.info(
f"Skip source {source} because urls only reference "
"unsupported artifacts. Unsupported "
f"artifacts so far: {pattern_unsupported_file}"
)
continue
new_source = copy.deepcopy(source)
new_source["urls"] = supported_sources
verified_sources.append(new_source)
sources["sources"] = verified_sources
return sources
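As a quick, hedged illustration of the filtering documented in the clean_sources
docstring above (assuming swh-loader-core is installed; the source entries and
integrity values are made up):

from swh.loader.package.nixguix.loader import clean_sources

sources = {
    "version": 1,
    "revision": "0" * 40,  # made-up nixpkgs revision id
    "sources": [
        # kept: type "url", urls is a list, integrity present
        {"type": "url", "urls": ["https://example.org/a.tar.gz"], "integrity": "sha256-xxx"},
        # dropped: unsupported "git" type
        {"type": "git", "urls": ["https://example.org/b.git"], "integrity": "sha256-xxx"},
        # dropped: .iso is passed below as an unsupported extension
        {"type": "url", "urls": ["https://example.org/c.iso"], "integrity": "sha256-xxx"},
    ],
}
clean = clean_sources(sources, unsupported_file_extensions=["iso"])
assert [s["urls"][0] for s in clean["sources"]] == ["https://example.org/a.tar.gz"]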
diff --git a/swh/loader/package/nixguix/tests/test_nixguix.py b/swh/loader/package/nixguix/tests/test_nixguix.py
index f9904f6..e7de3b7 100644
--- a/swh/loader/package/nixguix/tests/test_nixguix.py
+++ b/swh/loader/package/nixguix/tests/test_nixguix.py
@@ -1,649 +1,655 @@
# Copyright (C) 2020-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import json
import logging
import os
from typing import Dict, Optional, Tuple
import pytest
from swh.loader.package import __version__
from swh.loader.package.archive.loader import ArchiveLoader
from swh.loader.package.nixguix.loader import (
NixGuixLoader,
clean_sources,
make_pattern_unsupported_file_extension,
parse_sources,
retrieve_sources,
)
from swh.loader.package.utils import download
from swh.loader.tests import assert_last_visit_matches
from swh.loader.tests import check_snapshot as check_snapshot_full
from swh.loader.tests import get_stats
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
MetadataAuthority,
MetadataAuthorityType,
MetadataFetcher,
ObjectType,
Person,
RawExtrinsicMetadata,
Release,
Snapshot,
SnapshotBranch,
TargetType,
)
from swh.model.swhids import ExtendedObjectType, ExtendedSWHID
from swh.storage.algos.origin import origin_get_latest_visit_status
from swh.storage.algos.snapshot import snapshot_get_all_branches
from swh.storage.exc import HashCollision
from swh.storage.interface import PagedResult, StorageInterface
sources_url = "https://nix-community.github.io/nixpkgs-swh/sources.json"
@pytest.fixture
def raw_sources(datadir) -> bytes:
with open(
os.path.join(
datadir, "https_nix-community.github.io", "nixpkgs-swh_sources.json"
),
"rb",
) as f:
return f.read()
SNAPSHOT1 = Snapshot(
id=hash_to_bytes("fafcfe32016d018bd892114fce211f37a36a092a"),
branches={
b"evaluation": SnapshotBranch(
target=hash_to_bytes("cc4e04c26672dd74e5fd0fecb78b435fb55368f7"),
target_type=TargetType.REVISION,
),
b"https://github.com/owner-1/repository-1/revision-1.tgz": SnapshotBranch(
target=hash_to_bytes("df7811b9644ed8ef088e2e7add62ed32b0bab15f"),
target_type=TargetType.RELEASE,
),
b"https://github.com/owner-3/repository-1/revision-1.tgz": SnapshotBranch(
target=hash_to_bytes("dc7dc10a664396d5c88adc56352904db231bde14"),
target_type=TargetType.RELEASE,
),
},
)
def check_snapshot(snapshot: Snapshot, storage: StorageInterface):
# The `evaluation` branch is allowed to be unresolvable. It's possible at current
# nixguix visit time, it is not yet visited (the git loader is in charge of its
# visit for now). For more details, check the
# swh.loader.package.nixguix.NixGuixLoader.extra_branches docstring.
check_snapshot_full(
snapshot, storage, allowed_empty=[(TargetType.REVISION, b"evaluation")]
)
assert isinstance(snapshot, Snapshot)
# then ensure the snapshot revisions are structurally as expected
revision_ids = []
for name, branch in snapshot.branches.items():
if name == b"evaluation":
continue # skipping that particular branch (cf. previous comment)
if branch.target_type == TargetType.REVISION:
revision_ids.append(branch.target)
revisions = storage.revision_get(revision_ids)
for rev in revisions:
assert rev is not None
metadata = rev.metadata
assert not metadata
def test_retrieve_sources(swh_storage, requests_mock_datadir):
j = parse_sources(retrieve_sources(sources_url))
assert "sources" in j.keys()
assert len(j["sources"]) == 3
def test_nixguix_url_not_found(swh_storage, requests_mock_datadir):
"""When failing to read from the url, the visit is marked as not_found.
Here the sources url does not exist, so requests_mock_datadir returns a 404,
resulting in a NotFound being raised within the package loader's main loop.
The task therefore ends with status "failed" and the visit_status with status
"not_found".
"""
unknown_url = "https://non-existing-url/"
loader = NixGuixLoader(swh_storage, unknown_url)
# during the retrieval step
load_status = loader.load()
assert load_status == {"status": "failed"}
assert_last_visit_matches(
swh_storage, unknown_url, status="not_found", type="nixguix", snapshot=None
)
assert len(requests_mock_datadir.request_history) == 1
assert requests_mock_datadir.request_history[0].url == unknown_url
def test_nixguix_url_with_decoding_error(swh_storage, requests_mock_datadir):
"""Other errors during communication with the url, the visit is marked as failed
requests_mock_datadir will intercept the requests to sources_url. Since the file
exists, returns a 200 with the requested content of the query. As file.txt is no
json, fails do decode and raises a JSONDecodeError. In effect failing the visit.
"""
sources_url = "https://example.com/file.txt"
loader = NixGuixLoader(swh_storage, sources_url)
load_status = loader.load()
assert load_status == {"status": "failed"}
assert_last_visit_matches(
swh_storage, sources_url, status="failed", type="nixguix", snapshot=None
)
assert len(requests_mock_datadir.request_history) == 1
assert requests_mock_datadir.request_history[0].url == sources_url
def test_clean_sources_invalid_schema(swh_storage, requests_mock_datadir):
sources = {}
with pytest.raises(ValueError, match="sources structure invalid, missing: .*"):
clean_sources(sources)
def test_clean_sources_invalid_version(swh_storage, requests_mock_datadir):
for version_ok in [1, "1"]: # Check those versions are fine
clean_sources({"version": version_ok, "sources": [], "revision": "my-revision"})
for version_ko in [0, "0", 2, "2"]: # Check version != 1 raise an error
with pytest.raises(
ValueError, match="sources structure version .* is not supported"
):
clean_sources(
{"version": version_ko, "sources": [], "revision": "my-revision"}
)
def test_clean_sources_invalid_sources(swh_storage, requests_mock_datadir):
valid_sources = [
# 1 valid source
{"type": "url", "urls": ["my-url.tar.gz"], "integrity": "my-integrity"},
]
sources = {
"version": 1,
"sources": valid_sources
+ [
# integrity is missing
- {"type": "url", "urls": ["my-url.tgz"],},
+ {
+ "type": "url",
+ "urls": ["my-url.tgz"],
+ },
# urls is not a list
{"type": "url", "urls": "my-url.zip", "integrity": "my-integrity"},
# type is not url
{"type": "git", "urls": ["my-url.zip"], "integrity": "my-integrity"},
# missing fields which got double-checked nonetheless...
{"integrity": "my-integrity"},
],
"revision": "my-revision",
}
clean = clean_sources(sources)
assert len(clean["sources"]) == len(valid_sources)
def test_make_pattern_unsupported_file_extension():
unsupported_extensions = ["el", "c", "txt"]
supported_extensions = ["Z", "7z"] # for test
actual_unsupported_pattern = make_pattern_unsupported_file_extension(
unsupported_extensions
)
for supported_ext in supported_extensions:
assert supported_ext not in unsupported_extensions
supported_filepath = f"anything.{supported_ext}"
actual_match = actual_unsupported_pattern.match(supported_filepath)
assert not actual_match
for unsupported_ext in unsupported_extensions:
unsupported_filepath = f"something.{unsupported_ext}"
actual_match = actual_unsupported_pattern.match(unsupported_filepath)
assert actual_match
def test_clean_sources_unsupported_artifacts(swh_storage, requests_mock_datadir):
unsupported_file_extensions = [
"iso",
"whl",
"gem",
"pom",
"msi",
"pod",
"png",
"rock",
"ttf",
"jar",
"c",
"el",
"rpm",
"diff",
"patch",
]
supported_sources = [
{
"type": "url",
"urls": [f"https://server.org/my-url.{ext}"],
"integrity": "my-integrity",
}
for ext in [
"known-unknown-but-ok", # this is fine as well with the current approach
"zip",
"tar.gz",
"tgz",
"tar.bz2",
"tbz",
"tbz2",
"tar.xz",
"tar",
"zip",
"7z",
"Z",
]
]
unsupported_sources = [
{
"type": "url",
"urls": [f"https://server.org/my-url.{ext}"],
"integrity": "my-integrity",
}
for ext in unsupported_file_extensions
]
sources = {
"version": 1,
"sources": supported_sources + unsupported_sources,
"revision": "my-revision",
}
clean = clean_sources(sources, unsupported_file_extensions)
assert len(clean["sources"]) == len(supported_sources)
def test_loader_one_visit(swh_storage, requests_mock_datadir, raw_sources):
loader = NixGuixLoader(swh_storage, sources_url)
load_status = loader.load()
expected_snapshot_id = SNAPSHOT1.id
expected_snapshot_id_hex = expected_snapshot_id.hex()
assert load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id_hex,
}
release_id = SNAPSHOT1.branches[
b"https://github.com/owner-1/repository-1/revision-1.tgz"
].target
check_snapshot(SNAPSHOT1, storage=swh_storage)
assert swh_storage.release_get([release_id])[0] == Release(
id=release_id,
name=b"https://github.com/owner-1/repository-1/revision-1.tgz",
message=None,
target=hash_to_bytes("4de2e07d3742718d928e974b8a4c721b9f7b33bf"),
target_type=ObjectType.DIRECTORY,
synthetic=True,
author=Person.from_fullname(b""),
date=None,
)
stats = get_stats(swh_storage)
assert {
"content": 1,
"directory": 3,
"origin": 1,
"origin_visit": 1,
"release": 2,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
# The visit is partial because urls pointing to non tarball file
# are not handled yet
assert_last_visit_matches(
swh_storage, sources_url, status="partial", type="nixguix"
)
visit_status = origin_get_latest_visit_status(swh_storage, sources_url)
snapshot_swhid = ExtendedSWHID(
object_type=ExtendedObjectType.SNAPSHOT, object_id=visit_status.snapshot
)
metadata_authority = MetadataAuthority(
- type=MetadataAuthorityType.FORGE, url=sources_url,
+ type=MetadataAuthorityType.FORGE,
+ url=sources_url,
)
expected_metadata = [
RawExtrinsicMetadata(
target=snapshot_swhid,
authority=metadata_authority,
fetcher=MetadataFetcher(
name="swh.loader.package.nixguix.loader.NixGuixLoader",
version=__version__,
),
discovery_date=loader.visit_date,
format="nixguix-sources-json",
metadata=raw_sources,
origin=sources_url,
)
]
assert swh_storage.raw_extrinsic_metadata_get(
- snapshot_swhid, metadata_authority,
- ) == PagedResult(next_page_token=None, results=expected_metadata,)
+ snapshot_swhid,
+ metadata_authority,
+ ) == PagedResult(
+ next_page_token=None,
+ results=expected_metadata,
+ )
def test_uncompress_failure(swh_storage, requests_mock_datadir):
"""Non tarball files are currently not supported and the uncompress
function fails on such kind of files.
However, even in this case of failure (because of the url
https://example.com/file.txt), a snapshot and a visit has to be
created (with a status partial since all files are not archived).
"""
loader = NixGuixLoader(swh_storage, sources_url)
loader_status = loader.load()
sources = loader.supported_sources()["sources"]
urls = [s["urls"][0] for s in sources]
assert "https://example.com/file.txt" in urls
assert loader_status["status"] == "eventful"
# The visit is partial because urls pointing to non tarball files
# are not handled yet
assert_last_visit_matches(
swh_storage, sources_url, status="partial", type="nixguix"
)
def test_loader_incremental(swh_storage, requests_mock_datadir):
"""Ensure a second visit do not download artifact already
downloaded by the previous visit.
"""
loader = NixGuixLoader(swh_storage, sources_url)
load_status = loader.load()
loader.load()
assert load_status == {"status": "eventful", "snapshot_id": SNAPSHOT1.id.hex()}
assert_last_visit_matches(
swh_storage,
sources_url,
status="partial",
type="nixguix",
snapshot=SNAPSHOT1.id,
)
check_snapshot(SNAPSHOT1, storage=swh_storage)
urls = [
m.url
for m in requests_mock_datadir.request_history
if m.url == ("https://github.com/owner-1/repository-1/revision-1.tgz")
]
# The artifact
# 'https://github.com/owner-1/repository-1/revision-1.tgz' is only
# visited one time
assert len(urls) == 1
def test_loader_two_visits(swh_storage, requests_mock_datadir_visits):
"""To ensure there is only one origin, but two visits, two revisions
and two snapshots are created.
The first visit creates a snapshot containing one tarball. The
second visit creates a snapshot containing the same tarball and
another tarball.
"""
loader = NixGuixLoader(swh_storage, sources_url)
load_status = loader.load()
assert load_status == {"status": "eventful", "snapshot_id": SNAPSHOT1.id.hex()}
assert_last_visit_matches(
swh_storage,
sources_url,
status="partial",
type="nixguix",
snapshot=SNAPSHOT1.id,
)
check_snapshot(SNAPSHOT1, storage=swh_storage)
stats = get_stats(swh_storage)
assert {
"content": 1,
"directory": 3,
"origin": 1,
"origin_visit": 1,
"release": 2,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
loader = NixGuixLoader(swh_storage, sources_url)
load_status = loader.load()
expected_snapshot_id_hex = "c1983a0a3f647548e1fb92f30339da6848fe9f7a"
expected_snapshot_id = hash_to_bytes(expected_snapshot_id_hex)
assert load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id_hex,
}
assert_last_visit_matches(
swh_storage,
sources_url,
status="partial",
type="nixguix",
snapshot=expected_snapshot_id,
)
# This ensures visits are incremental. Indeed, if we request an url a
# second time, the requests_mock_datadir_visits fixture requires the
# served file to end with `_visit1`.
expected_snapshot = Snapshot(
id=expected_snapshot_id,
branches={
b"evaluation": SnapshotBranch(
target=hash_to_bytes("602140776b2ce6c9159bcf52ada73a297c063d5e"),
target_type=TargetType.REVISION,
),
b"https://github.com/owner-1/repository-1/revision-1.tgz": SnapshotBranch(
target=hash_to_bytes("df7811b9644ed8ef088e2e7add62ed32b0bab15f"),
target_type=TargetType.RELEASE,
),
b"https://github.com/owner-2/repository-1/revision-1.tgz": SnapshotBranch(
target=hash_to_bytes("5cc0115cd643902b837cb6cfbc9f5865bc5a7cb2"),
target_type=TargetType.RELEASE,
),
},
)
check_snapshot(expected_snapshot, storage=swh_storage)
stats = get_stats(swh_storage)
assert {
"content": 2,
"directory": 5,
"origin": 1,
"origin_visit": 2,
"release": 3,
"revision": 0,
"skipped_content": 0,
"snapshot": 2,
} == stats
def test_evaluation_branch(swh_storage, requests_mock_datadir):
loader = NixGuixLoader(swh_storage, sources_url)
res = loader.load()
assert res["status"] == "eventful"
assert_last_visit_matches(
swh_storage,
sources_url,
status="partial",
type="nixguix",
snapshot=SNAPSHOT1.id,
)
check_snapshot(SNAPSHOT1, storage=swh_storage)
def test_eoferror(swh_storage, requests_mock_datadir):
"""Load a truncated archive which is invalid to make the uncompress
function raising the exception EOFError. We then check if a
snapshot is created, meaning this error is well managed.
"""
sources = (
"https://nix-community.github.io/nixpkgs-swh/sources-EOFError.json" # noqa
)
loader = NixGuixLoader(swh_storage, sources)
loader.load()
expected_snapshot = Snapshot(
id=hash_to_bytes("4257fa2350168c6bfec726a06452ea27a2c0cb33"),
branches={
b"evaluation": SnapshotBranch(
target=hash_to_bytes("cc4e04c26672dd74e5fd0fecb78b435fb55368f7"),
target_type=TargetType.REVISION,
),
},
)
check_snapshot(expected_snapshot, storage=swh_storage)
def fake_download(
url: str,
dest: str,
hashes: Dict = {},
filename: Optional[str] = None,
auth: Optional[Tuple[str, str]] = None,
) -> Tuple[str, Dict]:
"""Fake download which raises HashCollision (for the sake of test simpliciy,
let's accept that makes sense)
For tests purpose only.
"""
if url == "https://example.com/file.txt":
# instead of failing because it's a file not dealt with by the nix guix
# loader, make it raise a hash collision
raise HashCollision("sha1", "f92d74e3874587aaf443d1db961d4e26dde13e9c", [])
return download(url, dest, hashes, filename, auth)
def test_raise_exception(swh_storage, requests_mock_datadir, mocker):
mock_download = mocker.patch("swh.loader.package.loader.download")
mock_download.side_effect = fake_download
loader = NixGuixLoader(swh_storage, sources_url)
res = loader.load()
assert res == {
"status": "eventful",
"snapshot_id": SNAPSHOT1.id.hex(),
}
# The visit is partial because some artifact downloads failed
assert_last_visit_matches(
swh_storage,
sources_url,
status="partial",
type="nixguix",
snapshot=SNAPSHOT1.id,
)
check_snapshot(SNAPSHOT1, storage=swh_storage)
assert len(mock_download.mock_calls) == 3
def test_load_nixguix_one_common_artifact_from_other_loader(
swh_storage, datadir, requests_mock_datadir_visits, caplog
):
- """Misformatted revision should be caught and logged, then loading continues
-
- """
+ """Misformatted revision should be caught and logged, then loading continues"""
caplog.set_level(logging.ERROR, "swh.loader.package.nixguix.loader")
# 1. first ingest with for example the archive loader
gnu_url = "https://ftp.gnu.org/gnu/8sync/"
release = "0.1.0"
artifact_url = f"https://ftp.gnu.org/gnu/8sync/8sync-{release}.tar.gz"
gnu_artifacts = [
{
"time": 944729610,
"url": artifact_url,
"length": 221837,
"filename": f"8sync-{release}.tar.gz",
"version": release,
}
]
archive_loader = ArchiveLoader(swh_storage, url=gnu_url, artifacts=gnu_artifacts)
actual_load_status = archive_loader.load()
expected_snapshot_id = "9efecc835e8f99254934f256b5301b94f348fd17"
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] == expected_snapshot_id # noqa
assert_last_visit_matches(
archive_loader.storage,
gnu_url,
status="full",
type="tar",
snapshot=hash_to_bytes(expected_snapshot_id),
)
# 2. Then ingest with the nixguix loader which lists the same artifact within its
# sources.json
# ensure test setup is ok
data_sources = os.path.join(
datadir, "https_nix-community.github.io", "nixpkgs-swh_sources_special.json"
)
all_sources = json.loads(open(data_sources).read())
found = False
for source in all_sources["sources"]:
if source["urls"][0] == artifact_url:
found = True
assert (
found is True
), f"test setup error: {artifact_url} must be in {data_sources}"
# first visit with a snapshot, ok
sources_url = "https://nix-community.github.io/nixpkgs-swh/sources_special.json"
loader = NixGuixLoader(swh_storage, sources_url)
actual_load_status2 = loader.load()
assert actual_load_status2["status"] == "eventful"
snapshot_id = actual_load_status2["snapshot_id"]
assert_last_visit_matches(
swh_storage,
sources_url,
status="full",
type="nixguix",
snapshot=hash_to_bytes(snapshot_id),
)
snapshot = snapshot_get_all_branches(swh_storage, hash_to_bytes(snapshot_id))
assert snapshot
diff --git a/swh/loader/package/npm/loader.py b/swh/loader/package/npm/loader.py
index b082a0f..bca39c3 100644
--- a/swh/loader/package/npm/loader.py
+++ b/swh/loader/package/npm/loader.py
@@ -1,309 +1,305 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
from codecs import BOM_UTF8
import json
import logging
import os
import string
from typing import Any, Dict, Iterator, List, Optional, Sequence, Tuple, Union
from urllib.parse import quote
import attr
import chardet
from swh.loader.package.loader import (
BasePackageInfo,
PackageLoader,
RawExtrinsicMetadataCore,
)
from swh.loader.package.utils import api_info, cached_method, release_name
from swh.model.model import (
MetadataAuthority,
MetadataAuthorityType,
ObjectType,
Person,
Release,
Sha1Git,
TimestampWithTimezone,
)
from swh.storage.interface import StorageInterface
logger = logging.getLogger(__name__)
EMPTY_PERSON = Person.from_fullname(b"")
@attr.s
class NpmPackageInfo(BasePackageInfo):
raw_info = attr.ib(type=Dict[str, Any])
package_name = attr.ib(type=str)
date = attr.ib(type=Optional[str])
shasum = attr.ib(type=str)
"""sha1 checksum"""
# we cannot rely only on $shasum, as it is technically possible for two versions
# of the same package to have the exact same tarball.
# But the release data (message and date) are extrinsic to the content of the
# package, so they differ between versions.
# So we need every attribute used to build the release object to be part of the
# manifest.
MANIFEST_FORMAT = string.Template(
"date $date\nname $package_name\nshasum $shasum\nurl $url\nversion $version"
)
EXTID_TYPE = "npm-manifest-sha256"
EXTID_VERSION = 0
@classmethod
def from_metadata(
cls, project_metadata: Dict[str, Any], version: str
) -> "NpmPackageInfo":
package_metadata = project_metadata["versions"][version]
url = package_metadata["dist"]["tarball"]
assert package_metadata["name"] == project_metadata["name"]
# No date available in intrinsic metadata: retrieve it from the API
# metadata, using the version number that the API claims this package
# has.
extrinsic_version = package_metadata["version"]
if "time" in project_metadata:
date = project_metadata["time"][extrinsic_version]
elif "mtime" in package_metadata:
date = package_metadata["mtime"]
else:
date = None
return cls(
package_name=package_metadata["name"],
url=url,
filename=os.path.basename(url),
date=date,
shasum=package_metadata["dist"]["shasum"],
version=extrinsic_version,
raw_info=package_metadata,
directory_extrinsic_metadata=[
RawExtrinsicMetadataCore(
format="replicate-npm-package-json",
metadata=json.dumps(package_metadata).encode(),
)
],
)
class NpmLoader(PackageLoader[NpmPackageInfo]):
- """Load npm origin's artifact releases into swh archive.
-
- """
+ """Load npm origin's artifact releases into swh archive."""
visit_type = "npm"
def __init__(
self,
storage: StorageInterface,
url: str,
max_content_size: Optional[int] = None,
):
"""Constructor
Args:
url: origin url (e.g. https://www.npmjs.com/package/<package-name>)
"""
super().__init__(storage=storage, url=url, max_content_size=max_content_size)
self.package_name = url.split("https://www.npmjs.com/package/")[1]
safe_name = quote(self.package_name, safe="")
self.provider_url = f"https://replicate.npmjs.com/{safe_name}/"
self._info: Dict[str, Any] = {}
self._versions = None
@cached_method
def _raw_info(self) -> bytes:
return api_info(self.provider_url)
@cached_method
def info(self) -> Dict:
- """Return the project metadata information (fetched from npm registry)
-
- """
+ """Return the project metadata information (fetched from npm registry)"""
return json.loads(self._raw_info())
def get_versions(self) -> Sequence[str]:
return sorted(list(self.info()["versions"].keys()))
def get_default_version(self) -> str:
return self.info()["dist-tags"].get("latest", "")
def get_metadata_authority(self):
return MetadataAuthority(
- type=MetadataAuthorityType.FORGE, url="https://npmjs.com/", metadata={},
+ type=MetadataAuthorityType.FORGE,
+ url="https://npmjs.com/",
+ metadata={},
)
def get_package_info(self, version: str) -> Iterator[Tuple[str, NpmPackageInfo]]:
p_info = NpmPackageInfo.from_metadata(
project_metadata=self.info(), version=version
)
yield release_name(version), p_info
def build_release(
self, p_info: NpmPackageInfo, uncompressed_path: str, directory: Sha1Git
) -> Optional[Release]:
# Metadata from NPM is not intrinsic to tarballs.
# This means two package versions can have the same tarball, but different
# metadata. To avoid mixing up releases, every field used to build the
# release object must be part of NpmPackageInfo.MANIFEST_FORMAT.
i_metadata = extract_intrinsic_metadata(uncompressed_path)
if not i_metadata:
return None
author = extract_npm_package_author(i_metadata)
assert self.package_name == p_info.package_name
msg = (
f"Synthetic release for NPM source package {p_info.package_name} "
f"version {p_info.version}\n"
)
if p_info.date is None:
url = p_info.url
artifact_name = os.path.basename(url)
raise ValueError(
"Origin %s: Cannot determine upload time for artifact %s."
% (p_info.url, artifact_name)
)
date = TimestampWithTimezone.from_iso8601(p_info.date)
# FIXME: this is to remain bug-compatible with earlier versions:
date = attr.evolve(date, timestamp=attr.evolve(date.timestamp, microseconds=0))
r = Release(
name=p_info.version.encode(),
message=msg.encode(),
author=author,
date=date,
target=directory,
target_type=ObjectType.DIRECTORY,
synthetic=True,
)
return r
def _author_str(author_data: Union[Dict, List, str]) -> str:
- """Parse author from package.json author fields
-
- """
+ """Parse author from package.json author fields"""
if isinstance(author_data, dict):
author_str = ""
name = author_data.get("name")
if name is not None:
if isinstance(name, str):
author_str += name
elif isinstance(name, list):
author_str += _author_str(name[0]) if len(name) > 0 else ""
email = author_data.get("email")
if email is not None:
author_str += f" <{email}>"
result = author_str
elif isinstance(author_data, list):
result = _author_str(author_data[0]) if len(author_data) > 0 else ""
else:
result = author_data
return result
def extract_npm_package_author(package_json: Dict[str, Any]) -> Person:
"""
Extract package author from a ``package.json`` file content and
return it in swh format.
Args:
package_json: Dict holding the content of parsed
``package.json`` file
Returns:
Person
"""
for author_key in ("author", "authors"):
if author_key in package_json:
author_data = package_json[author_key]
if author_data is None:
return EMPTY_PERSON
author_str = _author_str(author_data)
return Person.from_fullname(author_str.encode())
return EMPTY_PERSON
def _lstrip_bom(s, bom=BOM_UTF8):
if s.startswith(bom):
return s[len(bom) :]
else:
return s
def load_json(json_bytes):
"""
Try to load JSON from bytes and return a dictionary.
First try to decode from utf-8. If the decoding fails,
try to detect the encoding and decode again with replace
error handling.
If JSON is malformed, an empty dictionary will be returned.
Args:
json_bytes (bytes): binary content of a JSON file
Returns:
dict: JSON data loaded in a dictionary
"""
json_data = {}
try:
json_str = _lstrip_bom(json_bytes).decode("utf-8")
except UnicodeDecodeError:
encoding = chardet.detect(json_bytes)["encoding"]
if encoding:
json_str = json_bytes.decode(encoding, "replace")
try:
json_data = json.loads(json_str)
except json.decoder.JSONDecodeError:
pass
return json_data
def extract_intrinsic_metadata(dir_path: str) -> Dict:
"""Given an uncompressed path holding the pkginfo file, returns a
pkginfo parsed structure as a dict.
The release artifact contains at their root one folder. For example:
$ tar tvf zprint-0.0.6.tar.gz
drwxr-xr-x root/root 0 2018-08-22 11:01 zprint-0.0.6/
...
Args:
dir_path (str): Path to the uncompressed directory
representing a release artifact from npm.
Returns:
the parsed package.json content as a dict, or an empty dict if
none was present.
"""
# Retrieve the root folder of the archive
if not os.path.exists(dir_path):
return {}
lst = os.listdir(dir_path)
if len(lst) == 0:
return {}
project_dirname = lst[0]
package_json_path = os.path.join(dir_path, project_dirname, "package.json")
if not os.path.exists(package_json_path):
return {}
with open(package_json_path, "rb") as package_json_file:
package_json_bytes = package_json_file.read()
return load_json(package_json_bytes)
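As a hedged sketch of two behaviours of this loader (assuming swh-loader-core is
installed; the inputs are made up and mirror the test cases that follow): how the
replicate.npmjs.com metadata URL is derived from an origin URL, and how a
package.json author field is normalised into a Person:

from urllib.parse import quote

from swh.loader.package.npm.loader import extract_npm_package_author

# provider_url derivation, as done in NpmLoader.__init__ above
origin_url = "https://www.npmjs.com/package/@aller/shared"
package_name = origin_url.split("https://www.npmjs.com/package/")[1]
provider_url = f"https://replicate.npmjs.com/{quote(package_name, safe='')}/"
assert provider_url == "https://replicate.npmjs.com/%40aller%2Fshared/"

# author normalisation, as exercised by the author test cases below
person = extract_npm_package_author(
    {"author": {"name": "groot", "email": "groot@galaxy.org"}}
)
assert person.fullname == b"groot <groot@galaxy.org>"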
diff --git a/swh/loader/package/npm/tests/test_npm.py b/swh/loader/package/npm/tests/test_npm.py
index 63e5924..61fe75b 100644
--- a/swh/loader/package/npm/tests/test_npm.py
+++ b/swh/loader/package/npm/tests/test_npm.py
@@ -1,729 +1,773 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import json
import os
import pytest
from swh.loader.package import __version__
from swh.loader.package.npm.loader import (
NpmLoader,
_author_str,
extract_npm_package_author,
)
from swh.loader.tests import assert_last_visit_matches, check_snapshot, get_stats
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
Person,
RawExtrinsicMetadata,
Release,
Snapshot,
SnapshotBranch,
TargetType,
TimestampWithTimezone,
)
from swh.model.model import MetadataAuthority, MetadataAuthorityType, MetadataFetcher
from swh.model.model import ObjectType as ModelObjectType
from swh.model.swhids import CoreSWHID, ExtendedObjectType, ExtendedSWHID, ObjectType
from swh.storage.interface import PagedResult
@pytest.fixture
def org_api_info(datadir) -> bytes:
- with open(os.path.join(datadir, "https_replicate.npmjs.com", "org"), "rb",) as f:
+ with open(
+ os.path.join(datadir, "https_replicate.npmjs.com", "org"),
+ "rb",
+ ) as f:
return f.read()
def test_npm_author_str():
for author, expected_author in [
("author", "author"),
(
["Al from quantum leap", "hal from 2001 space odyssey"],
"Al from quantum leap",
),
([], ""),
- ({"name": "groot", "email": "groot@galaxy.org",}, "groot <groot@galaxy.org>"),
- ({"name": "somebody",}, "somebody"),
+ (
+ {
+ "name": "groot",
+ "email": "groot@galaxy.org",
+ },
+ "groot <groot@galaxy.org>",
+ ),
+ (
+ {
+ "name": "somebody",
+ },
+ "somebody",
+ ),
({"email": "no@one.org"}, " <no@one.org>"), # note first elt is an extra blank
- ({"name": "no one", "email": None,}, "no one"),
- ({"email": None,}, ""),
+ (
+ {
+ "name": "no one",
+ "email": None,
+ },
+ "no one",
+ ),
+ (
+ {
+ "email": None,
+ },
+ "",
+ ),
({"name": None}, ""),
- ({"name": None, "email": None,}, ""),
+ (
+ {
+ "name": None,
+ "email": None,
+ },
+ "",
+ ),
({}, ""),
(None, None),
- ({"name": []}, "",),
(
- {"name": ["Susan McSween", "William H. Bonney", "Doc Scurlock",]},
+ {"name": []},
+ "",
+ ),
+ (
+ {
+ "name": [
+ "Susan McSween",
+ "William H. Bonney",
+ "Doc Scurlock",
+ ]
+ },
"Susan McSween",
),
(None, None),
]:
assert _author_str(author) == expected_author
def test_npm_extract_npm_package_author(datadir):
package_metadata_filepath = os.path.join(
datadir, "https_replicate.npmjs.com", "org_visit1"
)
with open(package_metadata_filepath) as json_file:
package_metadata = json.load(json_file)
assert extract_npm_package_author(package_metadata["versions"]["0.0.2"]) == Person(
fullname=b"mooz <stillpedant@gmail.com>",
name=b"mooz",
email=b"stillpedant@gmail.com",
)
assert extract_npm_package_author(package_metadata["versions"]["0.0.3"]) == Person(
fullname=b"Masafumi Oyamada <stillpedant@gmail.com>",
name=b"Masafumi Oyamada",
email=b"stillpedant@gmail.com",
)
package_json = json.loads(
"""
{
"name": "highlightjs-line-numbers.js",
"version": "2.7.0",
"description": "Highlight.js line numbers plugin.",
"main": "src/highlightjs-line-numbers.js",
"dependencies": {},
"devDependencies": {
"gulp": "^4.0.0",
"gulp-rename": "^1.4.0",
"gulp-replace": "^0.6.1",
"gulp-uglify": "^1.2.0"
},
"repository": {
"type": "git",
"url": "https://github.com/wcoder/highlightjs-line-numbers.js.git"
},
"author": "Yauheni Pakala <evgeniy.pakalo@gmail.com>",
"license": "MIT",
"bugs": {
"url": "https://github.com/wcoder/highlightjs-line-numbers.js/issues"
},
"homepage": "http://wcoder.github.io/highlightjs-line-numbers.js/"
}"""
)
assert extract_npm_package_author(package_json) == Person(
fullname=b"Yauheni Pakala <evgeniy.pakalo@gmail.com>",
name=b"Yauheni Pakala",
email=b"evgeniy.pakalo@gmail.com",
)
package_json = json.loads(
"""
{
"name": "3-way-diff",
"version": "0.0.1",
"description": "3-way diffing of JavaScript objects",
"main": "index.js",
"authors": [
{
"name": "Shawn Walsh",
"url": "https://github.com/shawnpwalsh"
},
{
"name": "Markham F Rollins IV",
"url": "https://github.com/mrollinsiv"
}
],
"keywords": [
"3-way diff",
"3 way diff",
"three-way diff",
"three way diff"
],
"devDependencies": {
"babel-core": "^6.20.0",
"babel-preset-es2015": "^6.18.0",
"mocha": "^3.0.2"
},
"dependencies": {
"lodash": "^4.15.0"
}
}"""
)
assert extract_npm_package_author(package_json) == Person(
fullname=b"Shawn Walsh", name=b"Shawn Walsh", email=None
)
package_json = json.loads(
"""
{
"name": "yfe-ynpm",
"version": "1.0.0",
"homepage": "http://gitlab.ywwl.com/yfe/yfe-ynpm",
"repository": {
"type": "git",
"url": "git@gitlab.ywwl.com:yfe/yfe-ynpm.git"
},
"author": [
"fengmk2 <fengmk2@gmail.com> (https://fengmk2.com)",
"xufuzi <xufuzi@ywwl.com> (https://7993.org)"
],
"license": "MIT"
}"""
)
assert extract_npm_package_author(package_json) == Person(
fullname=b"fengmk2 <fengmk2@gmail.com> (https://fengmk2.com)",
name=b"fengmk2",
email=b"fengmk2@gmail.com",
)
package_json = json.loads(
"""
{
"name": "umi-plugin-whale",
"version": "0.0.8",
"description": "Internal contract component",
"authors": {
"name": "xiaohuoni",
"email": "448627663@qq.com"
},
"repository": "alitajs/whale",
"devDependencies": {
"np": "^3.0.4",
"umi-tools": "*"
},
"license": "MIT"
}"""
)
assert extract_npm_package_author(package_json) == Person(
fullname=b"xiaohuoni <448627663@qq.com>",
name=b"xiaohuoni",
email=b"448627663@qq.com",
)
package_json_no_authors = json.loads(
"""{
"authors": null,
"license": "MIT"
}"""
)
assert extract_npm_package_author(package_json_no_authors) == Person.from_fullname(
b""
)
def normalize_hashes(hashes):
if isinstance(hashes, str):
return hash_to_bytes(hashes)
if isinstance(hashes, list):
return [hash_to_bytes(x) for x in hashes]
return {hash_to_bytes(k): hash_to_bytes(v) for k, v in hashes.items()}
_expected_new_contents_first_visit = normalize_hashes(
[
"4ce3058e16ab3d7e077f65aabf855c34895bf17c",
"858c3ceee84c8311adc808f8cdb30d233ddc9d18",
"0fa33b4f5a4e0496da6843a38ff1af8b61541996",
"85a410f8ef8eb8920f2c384a9555566ad4a2e21b",
"9163ac8025923d5a45aaac482262893955c9b37b",
"692cf623b8dd2c5df2c2998fd95ae4ec99882fb4",
"18c03aac6d3e910efb20039c15d70ab5e0297101",
"41265c42446aac17ca769e67d1704f99e5a1394d",
"783ff33f5882813dca9239452c4a7cadd4dba778",
"b029cfb85107aee4590c2434a3329bfcf36f8fa1",
"112d1900b4c2e3e9351050d1b542c9744f9793f3",
"5439bbc4bd9a996f1a38244e6892b71850bc98fd",
"d83097a2f994b503185adf4e719d154123150159",
"d0939b4898e83090ee55fd9d8a60e312cfadfbaf",
"b3523a26f7147e4af40d9d462adaae6d49eda13e",
"cd065fb435d6fb204a8871bcd623d0d0e673088c",
"2854a40855ad839a54f4b08f5cff0cf52fca4399",
"b8a53bbaac34ebb8c6169d11a4b9f13b05c583fe",
"0f73d56e1cf480bded8a1ecf20ec6fc53c574713",
"0d9882b2dfafdce31f4e77fe307d41a44a74cefe",
"585fc5caab9ead178a327d3660d35851db713df1",
"e8cd41a48d79101977e3036a87aeb1aac730686f",
"5414efaef33cceb9f3c9eb5c4cc1682cd62d14f7",
"9c3cc2763bf9e9e37067d3607302c4776502df98",
"3649a68410e354c83cd4a38b66bd314de4c8f5c9",
"e96ed0c091de1ebdf587104eaf63400d1974a1fe",
"078ca03d2f99e4e6eab16f7b75fbb7afb699c86c",
"38de737da99514de6559ff163c988198bc91367a",
]
)
_expected_new_directories_first_visit = normalize_hashes(
[
"3370d20d6f96dc1c9e50f083e2134881db110f4f",
"42753c0c2ab00c4501b552ac4671c68f3cf5aece",
"d7895533ef5edbcffdea3f057d9fef3a1ef845ce",
"80579be563e2ef3e385226fe7a3f079b377f142c",
"3b0ddc6a9e58b4b53c222da4e27b280b6cda591c",
"bcad03ce58ac136f26f000990fc9064e559fe1c0",
"5fc7e82a1bc72e074665c6078c6d3fad2f13d7ca",
"e3cd26beba9b1e02f6762ef54bd9ac80cc5f25fd",
"584b5b4b6cf7f038095e820b99386a9c232de931",
"184c8d6d0d242f2b1792ef9d3bf396a5434b7f7a",
"bb5f4ee143c970367eb409f2e4c1104898048b9d",
"1b95491047add1103db0dfdfa84a9735dcb11e88",
"a00c6de13471a2d66e64aca140ddb21ef5521e62",
"5ce6c1cd5cda2d546db513aaad8c72a44c7771e2",
"c337091e349b6ac10d38a49cdf8c2401ef9bb0f2",
"202fafcd7c0f8230e89d5496ad7f44ab12b807bf",
"775cc516543be86c15c1dc172f49c0d4e6e78235",
"ff3d1ead85a14f891e8b3fa3a89de39db1b8de2e",
]
)
_expected_new_releases_first_visit = normalize_hashes(
{
"d38cc0b571cd41f3c85513864e049766b42032a7": (
"42753c0c2ab00c4501b552ac4671c68f3cf5aece"
),
"62bf7076bae9aa2cb4d6cb3bf7ce0ea4fdd5b295": (
"3370d20d6f96dc1c9e50f083e2134881db110f4f"
),
"6e976db82f6c310596b21fb0ed8b11f507631434": (
"d7895533ef5edbcffdea3f057d9fef3a1ef845ce"
),
}
)
def package_url(package):
return "https://www.npmjs.com/package/%s" % package
def package_metadata_url(package):
return "https://replicate.npmjs.com/%s/" % package
def test_npm_loader_first_visit(swh_storage, requests_mock_datadir, org_api_info):
package = "org"
url = package_url(package)
loader = NpmLoader(swh_storage, url)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("0996ca28d6280499abcf485b51c4e3941b057249")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="npm", snapshot=expected_snapshot_id
)
release_id = "d38cc0b571cd41f3c85513864e049766b42032a7"
versions = [
("0.0.2", release_id),
("0.0.3", "62bf7076bae9aa2cb4d6cb3bf7ce0ea4fdd5b295"),
("0.0.4", "6e976db82f6c310596b21fb0ed8b11f507631434"),
]
expected_snapshot = Snapshot(
id=expected_snapshot_id,
branches={
b"HEAD": SnapshotBranch(
target=b"releases/0.0.4", target_type=TargetType.ALIAS
),
**{
b"releases/"
+ version_name.encode(): SnapshotBranch(
- target=hash_to_bytes(version_id), target_type=TargetType.RELEASE,
+ target=hash_to_bytes(version_id),
+ target_type=TargetType.RELEASE,
)
for (version_name, version_id) in versions
},
},
)
check_snapshot(expected_snapshot, swh_storage)
assert swh_storage.release_get([hash_to_bytes(release_id)])[0] == Release(
name=b"0.0.2",
message=b"Synthetic release for NPM source package org version 0.0.2\n",
target=hash_to_bytes("42753c0c2ab00c4501b552ac4671c68f3cf5aece"),
target_type=ModelObjectType.DIRECTORY,
synthetic=True,
author=Person(
fullname=b"mooz <stillpedant@gmail.com>",
name=b"mooz",
email=b"stillpedant@gmail.com",
),
date=TimestampWithTimezone.from_datetime(
datetime.datetime(2014, 1, 1, 15, 40, 33, tzinfo=datetime.timezone.utc)
),
id=hash_to_bytes(release_id),
)
contents = swh_storage.content_get(_expected_new_contents_first_visit)
count = sum(0 if content is None else 1 for content in contents)
assert count == len(_expected_new_contents_first_visit)
assert (
list(swh_storage.directory_missing(_expected_new_directories_first_visit)) == []
)
assert list(swh_storage.release_missing(_expected_new_releases_first_visit)) == []
metadata_authority = MetadataAuthority(
- type=MetadataAuthorityType.FORGE, url="https://npmjs.com/",
+ type=MetadataAuthorityType.FORGE,
+ url="https://npmjs.com/",
)
for (version_name, release_id) in versions:
release = swh_storage.release_get([hash_to_bytes(release_id)])[0]
assert release.target_type == ModelObjectType.DIRECTORY
directory_id = release.target
directory_swhid = ExtendedSWHID(
- object_type=ExtendedObjectType.DIRECTORY, object_id=directory_id,
+ object_type=ExtendedObjectType.DIRECTORY,
+ object_id=directory_id,
)
release_swhid = CoreSWHID(
- object_type=ObjectType.RELEASE, object_id=hash_to_bytes(release_id),
+ object_type=ObjectType.RELEASE,
+ object_id=hash_to_bytes(release_id),
)
expected_metadata = [
RawExtrinsicMetadata(
target=directory_swhid,
authority=metadata_authority,
fetcher=MetadataFetcher(
- name="swh.loader.package.npm.loader.NpmLoader", version=__version__,
+ name="swh.loader.package.npm.loader.NpmLoader",
+ version=__version__,
),
discovery_date=loader.visit_date,
format="replicate-npm-package-json",
metadata=json.dumps(
json.loads(org_api_info)["versions"][version_name]
).encode(),
origin="https://www.npmjs.com/package/org",
release=release_swhid,
)
]
assert swh_storage.raw_extrinsic_metadata_get(
- directory_swhid, metadata_authority,
- ) == PagedResult(next_page_token=None, results=expected_metadata,)
+ directory_swhid,
+ metadata_authority,
+ ) == PagedResult(
+ next_page_token=None,
+ results=expected_metadata,
+ )
stats = get_stats(swh_storage)
assert {
"content": len(_expected_new_contents_first_visit),
"directory": len(_expected_new_directories_first_visit),
"origin": 1,
"origin_visit": 1,
"release": len(_expected_new_releases_first_visit),
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_npm_loader_incremental_visit(swh_storage, requests_mock_datadir_visits):
package = "org"
url = package_url(package)
loader = NpmLoader(swh_storage, url)
expected_snapshot_id = hash_to_bytes("0996ca28d6280499abcf485b51c4e3941b057249")
actual_load_status = loader.load()
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="npm", snapshot=expected_snapshot_id
)
stats = get_stats(swh_storage)
assert {
"content": len(_expected_new_contents_first_visit),
"directory": len(_expected_new_directories_first_visit),
"origin": 1,
"origin_visit": 1,
"release": len(_expected_new_releases_first_visit),
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
# reset loader internal state
del loader._cached_info
del loader._cached__raw_info
actual_load_status2 = loader.load()
assert actual_load_status2["status"] == "eventful"
snap_id2 = actual_load_status2["snapshot_id"]
assert snap_id2 is not None
assert snap_id2 != actual_load_status["snapshot_id"]
assert_last_visit_matches(swh_storage, url, status="full", type="npm")
stats = get_stats(swh_storage)
assert { # 3 new releases artifacts
"content": len(_expected_new_contents_first_visit) + 14,
"directory": len(_expected_new_directories_first_visit) + 15,
"origin": 1,
"origin_visit": 2,
"release": len(_expected_new_releases_first_visit) + 3,
"revision": 0,
"skipped_content": 0,
"snapshot": 2,
} == stats
urls = [
m.url
for m in requests_mock_datadir_visits.request_history
if m.url.startswith("https://registry.npmjs.org")
]
assert len(urls) == len(set(urls)) # we visited each artifact only once across visits
@pytest.mark.usefixtures("requests_mock_datadir")
def test_npm_loader_version_divergence(swh_storage):
package = "@aller/shared"
url = package_url(package)
loader = NpmLoader(swh_storage, url)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("68eed3d3bc852e7f435a84f18ee77e23f6884be2")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="npm", snapshot=expected_snapshot_id
)
expected_snapshot = Snapshot(
id=expected_snapshot_id,
branches={
b"HEAD": SnapshotBranch(
target_type=TargetType.ALIAS, target=b"releases/0.1.0"
),
b"releases/0.1.0": SnapshotBranch(
target_type=TargetType.RELEASE,
target=hash_to_bytes("0c486b50b407f847ef7581f595c2b6c2062f1089"),
),
b"releases/0.1.1-alpha.14": SnapshotBranch(
target_type=TargetType.RELEASE,
target=hash_to_bytes("79d80c87c0a8d104a216cc539baad962a454802a"),
),
},
)
check_snapshot(expected_snapshot, swh_storage)
stats = get_stats(swh_storage)
assert { # 1 new releases artifacts
"content": 534,
"directory": 153,
"origin": 1,
"origin_visit": 1,
"release": 2,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_npm_loader_duplicate_shasum(swh_storage, requests_mock_datadir):
"""Test with two versions that have exactly the same tarball"""
package = "org_version_mismatch"
url = package_url(package)
loader = NpmLoader(swh_storage, url)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("ac867a4c22ba4e22a022d319f309714477412a5a")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="npm", snapshot=expected_snapshot_id
)
beta_release_id = "e6d5490a02ac2a8dcd49702f9ccd5a64c90a46f1"
release_id = "f6985f437e28db6eb1b7533230e05ed99f2c91f0"
versions = [
("0.0.3-beta", beta_release_id),
("0.0.3", release_id),
]
expected_snapshot = Snapshot(
id=expected_snapshot_id,
branches={
b"HEAD": SnapshotBranch(
target=b"releases/0.0.3", target_type=TargetType.ALIAS
),
**{
b"releases/"
+ version_name.encode(): SnapshotBranch(
- target=hash_to_bytes(version_id), target_type=TargetType.RELEASE,
+ target=hash_to_bytes(version_id),
+ target_type=TargetType.RELEASE,
)
for (version_name, version_id) in versions
},
},
)
check_snapshot(expected_snapshot, swh_storage)
assert swh_storage.release_get([hash_to_bytes(beta_release_id)])[0] == Release(
name=b"0.0.3-beta",
message=(
b"Synthetic release for NPM source package org_version_mismatch "
b"version 0.0.3-beta\n"
),
target=hash_to_bytes("3370d20d6f96dc1c9e50f083e2134881db110f4f"),
target_type=ModelObjectType.DIRECTORY,
synthetic=True,
author=Person.from_fullname(b"Masafumi Oyamada <stillpedant@gmail.com>"),
date=TimestampWithTimezone.from_datetime(
datetime.datetime(2014, 1, 1, 15, 40, 33, tzinfo=datetime.timezone.utc)
),
id=hash_to_bytes(beta_release_id),
)
assert swh_storage.release_get([hash_to_bytes(release_id)])[0] == Release(
name=b"0.0.3",
message=(
b"Synthetic release for NPM source package org_version_mismatch "
b"version 0.0.3\n"
),
target=hash_to_bytes("3370d20d6f96dc1c9e50f083e2134881db110f4f"),
target_type=ModelObjectType.DIRECTORY,
synthetic=True,
author=Person.from_fullname(b"Masafumi Oyamada <stillpedant@gmail.com>"),
date=TimestampWithTimezone.from_datetime(
datetime.datetime(2014, 1, 1, 15, 55, 45, tzinfo=datetime.timezone.utc)
),
id=hash_to_bytes(release_id),
)
# Check incremental re-load keeps it unchanged
loader = NpmLoader(swh_storage, url)
actual_load_status = loader.load()
assert actual_load_status == {
"status": "uneventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="npm", snapshot=expected_snapshot_id
)
def test_npm_artifact_with_no_intrinsic_metadata(swh_storage, requests_mock_datadir):
- """Skip artifact with no intrinsic metadata during ingestion
-
- """
+ """Skip artifact with no intrinsic metadata during ingestion"""
package = "nativescript-telerik-analytics"
url = package_url(package)
loader = NpmLoader(swh_storage, url)
actual_load_status = loader.load()
# no branch as one artifact without any intrinsic metadata
expected_snapshot = Snapshot(
- id=hash_to_bytes("1a8893e6a86f444e8be8e7bda6cb34fb1735a00e"), branches={},
+ id=hash_to_bytes("1a8893e6a86f444e8be8e7bda6cb34fb1735a00e"),
+ branches={},
)
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot.id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="npm", snapshot=expected_snapshot.id
)
check_snapshot(expected_snapshot, swh_storage)
def test_npm_artifact_with_no_upload_time(swh_storage, requests_mock_datadir):
- """With no time upload, artifact is skipped
-
- """
+ """With no time upload, artifact is skipped"""
package = "jammit-no-time"
url = package_url(package)
loader = NpmLoader(swh_storage, url)
actual_load_status = loader.load()
# no branch as the single artifact has no upload time
expected_snapshot = Snapshot(
- id=hash_to_bytes("1a8893e6a86f444e8be8e7bda6cb34fb1735a00e"), branches={},
+ id=hash_to_bytes("1a8893e6a86f444e8be8e7bda6cb34fb1735a00e"),
+ branches={},
)
assert actual_load_status == {
"status": "uneventful",
"snapshot_id": expected_snapshot.id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="partial", type="npm", snapshot=expected_snapshot.id
)
check_snapshot(expected_snapshot, swh_storage)
def test_npm_artifact_use_mtime_if_no_time(swh_storage, requests_mock_datadir):
- """With no time upload, artifact is skipped
-
- """
+ """With no time upload, artifact is skipped"""
package = "jammit-express"
url = package_url(package)
loader = NpmLoader(swh_storage, url)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("33b8f105d48ce16b6c59158af660e0cc78bcbef4")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
# artifact is used
expected_snapshot = Snapshot(
id=expected_snapshot_id,
branches={
b"HEAD": SnapshotBranch(
target_type=TargetType.ALIAS, target=b"releases/0.0.1"
),
b"releases/0.0.1": SnapshotBranch(
target_type=TargetType.RELEASE,
target=hash_to_bytes("3e3b800570869fa9b3dbc302500553e62400cc06"),
),
},
)
assert_last_visit_matches(
swh_storage, url, status="full", type="npm", snapshot=expected_snapshot.id
)
check_snapshot(expected_snapshot, swh_storage)
def test_npm_no_artifact(swh_storage, requests_mock_datadir):
- """If no artifacts at all is found for origin, the visit fails completely
-
- """
+ """If no artifacts at all is found for origin, the visit fails completely"""
package = "catify"
url = package_url(package)
loader = NpmLoader(swh_storage, url)
actual_load_status = loader.load()
assert actual_load_status == {
"status": "failed",
}
assert_last_visit_matches(swh_storage, url, status="failed", type="npm")
def test_npm_origin_not_found(swh_storage, requests_mock_datadir):
url = package_url("non-existent-url")
loader = NpmLoader(swh_storage, url)
assert loader.load() == {"status": "failed"}
assert_last_visit_matches(
swh_storage, url, status="not_found", type="npm", snapshot=None
)
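The npm tests above cover the main load outcomes; a minimal summary sketch follows, using shorthand scenario labels (the labels are not identifiers from the code):
# Sketch: outcomes exercised by the npm tests above.
# First element: loader.load()["status"]; second: the visit status checked with
# assert_last_visit_matches().
NPM_TEST_OUTCOMES = {
    "artifact without intrinsic metadata": ("eventful", "full"),  # empty snapshot
    "artifact without upload time": ("uneventful", "partial"),  # artifact skipped
    "artifact without upload time but with mtime": ("eventful", "full"),  # mtime used
    "no artifact at all": ("failed", "failed"),
    "origin not found": ("failed", "not_found"),
}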
diff --git a/swh/loader/package/opam/loader.py b/swh/loader/package/opam/loader.py
index a2bb808..8fb4482 100644
--- a/swh/loader/package/opam/loader.py
+++ b/swh/loader/package/opam/loader.py
@@ -1,261 +1,265 @@
# Copyright (C) 2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import io
import os
from subprocess import PIPE, Popen, call
from typing import Iterator, List, Optional, Tuple
import attr
from swh.loader.package.loader import (
BasePackageInfo,
PackageLoader,
RawExtrinsicMetadataCore,
)
from swh.loader.package.utils import cached_method
from swh.model.model import (
MetadataAuthority,
MetadataAuthorityType,
ObjectType,
Person,
Release,
Sha1Git,
)
from swh.storage.interface import StorageInterface
@attr.s
class OpamPackageInfo(BasePackageInfo):
author = attr.ib(type=Person)
committer = attr.ib(type=Person)
def opam_read(
cmd: List[str], init_error_msg_if_any: Optional[str] = None
) -> Optional[str]:
"""This executes an opam command and returns the first line of the output.
Args:
cmd: Opam command to execute as a list of strings
init_error_msg_if_any: Error message to raise in case a problem occurs
during initialization
Raises:
ValueError with the init_error_msg_if_any content in case stdout is not
consumable and init_error_msg_if_any was provided with a non-empty value.
Returns:
the first line of the executed command output
"""
with Popen(cmd, stdout=PIPE) as proc:
if proc.stdout is not None:
for line in io.TextIOWrapper(proc.stdout):
# care only about the first output line (mostly blank-separated
# values; callers will deal with parsing the line)
return line
elif init_error_msg_if_any:
raise ValueError(init_error_msg_if_any)
return None
class OpamLoader(PackageLoader[OpamPackageInfo]):
"""Load all versions of a given package in a given opam repository.
The state of the opam repository is stored in a directory called an opam root. This
folder is required for the opam binary to actually list information on packages.
When initialize_opam_root is False (the default for production workers), the opam
root must already have been configured outside of the loading process. If not, an
error is raised, thus failing the loading.
For standalone workers, initialize_opam_root must be set to True, so the ingestion
can take care of installing the required opam root properly.
The rest of the ingestion uses the opam binary to list the versions of the given
package. Then, for each version, the loader uses the opam binary to determine the
tarball url to fetch and ingest.
"""
visit_type = "opam"
def __init__(
self,
storage: StorageInterface,
url: str,
opam_root: str,
opam_instance: str,
opam_url: str,
opam_package: str,
max_content_size: Optional[int] = None,
initialize_opam_root: bool = False,
):
super().__init__(storage=storage, url=url, max_content_size=max_content_size)
self.opam_root = opam_root
self.opam_instance = opam_instance
self.opam_url = opam_url
self.opam_package = opam_package
self.initialize_opam_root = initialize_opam_root
def get_package_dir(self) -> str:
return (
f"{self.opam_root}/repo/{self.opam_instance}/packages/{self.opam_package}"
)
def get_package_name(self, version: str) -> str:
return f"{self.opam_package}.{version}"
def get_package_file(self, version: str) -> str:
return f"{self.get_package_dir()}/{self.get_package_name(version)}/opam"
def get_metadata_authority(self):
return MetadataAuthority(type=MetadataAuthorityType.FORGE, url=self.opam_url)
@cached_method
def _compute_versions(self) -> List[str]:
"""Compute the versions using opam internals
Raises:
ValueError in case the loader is not able to determine the list of versions
Returns:
The list of versions for the package
"""
# TODO: use `opam show` instead of this workaround when it supports the `--repo`
# flag
package_dir = self.get_package_dir()
if not os.path.exists(package_dir):
raise ValueError(
f"can't get versions for package {self.opam_package} "
f"(at url {self.url})."
)
versions = [
".".join(version.split(".")[1:]) for version in os.listdir(package_dir)
]
if not versions:
raise ValueError(
f"can't get versions for package {self.opam_package} "
f"(at url {self.url})"
)
versions.sort()
return versions
def get_versions(self) -> List[str]:
"""First initialize the opam root directory if needed then start listing the
package versions.
Raises:
ValueError in case the loader is not able to determine the list of
versions or if the opam root directory is invalid.
"""
if self.initialize_opam_root:
# for standalone loaders (e.g. docker), the loader must initialize the opam root
# folder
call(
[
"opam",
"init",
"--reinit",
"--bare",
"--no-setup",
"--root",
self.opam_root,
self.opam_instance,
self.opam_url,
]
)
else:
# for standard/production loaders, there is no need to initialize the opam root
# folder. It must be present though, so check for it and raise if it is missing
if not os.path.isfile(os.path.join(self.opam_root, "config")):
# so if not correctly set up, raise immediately
raise ValueError("Invalid opam root")
return self._compute_versions()
def get_default_version(self) -> str:
"""Return the most recent version of the package as default."""
return self._compute_versions()[-1]
def _opam_show_args(self, version: str):
package_file = self.get_package_file(version)
return [
"opam",
"show",
"--color",
"never",
"--safe",
"--normalise",
"--root",
self.opam_root,
"--file",
package_file,
]
def get_enclosed_single_line_field(self, field, version) -> Optional[str]:
result = opam_read(self._opam_show_args(version) + ["--field", field])
# Sanitize the result if any (remove trailing \n and enclosing ")
return result.strip().strip('"') if result else None
def get_package_info(self, version: str) -> Iterator[Tuple[str, OpamPackageInfo]]:
url = self.get_enclosed_single_line_field("url.src:", version)
if url is None:
raise ValueError(
- f"can't get field url.src: for version {version} of package {self.opam_package} \
- (at url {self.url}) from `opam show`"
+ f"can't get field url.src: for version {version} of package {self.opam_package}"
+ f" (at url {self.url}) from `opam show`"
)
authors_field = self.get_enclosed_single_line_field("authors:", version)
fullname = b"" if authors_field is None else str.encode(authors_field)
author = Person.from_fullname(fullname)
maintainer_field = self.get_enclosed_single_line_field("maintainer:", version)
fullname = b"" if maintainer_field is None else str.encode(maintainer_field)
committer = Person.from_fullname(fullname)
with Popen(self._opam_show_args(version) + ["--raw"], stdout=PIPE) as proc:
assert proc.stdout is not None
metadata = proc.stdout.read()
yield self.get_package_name(version), OpamPackageInfo(
url=url,
filename=None,
author=author,
committer=committer,
version=version,
directory_extrinsic_metadata=[
RawExtrinsicMetadataCore(
- metadata=metadata, format="opam-package-definition",
+ metadata=metadata,
+ format="opam-package-definition",
)
],
)
def build_release(
- self, p_info: OpamPackageInfo, uncompressed_path: str, directory: Sha1Git,
+ self,
+ p_info: OpamPackageInfo,
+ uncompressed_path: str,
+ directory: Sha1Git,
) -> Optional[Release]:
msg = (
f"Synthetic release for OPAM source package {self.opam_package} "
f"version {p_info.version}\n"
)
return Release(
name=p_info.version.encode(),
author=p_info.author,
message=msg.encode(),
date=None,
target=directory,
target_type=ObjectType.DIRECTORY,
synthetic=True,
)
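A minimal driving sketch for the loader defined above, assuming a memory storage backend, a throwaway opam root path and the opam.ocaml.org repository URLs (all illustrative choices; only the constructor signature comes from the code):
from swh.loader.package.opam.loader import OpamLoader
from swh.storage import get_storage

# Standalone-worker sketch; storage backend, paths and URLs are assumptions.
storage = get_storage("memory")
loader = OpamLoader(
    storage,
    url="opam+https://opam.ocaml.org/packages/ocb",  # origin URL, built like in the tests below
    opam_root="/tmp/swh-opam-root",  # opam state directory used by the opam binary
    opam_instance="opam.ocaml.org",  # name under which the repository is registered
    opam_url="https://opam.ocaml.org",  # repository the instance points to
    opam_package="ocb",
    initialize_opam_root=True,  # standalone worker: get_versions() runs `opam init`
)
result = loader.load()  # e.g. {"status": "eventful", "snapshot_id": "..."}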
diff --git a/swh/loader/package/opam/tests/test_opam.py b/swh/loader/package/opam/tests/test_opam.py
index b37d971..1ab1cdc 100644
--- a/swh/loader/package/opam/tests/test_opam.py
+++ b/swh/loader/package/opam/tests/test_opam.py
@@ -1,394 +1,414 @@
# Copyright (C) 2019-2022 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
from os.path import exists
import shutil
import pytest
from swh.loader.package import __version__
from swh.loader.package.loader import RawExtrinsicMetadataCore
from swh.loader.package.opam.loader import OpamLoader, OpamPackageInfo
from swh.loader.tests import assert_last_visit_matches, check_snapshot, get_stats
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
Person,
RawExtrinsicMetadata,
Release,
Snapshot,
SnapshotBranch,
TargetType,
)
from swh.model.model import MetadataAuthority, MetadataAuthorityType, MetadataFetcher
from swh.model.model import ObjectType as ModelObjectType
from swh.model.swhids import CoreSWHID, ExtendedObjectType, ExtendedSWHID, ObjectType
from swh.storage.interface import PagedResult
OCB_METADATA = b"""\
opam-version: "2.0"
name: "ocb"
version: "0.1"
synopsis: "SVG badge generator"
description:
"An OCaml library for SVG badge generation. There\'s also a command-line tool provided."
maintainer: "OCamlPro <contact@ocamlpro.com>"
authors: "OCamlPro <contact@ocamlpro.com>"
license: "ISC"
homepage: "https://ocamlpro.github.io/ocb/"
doc: "https://ocamlpro.github.io/ocb/api/"
bug-reports: "https://github.com/OCamlPro/ocb/issues"
depends: [
"ocaml" {>= "4.05"}
"dune" {>= "2.0"}
"odoc" {with-doc}
]
build: [
["dune" "subst"] {dev}
[
"dune"
"build"
"-p"
name
"-j"
jobs
"@install"
"@runtest" {with-test}
"@doc" {with-doc}
]
]
dev-repo: "git+https://github.com/OCamlPro/ocb.git"
url {
src: "https://github.com/OCamlPro/ocb/archive/0.1.tar.gz"
checksum: [
"sha256=aa27684fbda1b8036ae7e3c87de33a98a9cd2662bcc91c8447e00e41476b6a46"
"sha512=1260344f184dd8c8074b0439dbcc8a5d59550a654c249cd61913d4c150c664f37b76195ddca38f7f6646d08bddb320ceb8d420508450b4f09a233cd5c22e6b9b"
]
}
""" # noqa
@pytest.fixture
def fake_opam_root(mocker, tmpdir, datadir):
"""Fixture to initialize the actual opam in test context. It mocks the actual opam init
calls and installs a fake opam root out of the one present in datadir.
"""
# inhibits the real `subprocess.call` which prepares the required internal opam
# state
module_name = "swh.loader.package.opam.loader"
mock_init = mocker.patch(f"{module_name}.call", return_value=None)
# Installs the fake opam root for the tests to use
fake_opam_root_src = f"{datadir}/fake_opam_repo"
fake_opam_root_dst = f"{tmpdir}/opam"
# copytree() on Python < 3.8 does not support dirs_exist_ok...
# TypeError: copytree() got an unexpected keyword argument 'dirs_exist_ok'
# see: https://docs.python.org/3.7/library/shutil.html
if exists(fake_opam_root_dst):
shutil.rmtree(fake_opam_root_dst)
shutil.copytree(fake_opam_root_src, fake_opam_root_dst)
yield fake_opam_root_dst
# loaders are initialized with `initialize_opam_root=True` so this should be called
assert mock_init.called, "This should be called when loader use this fixture"
def test_opam_loader_no_opam_repository_fails(swh_storage, tmpdir, datadir):
"""Running opam loader without a prepared opam repository fails"""
opam_url = f"file://{datadir}/fake_opam_repo"
opam_root = tmpdir
opam_instance = "loadertest"
opam_package = "agrid"
url = f"opam+{opam_url}/packages/{opam_package}"
loader = OpamLoader(
swh_storage,
url,
opam_root,
opam_instance,
opam_url,
opam_package,
initialize_opam_root=False, # The opam directory must be present and no init...
)
# The loader does not initialize the opam root directory. So, as the opam root does
# not exist, the loading fails. That's the expected behavior for production workers
# (whose opam_root maintenance is managed externally).
actual_load_status = loader.load()
assert actual_load_status == {"status": "failed"}
def test_opam_loader_one_version(
tmpdir, requests_mock_datadir, fake_opam_root, datadir, swh_storage
):
opam_url = f"file://{datadir}/fake_opam_repo"
opam_root = fake_opam_root
opam_instance = "loadertest"
opam_package = "agrid"
url = f"opam+{opam_url}/packages/{opam_package}"
loader = OpamLoader(
swh_storage,
url,
opam_root,
opam_instance,
opam_url,
opam_package,
initialize_opam_root=True, # go through the initialization while mocking it
)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("e1159446b00745ba4daa7ee26d74fbd81ecc081c")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="opam", snapshot=expected_snapshot_id
)
release_id = hash_to_bytes("d4d8d3df4f34609a3eeabd48aea49002c5f54f41")
expected_snapshot = Snapshot(
id=expected_snapshot_id,
branches={
- b"HEAD": SnapshotBranch(target=b"agrid.0.1", target_type=TargetType.ALIAS,),
+ b"HEAD": SnapshotBranch(
+ target=b"agrid.0.1",
+ target_type=TargetType.ALIAS,
+ ),
b"agrid.0.1": SnapshotBranch(
- target=release_id, target_type=TargetType.RELEASE,
+ target=release_id,
+ target_type=TargetType.RELEASE,
),
},
)
check_snapshot(expected_snapshot, swh_storage)
assert swh_storage.release_get([release_id])[0] == Release(
name=b"0.1",
message=b"Synthetic release for OPAM source package agrid version 0.1\n",
target=hash_to_bytes("00412ee5bc601deb462e55addd1004715116785e"),
target_type=ModelObjectType.DIRECTORY,
synthetic=True,
author=Person.from_fullname(b"OCamlPro <contact@ocamlpro.com>"),
date=None,
id=release_id,
)
stats = get_stats(swh_storage)
assert {
"content": 18,
"directory": 8,
"origin": 1,
"origin_visit": 1,
"release": 1,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_opam_loader_many_version(
tmpdir, requests_mock_datadir, fake_opam_root, datadir, swh_storage
):
opam_url = f"file://{datadir}/fake_opam_repo"
opam_root = fake_opam_root
opam_instance = "loadertest"
opam_package = "directories"
url = f"opam+{opam_url}/packages/{opam_package}"
loader = OpamLoader(
swh_storage,
url,
opam_root,
opam_instance,
opam_url,
opam_package,
initialize_opam_root=True,
)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("f498f7f3b0edbce5cf5834b487a4f8360f6a6a43")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
expected_snapshot = Snapshot(
id=expected_snapshot_id,
branches={
b"HEAD": SnapshotBranch(
- target=b"directories.0.3", target_type=TargetType.ALIAS,
+ target=b"directories.0.3",
+ target_type=TargetType.ALIAS,
),
b"directories.0.1": SnapshotBranch(
target=hash_to_bytes("1c88d466b3d57a619e296999322d096fa37bb1c2"),
target_type=TargetType.RELEASE,
),
b"directories.0.2": SnapshotBranch(
target=hash_to_bytes("d6f30684039ad485511a138e2ae504ff67a13075"),
target_type=TargetType.RELEASE,
),
b"directories.0.3": SnapshotBranch(
target=hash_to_bytes("6cf92c0ff052074e69ac18809a9c8198bcc2e746"),
target_type=TargetType.RELEASE,
),
},
)
assert_last_visit_matches(
swh_storage, url, status="full", type="opam", snapshot=expected_snapshot_id
)
check_snapshot(expected_snapshot, swh_storage)
def test_opam_release(
tmpdir, requests_mock_datadir, fake_opam_root, swh_storage, datadir
):
opam_url = f"file://{datadir}/fake_opam_repo"
opam_root = fake_opam_root
opam_instance = "loadertest"
opam_package = "ocb"
url = f"opam+{opam_url}/packages/{opam_package}"
loader = OpamLoader(
swh_storage,
url,
opam_root,
opam_instance,
opam_url,
opam_package,
initialize_opam_root=True,
)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("8ba39f050243a72ca667c5587a87413240cbaa47")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
info_iter = loader.get_package_info("0.1")
branch_name, package_info = next(info_iter)
expected_branch_name = "ocb.0.1"
expected_package_info = OpamPackageInfo(
url="https://github.com/OCamlPro/ocb/archive/0.1.tar.gz",
filename=None,
author=Person.from_fullname(b"OCamlPro <contact@ocamlpro.com>"),
committer=Person.from_fullname(b"OCamlPro <contact@ocamlpro.com>"),
version="0.1",
directory_extrinsic_metadata=[
RawExtrinsicMetadataCore(
- metadata=OCB_METADATA, format="opam-package-definition",
+ metadata=OCB_METADATA,
+ format="opam-package-definition",
)
],
)
assert branch_name == expected_branch_name
assert package_info == expected_package_info
release_id = hash_to_bytes("c231e541eb29c712635ada394b04127ac69e9fb0")
expected_snapshot = Snapshot(
id=hash_to_bytes(actual_load_status["snapshot_id"]),
branches={
- b"HEAD": SnapshotBranch(target=b"ocb.0.1", target_type=TargetType.ALIAS,),
+ b"HEAD": SnapshotBranch(
+ target=b"ocb.0.1",
+ target_type=TargetType.ALIAS,
+ ),
b"ocb.0.1": SnapshotBranch(
- target=release_id, target_type=TargetType.RELEASE,
+ target=release_id,
+ target_type=TargetType.RELEASE,
),
},
)
assert_last_visit_matches(
swh_storage, url, status="full", type="opam", snapshot=expected_snapshot.id
)
check_snapshot(expected_snapshot, swh_storage)
release = swh_storage.release_get([release_id])[0]
assert release is not None
assert release.author == expected_package_info.author
def test_opam_metadata(
tmpdir, requests_mock_datadir, fake_opam_root, swh_storage, datadir
):
opam_url = f"file://{datadir}/fake_opam_repo"
opam_root = fake_opam_root
opam_instance = "loadertest"
opam_package = "ocb"
url = f"opam+{opam_url}/packages/{opam_package}"
loader = OpamLoader(
swh_storage,
url,
opam_root,
opam_instance,
opam_url,
opam_package,
initialize_opam_root=True,
)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
expected_release_id = hash_to_bytes("c231e541eb29c712635ada394b04127ac69e9fb0")
expected_snapshot = Snapshot(
id=hash_to_bytes(actual_load_status["snapshot_id"]),
branches={
- b"HEAD": SnapshotBranch(target=b"ocb.0.1", target_type=TargetType.ALIAS,),
+ b"HEAD": SnapshotBranch(
+ target=b"ocb.0.1",
+ target_type=TargetType.ALIAS,
+ ),
b"ocb.0.1": SnapshotBranch(
- target=expected_release_id, target_type=TargetType.RELEASE,
+ target=expected_release_id,
+ target_type=TargetType.RELEASE,
),
},
)
assert_last_visit_matches(
swh_storage, url, status="full", type="opam", snapshot=expected_snapshot.id
)
check_snapshot(expected_snapshot, swh_storage)
release = swh_storage.release_get([expected_release_id])[0]
assert release is not None
release_swhid = CoreSWHID(
object_type=ObjectType.RELEASE, object_id=expected_release_id
)
directory_swhid = ExtendedSWHID(
object_type=ExtendedObjectType.DIRECTORY, object_id=release.target
)
metadata_authority = MetadataAuthority(
- type=MetadataAuthorityType.FORGE, url=opam_url,
+ type=MetadataAuthorityType.FORGE,
+ url=opam_url,
)
expected_metadata = [
RawExtrinsicMetadata(
target=directory_swhid,
authority=metadata_authority,
fetcher=MetadataFetcher(
- name="swh.loader.package.opam.loader.OpamLoader", version=__version__,
+ name="swh.loader.package.opam.loader.OpamLoader",
+ version=__version__,
),
discovery_date=loader.visit_date,
format="opam-package-definition",
metadata=OCB_METADATA,
origin=url,
release=release_swhid,
)
]
assert swh_storage.raw_extrinsic_metadata_get(
- directory_swhid, metadata_authority,
- ) == PagedResult(next_page_token=None, results=expected_metadata,)
+ directory_swhid,
+ metadata_authority,
+ ) == PagedResult(
+ next_page_token=None,
+ results=expected_metadata,
+ )
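For readability, a sketch of the `opam show` invocation that _opam_show_args() assembles for field extraction, assuming an illustrative opam root path:
# Arguments assembled by _opam_show_args(version) + ["--field", field].
args = [
    "opam", "show", "--color", "never", "--safe", "--normalise",
    "--root", "/tmp/swh-opam-root",
    "--file", "/tmp/swh-opam-root/repo/loadertest/packages/ocb/ocb.0.1/opam",
    "--field", "url.src:",
]
# opam_read(args) returns the first stdout line; get_enclosed_single_line_field()
# strips the trailing newline and enclosing quotes, yielding the tarball URL
# asserted in test_opam_release (https://github.com/OCamlPro/ocb/archive/0.1.tar.gz).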
diff --git a/swh/loader/package/pypi/loader.py b/swh/loader/package/pypi/loader.py
index 3ceca15..d50e19d 100644
--- a/swh/loader/package/pypi/loader.py
+++ b/swh/loader/package/pypi/loader.py
@@ -1,251 +1,248 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import json
import logging
import os
from typing import Any, Dict, Iterator, Optional, Sequence, Tuple
from urllib.parse import urlparse
import attr
from pkginfo import UnpackedSDist
from swh.loader.package.loader import (
BasePackageInfo,
PackageLoader,
PartialExtID,
RawExtrinsicMetadataCore,
)
from swh.loader.package.utils import EMPTY_AUTHOR, api_info, cached_method, release_name
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
MetadataAuthority,
MetadataAuthorityType,
ObjectType,
Person,
Release,
Sha1Git,
TimestampWithTimezone,
)
from swh.storage.interface import StorageInterface
logger = logging.getLogger(__name__)
EXTID_TYPE = "pypi-archive-sha256"
EXTID_VERSION = 0
@attr.s
class PyPIPackageInfo(BasePackageInfo):
raw_info = attr.ib(type=Dict[str, Any])
name = attr.ib(type=str)
comment_text = attr.ib(type=Optional[str])
sha256 = attr.ib(type=str)
upload_time = attr.ib(type=str)
@classmethod
def from_metadata(
cls, metadata: Dict[str, Any], name: str, version: str
) -> "PyPIPackageInfo":
return cls(
url=metadata["url"],
filename=metadata["filename"],
version=version,
raw_info=metadata,
name=name,
comment_text=metadata.get("comment_text"),
sha256=metadata["digests"]["sha256"],
upload_time=metadata["upload_time"],
directory_extrinsic_metadata=[
RawExtrinsicMetadataCore(
- format="pypi-project-json", metadata=json.dumps(metadata).encode(),
+ format="pypi-project-json",
+ metadata=json.dumps(metadata).encode(),
)
],
)
def extid(self) -> PartialExtID:
return (EXTID_TYPE, EXTID_VERSION, hash_to_bytes(self.sha256))
class PyPILoader(PackageLoader[PyPIPackageInfo]):
- """Load pypi origin's artifact releases into swh archive.
-
- """
+ """Load pypi origin's artifact releases into swh archive."""
visit_type = "pypi"
def __init__(
self,
storage: StorageInterface,
url: str,
max_content_size: Optional[int] = None,
):
super().__init__(storage=storage, url=url, max_content_size=max_content_size)
self.provider_url = pypi_api_url(self.url)
@cached_method
def _raw_info(self) -> bytes:
return api_info(self.provider_url)
@cached_method
def info(self) -> Dict:
- """Return the project metadata information (fetched from pypi registry)
-
- """
+ """Return the project metadata information (fetched from pypi registry)"""
return json.loads(self._raw_info())
def get_versions(self) -> Sequence[str]:
return self.info()["releases"].keys()
def get_default_version(self) -> str:
return self.info()["info"]["version"]
def get_metadata_authority(self):
p_url = urlparse(self.url)
return MetadataAuthority(
type=MetadataAuthorityType.FORGE,
url=f"{p_url.scheme}://{p_url.netloc}/",
metadata={},
)
def get_package_info(self, version: str) -> Iterator[Tuple[str, PyPIPackageInfo]]:
res = []
for meta in self.info()["releases"][version]:
# process only standard sdist archives
if meta["packagetype"] != "sdist" or meta["filename"].lower().endswith(
(".deb", ".egg", ".rpm", ".whl")
):
continue
p_info = PyPIPackageInfo.from_metadata(
meta, name=self.info()["info"]["name"], version=version
)
res.append((version, p_info))
if len(res) == 1:
version, p_info = res[0]
yield release_name(version), p_info
else:
for version, p_info in res:
yield release_name(version, p_info.filename), p_info
def build_release(
self, p_info: PyPIPackageInfo, uncompressed_path: str, directory: Sha1Git
) -> Optional[Release]:
i_metadata = extract_intrinsic_metadata(uncompressed_path)
if not i_metadata:
return None
# from intrinsic metadata
version_ = i_metadata.get("version", p_info.version)
author_ = author(i_metadata)
if p_info.comment_text:
msg = p_info.comment_text
else:
msg = (
f"Synthetic release for PyPI source package {p_info.name} "
f"version {version_}\n"
)
date = TimestampWithTimezone.from_iso8601(p_info.upload_time)
return Release(
name=p_info.version.encode(),
message=msg.encode(),
author=author_,
date=date,
target=directory,
target_type=ObjectType.DIRECTORY,
synthetic=True,
)
def pypi_api_url(url: str) -> str:
"""Compute api url from a project url
Args:
url (str): PyPI instance's url (e.g. https://pypi.org/project/requests)
This deals with correctly transforming the project url into its api url (e.g.
https://pypi.org/pypi/requests/json)
Returns:
api url
"""
p_url = urlparse(url)
project_name = p_url.path.rstrip("/").split("/")[-1]
url = "%s://%s/pypi/%s/json" % (p_url.scheme, p_url.netloc, project_name)
return url
def extract_intrinsic_metadata(dir_path: str) -> Dict:
"""Given an uncompressed path holding the pkginfo file, returns a
pkginfo parsed structure as a dict.
The release artifact contains one folder at its root. For example:
$ tar tvf zprint-0.0.6.tar.gz
drwxr-xr-x root/root 0 2018-08-22 11:01 zprint-0.0.6/
...
Args:
dir_path (str): Path to the uncompressed directory
representing a release artifact from pypi.
Returns:
the pkginfo parsed structure as a dict if any, or an empty dict if
none was present.
"""
# Retrieve the root folder of the archive
if not os.path.exists(dir_path):
return {}
lst = os.listdir(dir_path)
if len(lst) != 1:
return {}
project_dirname = lst[0]
pkginfo_path = os.path.join(dir_path, project_dirname, "PKG-INFO")
if not os.path.exists(pkginfo_path):
return {}
pkginfo = UnpackedSDist(pkginfo_path)
raw = pkginfo.__dict__
raw.pop("filename") # this gets added with the ondisk location
return raw
def author(data: Dict) -> Person:
"""Given a dict of project/release artifact information (coming from
PyPI), returns an author subset.
Args:
data (dict): Representing either artifact information or
release information.
Returns:
a swh-model Person object representing the author.
"""
name = data.get("author")
email = data.get("author_email")
fullname = None # type: Optional[str]
if email:
fullname = "%s <%s>" % (name, email)
else:
fullname = name
if not fullname:
return EMPTY_AUTHOR
if name is not None:
name = name.encode("utf-8")
if email is not None:
email = email.encode("utf-8")
return Person(fullname=fullname.encode("utf-8"), name=name, email=email)
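A minimal usage sketch of the two module-level helpers above, with input values mirroring the unit tests that follow:
from swh.loader.package.pypi.loader import author, pypi_api_url

# Project URL -> JSON API URL, as asserted by test_pypi_api_url below.
api_url = pypi_api_url("https://pypi.org/project/requests")
assert api_url == "https://pypi.org/pypi/requests/json"
# Name/email pair -> swh-model Person, as asserted by test_pypi_author_basic below.
person = author({"author": "i-am-groot", "author_email": "iam@groot.org"})
assert person.fullname == b"i-am-groot <iam@groot.org>"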
diff --git a/swh/loader/package/pypi/tests/test_pypi.py b/swh/loader/package/pypi/tests/test_pypi.py
index 7cb487a..971d275 100644
--- a/swh/loader/package/pypi/tests/test_pypi.py
+++ b/swh/loader/package/pypi/tests/test_pypi.py
@@ -1,806 +1,815 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import json
import os
from os import path
from unittest.mock import patch
import pytest
from swh.core.pytest_plugin import requests_mock_datadir_factory
from swh.core.tarball import uncompress
from swh.loader.package import __version__
from swh.loader.package.pypi.loader import (
PyPILoader,
PyPIPackageInfo,
author,
extract_intrinsic_metadata,
pypi_api_url,
)
from swh.loader.tests import assert_last_visit_matches, check_snapshot, get_stats
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
MetadataAuthority,
MetadataAuthorityType,
MetadataFetcher,
Person,
RawExtrinsicMetadata,
Snapshot,
SnapshotBranch,
TargetType,
)
from swh.model.swhids import CoreSWHID, ExtendedObjectType, ExtendedSWHID, ObjectType
from swh.storage.interface import PagedResult
@pytest.fixture
def _0805nexter_api_info(datadir) -> bytes:
with open(
- os.path.join(datadir, "https_pypi.org", "pypi_0805nexter_json"), "rb",
+ os.path.join(datadir, "https_pypi.org", "pypi_0805nexter_json"),
+ "rb",
) as f:
return f.read()
def test_pypi_author_basic():
data = {
"author": "i-am-groot",
"author_email": "iam@groot.org",
}
actual_author = author(data)
expected_author = Person(
fullname=b"i-am-groot <iam@groot.org>",
name=b"i-am-groot",
email=b"iam@groot.org",
)
assert actual_author == expected_author
def test_pypi_author_empty_email():
data = {
"author": "i-am-groot",
"author_email": "",
}
actual_author = author(data)
- expected_author = Person(fullname=b"i-am-groot", name=b"i-am-groot", email=b"",)
+ expected_author = Person(
+ fullname=b"i-am-groot",
+ name=b"i-am-groot",
+ email=b"",
+ )
assert actual_author == expected_author
def test_pypi_author_empty_name():
data = {
"author": "",
"author_email": "iam@groot.org",
}
actual_author = author(data)
expected_author = Person(
- fullname=b" <iam@groot.org>", name=b"", email=b"iam@groot.org",
+ fullname=b" <iam@groot.org>",
+ name=b"",
+ email=b"iam@groot.org",
)
assert actual_author == expected_author
def test_pypi_author_malformed():
data = {
"author": "['pierre', 'paul', 'jacques']",
"author_email": None,
}
actual_author = author(data)
expected_author = Person(
fullname=b"['pierre', 'paul', 'jacques']",
name=b"['pierre', 'paul', 'jacques']",
email=None,
)
assert actual_author == expected_author
def test_pypi_author_malformed_2():
data = {
"author": "[marie, jeanne]",
"author_email": "[marie@some, jeanne@thing]",
}
actual_author = author(data)
expected_author = Person(
fullname=b"[marie, jeanne] <[marie@some, jeanne@thing]>",
name=b"[marie, jeanne]",
email=b"[marie@some, jeanne@thing]",
)
assert actual_author == expected_author
def test_pypi_author_malformed_3():
data = {
"author": "[marie, jeanne, pierre]",
"author_email": "[marie@somewhere.org, jeanne@somewhere.org]",
}
actual_author = author(data)
expected_author = Person(
fullname=(
b"[marie, jeanne, pierre] " b"<[marie@somewhere.org, jeanne@somewhere.org]>"
),
name=b"[marie, jeanne, pierre]",
email=b"[marie@somewhere.org, jeanne@somewhere.org]",
)
assert actual_author == expected_author
# configuration error #
def test_pypi_api_url():
"""Compute pypi api url from the pypi project url should be ok"""
url = pypi_api_url("https://pypi.org/project/requests")
assert url == "https://pypi.org/pypi/requests/json"
def test_pypi_api_url_with_slash():
"""Compute pypi api url from the pypi project url should be ok"""
url = pypi_api_url("https://pypi.org/project/requests/")
assert url == "https://pypi.org/pypi/requests/json"
@pytest.mark.fs
def test_pypi_extract_intrinsic_metadata(tmp_path, datadir):
"""Parsing existing archive's PKG-INFO should yield results"""
uncompressed_archive_path = str(tmp_path)
archive_path = path.join(
datadir, "https_files.pythonhosted.org", "0805nexter-1.1.0.zip"
)
uncompress(archive_path, dest=uncompressed_archive_path)
actual_metadata = extract_intrinsic_metadata(uncompressed_archive_path)
expected_metadata = {
"metadata_version": "1.0",
"name": "0805nexter",
"version": "1.1.0",
"summary": "a simple printer of nested lest",
"home_page": "http://www.hp.com",
"author": "hgtkpython",
"author_email": "2868989685@qq.com",
"platforms": ["UNKNOWN"],
}
assert actual_metadata == expected_metadata
@pytest.mark.fs
def test_pypi_extract_intrinsic_metadata_failures(tmp_path):
"""Parsing inexistent path/archive/PKG-INFO yield None"""
tmp_path = str(tmp_path) # py3.5 work around (PosixPath issue)
# inexistent first level path
assert extract_intrinsic_metadata("/something-inexistent") == {}
# inexistent second level path (as expected by pypi archives)
assert extract_intrinsic_metadata(tmp_path) == {}
# inexistent PKG-INFO within second level path
existing_path_no_pkginfo = path.join(tmp_path, "something")
os.mkdir(existing_path_no_pkginfo)
assert extract_intrinsic_metadata(tmp_path) == {}
# LOADER SCENARIO #
# "edge" cases (for the same origin) #
# no release artifact:
# {visit full, status: uneventful, no contents, etc...}
requests_mock_datadir_missing_all = requests_mock_datadir_factory(
ignore_urls=[
"https://files.pythonhosted.org/packages/ec/65/c0116953c9a3f47de89e71964d6c7b0c783b01f29fa3390584dbf3046b4d/0805nexter-1.1.0.zip", # noqa
"https://files.pythonhosted.org/packages/c4/a0/4562cda161dc4ecbbe9e2a11eb365400c0461845c5be70d73869786809c4/0805nexter-1.2.0.zip", # noqa
]
)
def test_pypi_no_release_artifact(swh_storage, requests_mock_datadir_missing_all):
- """Load a pypi project with all artifacts missing ends up with no snapshot
-
- """
+ """Load a pypi project with all artifacts missing ends up with no snapshot"""
url = "https://pypi.org/project/0805nexter"
loader = PyPILoader(swh_storage, url)
actual_load_status = loader.load()
assert actual_load_status["status"] == "uneventful"
assert actual_load_status["snapshot_id"] is not None
empty_snapshot = Snapshot(branches={})
assert_last_visit_matches(
swh_storage, url, status="partial", type="pypi", snapshot=empty_snapshot.id
)
stats = get_stats(swh_storage)
assert {
"content": 0,
"directory": 0,
"origin": 1,
"origin_visit": 1,
"release": 0,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_pypi_fail__load_snapshot(swh_storage, requests_mock_datadir):
- """problem during loading: {visit: failed, status: failed, no snapshot}
-
- """
+ """problem during loading: {visit: failed, status: failed, no snapshot}"""
url = "https://pypi.org/project/0805nexter"
with patch(
"swh.loader.package.pypi.loader.PyPILoader._load_snapshot",
side_effect=ValueError("Fake problem to fail visit"),
):
loader = PyPILoader(swh_storage, url)
actual_load_status = loader.load()
assert actual_load_status == {"status": "failed"}
assert_last_visit_matches(swh_storage, url, status="failed", type="pypi")
stats = get_stats(loader.storage)
assert {
"content": 6,
"directory": 4,
"origin": 1,
"origin_visit": 1,
"release": 2,
"revision": 0,
"skipped_content": 0,
"snapshot": 0,
} == stats
# problem during loading:
# {visit: partial, status: uneventful, no snapshot}
def test_pypi_release_with_traceback(swh_storage, requests_mock_datadir):
url = "https://pypi.org/project/0805nexter"
with patch(
"swh.loader.package.pypi.loader.PyPILoader.last_snapshot",
side_effect=ValueError("Fake problem to fail the visit"),
):
loader = PyPILoader(swh_storage, url)
actual_load_status = loader.load()
assert actual_load_status == {"status": "failed"}
assert_last_visit_matches(swh_storage, url, status="failed", type="pypi")
stats = get_stats(swh_storage)
assert {
"content": 0,
"directory": 0,
"origin": 1,
"origin_visit": 1,
"release": 0,
"revision": 0,
"skipped_content": 0,
"snapshot": 0,
} == stats
# problem during loading: failure early enough in between swh contents...
# some contents (contents, directories, etc...) have been written in storage
# {visit: partial, status: eventful, no snapshot}
# problem during loading: failure late enough we can have snapshots (some
# revisions are written in storage already)
# {visit: partial, status: eventful, snapshot}
# "normal" cases (for the same origin) #
requests_mock_datadir_missing_one = requests_mock_datadir_factory(
ignore_urls=[
"https://files.pythonhosted.org/packages/ec/65/c0116953c9a3f47de89e71964d6c7b0c783b01f29fa3390584dbf3046b4d/0805nexter-1.1.0.zip", # noqa
]
)
# some missing release artifacts:
# {visit partial, status: eventful, 1 snapshot}
def test_pypi_release_metadata_structure(
swh_storage, requests_mock_datadir, _0805nexter_api_info
):
url = "https://pypi.org/project/0805nexter"
loader = PyPILoader(swh_storage, url)
actual_load_status = loader.load()
assert actual_load_status["status"] == "eventful"
assert actual_load_status["snapshot_id"] is not None
expected_release_id = hash_to_bytes("fbbcb817f01111b06442cdcc93140ab3cc777d68")
expected_snapshot = Snapshot(
id=hash_to_bytes(actual_load_status["snapshot_id"]),
branches={
b"HEAD": SnapshotBranch(
- target=b"releases/1.2.0", target_type=TargetType.ALIAS,
+ target=b"releases/1.2.0",
+ target_type=TargetType.ALIAS,
),
b"releases/1.1.0": SnapshotBranch(
target=hash_to_bytes("f8789ff3ed70a5f570c35d885c7bcfda7b23b091"),
target_type=TargetType.RELEASE,
),
b"releases/1.2.0": SnapshotBranch(
- target=expected_release_id, target_type=TargetType.RELEASE,
+ target=expected_release_id,
+ target_type=TargetType.RELEASE,
),
},
)
assert_last_visit_matches(
swh_storage, url, status="full", type="pypi", snapshot=expected_snapshot.id
)
check_snapshot(expected_snapshot, swh_storage)
release = swh_storage.release_get([expected_release_id])[0]
assert release is not None
release_swhid = CoreSWHID(
object_type=ObjectType.RELEASE, object_id=expected_release_id
)
directory_swhid = ExtendedSWHID(
object_type=ExtendedObjectType.DIRECTORY, object_id=release.target
)
metadata_authority = MetadataAuthority(
- type=MetadataAuthorityType.FORGE, url="https://pypi.org/",
+ type=MetadataAuthorityType.FORGE,
+ url="https://pypi.org/",
)
expected_metadata = [
RawExtrinsicMetadata(
target=directory_swhid,
authority=metadata_authority,
fetcher=MetadataFetcher(
- name="swh.loader.package.pypi.loader.PyPILoader", version=__version__,
+ name="swh.loader.package.pypi.loader.PyPILoader",
+ version=__version__,
),
discovery_date=loader.visit_date,
format="pypi-project-json",
metadata=json.dumps(
json.loads(_0805nexter_api_info)["releases"]["1.2.0"][0]
).encode(),
origin=url,
release=release_swhid,
)
]
assert swh_storage.raw_extrinsic_metadata_get(
- directory_swhid, metadata_authority,
- ) == PagedResult(next_page_token=None, results=expected_metadata,)
+ directory_swhid,
+ metadata_authority,
+ ) == PagedResult(
+ next_page_token=None,
+ results=expected_metadata,
+ )
def test_pypi_visit_with_missing_artifact(
swh_storage, requests_mock_datadir_missing_one
):
- """Load a pypi project with some missing artifacts ends up with 1 snapshot
-
- """
+ """Load a pypi project with some missing artifacts ends up with 1 snapshot"""
url = "https://pypi.org/project/0805nexter"
loader = PyPILoader(swh_storage, url)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("00785a38479abe5fbfa402df96be26d2ddf89c97")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
- swh_storage, url, status="partial", type="pypi", snapshot=expected_snapshot_id,
+ swh_storage,
+ url,
+ status="partial",
+ type="pypi",
+ snapshot=expected_snapshot_id,
)
expected_snapshot = Snapshot(
id=hash_to_bytes(expected_snapshot_id),
branches={
b"releases/1.2.0": SnapshotBranch(
target=hash_to_bytes("fbbcb817f01111b06442cdcc93140ab3cc777d68"),
target_type=TargetType.RELEASE,
),
b"HEAD": SnapshotBranch(
- target=b"releases/1.2.0", target_type=TargetType.ALIAS,
+ target=b"releases/1.2.0",
+ target_type=TargetType.ALIAS,
),
},
)
check_snapshot(expected_snapshot, storage=swh_storage)
stats = get_stats(swh_storage)
assert {
"content": 3,
"directory": 2,
"origin": 1,
"origin_visit": 1,
"release": 1,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_pypi_visit_with_1_release_artifact(swh_storage, requests_mock_datadir):
- """With no prior visit, load a pypi project ends up with 1 snapshot
-
- """
+ """With no prior visit, load a pypi project ends up with 1 snapshot"""
url = "https://pypi.org/project/0805nexter"
loader = PyPILoader(swh_storage, url)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("3dd50c1a0e48a7625cf1427e3190a65b787c774e")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="pypi", snapshot=expected_snapshot_id
)
expected_snapshot = Snapshot(
id=expected_snapshot_id,
branches={
b"releases/1.1.0": SnapshotBranch(
target=hash_to_bytes("f8789ff3ed70a5f570c35d885c7bcfda7b23b091"),
target_type=TargetType.RELEASE,
),
b"releases/1.2.0": SnapshotBranch(
target=hash_to_bytes("fbbcb817f01111b06442cdcc93140ab3cc777d68"),
target_type=TargetType.RELEASE,
),
b"HEAD": SnapshotBranch(
- target=b"releases/1.2.0", target_type=TargetType.ALIAS,
+ target=b"releases/1.2.0",
+ target_type=TargetType.ALIAS,
),
},
)
check_snapshot(expected_snapshot, swh_storage)
stats = get_stats(swh_storage)
assert {
"content": 6,
"directory": 4,
"origin": 1,
"origin_visit": 1,
"release": 2,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
def test_pypi_multiple_visits_with_no_change(swh_storage, requests_mock_datadir):
- """Multiple visits with no changes results in 1 same snapshot
-
- """
+ """Multiple visits with no changes results in 1 same snapshot"""
url = "https://pypi.org/project/0805nexter"
loader = PyPILoader(swh_storage, url)
actual_load_status = loader.load()
snapshot_id = hash_to_bytes("3dd50c1a0e48a7625cf1427e3190a65b787c774e")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="pypi", snapshot=snapshot_id
)
expected_snapshot = Snapshot(
id=snapshot_id,
branches={
b"releases/1.1.0": SnapshotBranch(
target=hash_to_bytes("f8789ff3ed70a5f570c35d885c7bcfda7b23b091"),
target_type=TargetType.RELEASE,
),
b"releases/1.2.0": SnapshotBranch(
target=hash_to_bytes("fbbcb817f01111b06442cdcc93140ab3cc777d68"),
target_type=TargetType.RELEASE,
),
b"HEAD": SnapshotBranch(
- target=b"releases/1.2.0", target_type=TargetType.ALIAS,
+ target=b"releases/1.2.0",
+ target_type=TargetType.ALIAS,
),
},
)
check_snapshot(expected_snapshot, swh_storage)
stats = get_stats(swh_storage)
assert {
"content": 6,
"directory": 4,
"origin": 1,
"origin_visit": 1,
"release": 2,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == stats
actual_load_status2 = loader.load()
assert actual_load_status2 == {
"status": "uneventful",
"snapshot_id": actual_load_status2["snapshot_id"],
}
visit_status2 = assert_last_visit_matches(
swh_storage, url, status="full", type="pypi"
)
stats2 = get_stats(swh_storage)
expected_stats2 = stats.copy()
expected_stats2["origin_visit"] = 1 + 1
assert expected_stats2 == stats2
# same snapshot
assert visit_status2.snapshot == snapshot_id
def test_pypi_incremental_visit(swh_storage, requests_mock_datadir_visits):
- """With prior visit, 2nd load will result with a different snapshot
-
- """
+ """With prior visit, 2nd load will result with a different snapshot"""
url = "https://pypi.org/project/0805nexter"
loader = PyPILoader(swh_storage, url)
visit1_actual_load_status = loader.load()
visit1_stats = get_stats(swh_storage)
expected_snapshot_id = hash_to_bytes("3dd50c1a0e48a7625cf1427e3190a65b787c774e")
assert visit1_actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="pypi", snapshot=expected_snapshot_id
)
assert {
"content": 6,
"directory": 4,
"origin": 1,
"origin_visit": 1,
"release": 2,
"revision": 0,
"skipped_content": 0,
"snapshot": 1,
} == visit1_stats
# Reset internal state
del loader._cached__raw_info
del loader._cached_info
visit2_actual_load_status = loader.load()
visit2_stats = get_stats(swh_storage)
assert visit2_actual_load_status["status"] == "eventful", visit2_actual_load_status
expected_snapshot_id2 = hash_to_bytes("77febe6ff0faf6cc00dd015a6c9763579a9fb6c7")
assert visit2_actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id2.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="pypi", snapshot=expected_snapshot_id2
)
expected_snapshot = Snapshot(
id=expected_snapshot_id2,
branches={
b"releases/1.1.0": SnapshotBranch(
target=hash_to_bytes("f8789ff3ed70a5f570c35d885c7bcfda7b23b091"),
target_type=TargetType.RELEASE,
),
b"releases/1.2.0": SnapshotBranch(
target=hash_to_bytes("fbbcb817f01111b06442cdcc93140ab3cc777d68"),
target_type=TargetType.RELEASE,
),
b"releases/1.3.0": SnapshotBranch(
target=hash_to_bytes("a21b09cbec8e31f47307f196bb1f939effc26e11"),
target_type=TargetType.RELEASE,
),
b"HEAD": SnapshotBranch(
- target=b"releases/1.3.0", target_type=TargetType.ALIAS,
+ target=b"releases/1.3.0",
+ target_type=TargetType.ALIAS,
),
},
)
assert_last_visit_matches(
swh_storage, url, status="full", type="pypi", snapshot=expected_snapshot.id
)
check_snapshot(expected_snapshot, swh_storage)
assert {
"content": 6 + 1, # 1 more content
"directory": 4 + 2, # 2 more directories
"origin": 1,
"origin_visit": 1 + 1,
"release": 2 + 1, # 1 more release
"revision": 0,
"skipped_content": 0,
"snapshot": 1 + 1, # 1 more snapshot
} == visit2_stats
urls = [
m.url
for m in requests_mock_datadir_visits.request_history
if m.url.startswith("https://files.pythonhosted.org")
]
# visited each artifact once across 2 visits
assert len(urls) == len(set(urls))
# release artifact, no new artifact
# {visit full, status uneventful, same snapshot as before}
# release artifact, old artifact with different checksums
# {visit full, status full, new snapshot with shared history and some new
# different history}
# release with multiple sdist artifacts per pypi "version"
# snapshot branch output is different
def test_pypi_visit_1_release_with_2_artifacts(swh_storage, requests_mock_datadir):
- """With no prior visit, load a pypi project ends up with 1 snapshot
-
- """
+ """With no prior visit, load a pypi project ends up with 1 snapshot"""
url = "https://pypi.org/project/nexter"
loader = PyPILoader(swh_storage, url)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("1394b2e59351a944cc763bd9d26d90ce8e8121a8")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
assert_last_visit_matches(
swh_storage, url, status="full", type="pypi", snapshot=expected_snapshot_id
)
expected_snapshot = Snapshot(
id=expected_snapshot_id,
branches={
b"releases/1.1.0/nexter-1.1.0.zip": SnapshotBranch(
target=hash_to_bytes("f7d43faeb65b64d3faa67e4f46559db57d26b9a4"),
target_type=TargetType.RELEASE,
),
b"releases/1.1.0/nexter-1.1.0.tar.gz": SnapshotBranch(
target=hash_to_bytes("732bb9dc087e6015884daaebb8b82559be729b5a"),
target_type=TargetType.RELEASE,
),
},
)
check_snapshot(expected_snapshot, swh_storage)
def test_pypi_artifact_with_no_intrinsic_metadata(swh_storage, requests_mock_datadir):
- """Skip artifact with no intrinsic metadata during ingestion
-
- """
+ """Skip artifact with no intrinsic metadata during ingestion"""
url = "https://pypi.org/project/upymenu"
loader = PyPILoader(swh_storage, url)
actual_load_status = loader.load()
expected_snapshot_id = hash_to_bytes("1a8893e6a86f444e8be8e7bda6cb34fb1735a00e")
assert actual_load_status == {
"status": "eventful",
"snapshot_id": expected_snapshot_id.hex(),
}
# no branch as one artifact without any intrinsic metadata
expected_snapshot = Snapshot(id=expected_snapshot_id, branches={})
assert_last_visit_matches(
swh_storage, url, status="full", type="pypi", snapshot=expected_snapshot.id
)
check_snapshot(expected_snapshot, swh_storage)
def test_pypi_origin_not_found(swh_storage, requests_mock_datadir):
url = "https://pypi.org/project/unknown"
loader = PyPILoader(swh_storage, url)
assert loader.load() == {"status": "failed"}
assert_last_visit_matches(
swh_storage, url, status="not_found", type="pypi", snapshot=None
)
def test_pypi_build_release_missing_version_in_pkg_info(swh_storage, tmp_path):
"""Simulate release build when Version field is missing in PKG-INFO file."""
url = "https://pypi.org/project/GermlineFilter"
# create package info
p_info = PyPIPackageInfo(
url=url,
filename="GermlineFilter-1.2.tar.gz",
version="1.2",
name="GermlineFilter",
directory_extrinsic_metadata=[],
raw_info={},
comment_text="",
sha256="e4982353c544d94b34f02c5690ab3d3ebc93480d5b62fe6f3317f23c515acc05",
upload_time="2015-02-18T20:39:13",
)
# create PKG-INFO file with missing Version field
package_path = tmp_path / "GermlineFilter-1.2"
pkg_info_path = package_path / "PKG-INFO"
package_path.mkdir()
pkg_info_path.write_text(
"""Metadata-Version: 1.2
Name: germline_filter
Home-page:
Author: Cristian Caloian (OICR)
Author-email: cristian.caloian@oicr.on.ca
License: UNKNOWN
Description: UNKNOWN
Platform: UNKNOWN"""
)
directory = hash_to_bytes("8b864d66f356afe35033d58f8e03b7c23a66751f")
# attempt to build release
loader = PyPILoader(swh_storage, url)
release = loader.build_release(p_info, str(tmp_path), directory)
# without comment_text and without Version in PKG-INFO, the synthetic message falls back to p_info.version
assert (
release.message
== b"Synthetic release for PyPI source package GermlineFilter version 1.2\n"
)
def test_filter_out_invalid_sdists(swh_storage, requests_mock):
project_name = "swh-test-sdist-filtering"
version = "1.0.0"
url = f"https://pypi.org/project/{project_name}"
json_url = f"https://pypi.org/pypi/{project_name}/json"
common_sdist_entries = {
"url": "",
"comment_text": "",
"digests": {"sha256": ""},
"upload_time": "",
"packagetype": "sdist",
}
requests_mock.get(
json_url,
json={
- "info": {"name": project_name,},
+ "info": {
+ "name": project_name,
+ },
"releases": {
version: [
{
**common_sdist_entries,
"filename": f"{project_name}-{version}.{ext}",
}
for ext in ("tar.gz", "deb", "egg", "rpm", "whl")
]
},
},
)
loader = PyPILoader(swh_storage, url)
packages = list(loader.get_package_info(version=version))
assert len(packages) == 1
assert packages[0][1].filename.endswith(".tar.gz")
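For context, a sketch of how a PyPI artifact's ExtID is derived, following PyPIPackageInfo.extid() shown earlier; the sha256 value is borrowed from the GermlineFilter test above purely as an example:
from swh.model.hashutil import hash_to_bytes

sha256 = "e4982353c544d94b34f02c5690ab3d3ebc93480d5b62fe6f3317f23c515acc05"
extid = ("pypi-archive-sha256", 0, hash_to_bytes(sha256))  # (EXTID_TYPE, EXTID_VERSION, payload)
# On later visits, PackageLoader.resolve_object_from_extids() maps such tuples back to
# previously loaded releases, which is what the test_loader.py tests below exercise.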
diff --git a/swh/loader/package/tests/test_loader.py b/swh/loader/package/tests/test_loader.py
index 9851a22..be329c7 100644
--- a/swh/loader/package/tests/test_loader.py
+++ b/swh/loader/package/tests/test_loader.py
@@ -1,508 +1,518 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import hashlib
import logging
import string
from unittest.mock import Mock, call, patch
import attr
import pytest
from swh.loader.package.loader import BasePackageInfo, PackageLoader
from swh.model.model import (
Origin,
OriginVisit,
OriginVisitStatus,
Person,
Release,
Revision,
RevisionType,
Snapshot,
SnapshotBranch,
TargetType,
TimestampWithTimezone,
)
from swh.model.model import ExtID
from swh.model.model import ObjectType as ModelObjectType
from swh.model.swhids import CoreSWHID, ObjectType
from swh.storage import get_storage
from swh.storage.algos.snapshot import snapshot_get_latest
class FakeStorage:
def origin_add(self, origins):
raise ValueError("We refuse to add an origin")
def origin_visit_get_latest(self, origin):
return None
class FakeStorage2(FakeStorage):
def origin_add(self, origins):
pass
def origin_visit_add(self, visits):
raise ValueError("We refuse to add an origin visit")
class StubPackageInfo(BasePackageInfo):
pass
class StubPackageLoader(PackageLoader[StubPackageInfo]):
def get_versions(self):
return ["v1.0", "v2.0", "v3.0", "v4.0"]
def get_package_info(self, version):
p_info = StubPackageInfo(
"http://example.org", f"example-{version}.tar", version=version
)
extid_type = "extid-type1" if version in ("v1.0", "v2.0") else "extid-type2"
# Versions 1.0 and 2.0 have an extid of a given type, v3.0 and v4.0 have an extid
# of a different type
patch.object(
p_info,
"extid",
return_value=(extid_type, 0, f"extid-of-{version}".encode()),
autospec=True,
).start()
yield (f"branch-{version}", p_info)
def _load_release(self, p_info, origin):
return None
def test_loader_origin_visit_failure(swh_storage):
- """Failure to add origin or origin visit should failed immediately
-
- """
+ """Failure to add origin or origin visit should failed immediately"""
loader = PackageLoader(swh_storage, "some-url")
loader.storage = FakeStorage()
actual_load_status = loader.load()
assert actual_load_status == {"status": "failed"}
loader.storage = FakeStorage2()
actual_load_status2 = loader.load()
assert actual_load_status2 == {"status": "failed"}
def test_resolve_object_from_extids() -> None:
storage = get_storage("memory")
target = b"\x01" * 20
rel1 = Release(
name=b"aaaa",
message=b"aaaa",
target=target,
target_type=ModelObjectType.DIRECTORY,
synthetic=False,
)
rel2 = Release(
name=b"bbbb",
message=b"bbbb",
target=target,
target_type=ModelObjectType.DIRECTORY,
synthetic=False,
)
storage.release_add([rel1, rel2])
loader = PackageLoader(storage, "http://example.org/") # type: ignore
p_info = Mock(wraps=BasePackageInfo(None, None, None)) # type: ignore
# The PackageInfo does not support extids
p_info.extid.return_value = None
known_extids = {("extid-type", 0, b"extid-of-aaaa"): [rel1.swhid()]}
whitelist = {b"unused"}
assert loader.resolve_object_from_extids(known_extids, p_info, whitelist) is None
# Some known extid, and the PackageInfo is not one of them (ie. cache miss)
p_info.extid.return_value = ("extid-type", 0, b"extid-of-cccc")
assert loader.resolve_object_from_extids(known_extids, p_info, whitelist) is None
# Some known extid, and the PackageInfo is one of them (ie. cache hit),
# but the target release was not in the previous snapshot
p_info.extid.return_value = ("extid-type", 0, b"extid-of-aaaa")
assert loader.resolve_object_from_extids(known_extids, p_info, whitelist) is None
# Some known extid, and the PackageInfo is one of them (ie. cache hit),
# and the target release was in the previous snapshot
whitelist = {rel1.id}
assert (
loader.resolve_object_from_extids(known_extids, p_info, whitelist)
== rel1.swhid()
)
# Same as before, but there is more than one extid, and only one is an allowed
# release
whitelist = {rel1.id}
known_extids = {("extid-type", 0, b"extid-of-aaaa"): [rel2.swhid(), rel1.swhid()]}
assert (
loader.resolve_object_from_extids(known_extids, p_info, whitelist)
== rel1.swhid()
)
def test_resolve_object_from_extids_missing_target() -> None:
storage = get_storage("memory")
target = b"\x01" * 20
rel = Release(
name=b"aaaa",
message=b"aaaa",
target=target,
target_type=ModelObjectType.DIRECTORY,
synthetic=False,
)
loader = PackageLoader(storage, "http://example.org/") # type: ignore
p_info = Mock(wraps=BasePackageInfo(None, None, None)) # type: ignore
known_extids = {("extid-type", 0, b"extid-of-aaaa"): [rel.swhid()]}
p_info.extid.return_value = ("extid-type", 0, b"extid-of-aaaa")
whitelist = {rel.id}
# Targeted release is missing from the storage
assert loader.resolve_object_from_extids(known_extids, p_info, whitelist) is None
storage.release_add([rel])
# Targeted release now exists
assert (
loader.resolve_object_from_extids(known_extids, p_info, whitelist)
== rel.swhid()
)
def test_load_get_known_extids() -> None:
"""Checks PackageLoader.load() fetches known extids efficiently"""
storage = Mock(wraps=get_storage("memory"))
loader = StubPackageLoader(storage, "http://example.org")
loader.load()
# Calls should be grouped by extid type
storage.extid_get_from_extid.assert_has_calls(
[
call("extid-type1", [b"extid-of-v1.0", b"extid-of-v2.0"], version=0),
call("extid-type2", [b"extid-of-v3.0", b"extid-of-v4.0"], version=0),
],
any_order=True,
)
def test_load_extids() -> None:
"""Checks PackageLoader.load() skips iff it should, and writes (only)
the new ExtIDs"""
storage = get_storage("memory")
dir_swhid = CoreSWHID(object_type=ObjectType.DIRECTORY, object_id=b"e" * 20)
rels = [
Release(
name=f"v{i}.0".encode(),
message=b"blah\n",
target=dir_swhid.object_id,
target_type=ModelObjectType.DIRECTORY,
synthetic=True,
)
for i in (1, 2, 3, 4)
]
storage.release_add(rels[0:3])
origin = "http://example.org"
rel1_swhid = rels[0].swhid()
rel2_swhid = rels[1].swhid()
rel3_swhid = rels[2].swhid()
rel4_swhid = rels[3].swhid()
# Results of a previous load
storage.extid_add(
[
ExtID("extid-type1", b"extid-of-v1.0", rel1_swhid),
ExtID("extid-type2", b"extid-of-v2.0", rel2_swhid),
]
)
last_snapshot = Snapshot(
branches={
b"v1.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel1_swhid.object_id
),
b"v2.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel2_swhid.object_id
),
b"v3.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel3_swhid.object_id
),
}
)
storage.snapshot_add([last_snapshot])
date = datetime.datetime.now(tz=datetime.timezone.utc)
storage.origin_add([Origin(url=origin)])
storage.origin_visit_add(
[OriginVisit(origin="http://example.org", visit=1, date=date, type="tar")]
)
storage.origin_visit_status_add(
[
OriginVisitStatus(
origin=origin,
visit=1,
status="full",
date=date,
snapshot=last_snapshot.id,
)
]
)
loader = StubPackageLoader(storage, "http://example.org")
patch.object(
loader,
"_load_release",
return_value=(rel4_swhid.object_id, dir_swhid.object_id),
autospec=True,
).start()
loader.load()
assert loader._load_release.mock_calls == [ # type: ignore
# v1.0: not loaded because there is already its (extid_type, extid, rel)
# in the storage.
# v2.0: loaded, because there is already a similar extid, but different type
- call(StubPackageInfo(origin, "example-v2.0.tar", "v2.0"), Origin(url=origin),),
+ call(
+ StubPackageInfo(origin, "example-v2.0.tar", "v2.0"),
+ Origin(url=origin),
+ ),
# v3.0: loaded despite having an (extid_type, extid) in storage, because
# the target of the extid is not in the previous snapshot
- call(StubPackageInfo(origin, "example-v3.0.tar", "v3.0"), Origin(url=origin),),
+ call(
+ StubPackageInfo(origin, "example-v3.0.tar", "v3.0"),
+ Origin(url=origin),
+ ),
# v4.0: loaded, because there isn't its extid
- call(StubPackageInfo(origin, "example-v4.0.tar", "v4.0"), Origin(url=origin),),
+ call(
+ StubPackageInfo(origin, "example-v4.0.tar", "v4.0"),
+ Origin(url=origin),
+ ),
]
# then check the snapshot has all the branches.
# versions 2.0 to 4.0 all point to rel4_swhid (instead of the value of the last
# snapshot), because they had to be loaded (mismatched extid), and the mocked
# _load_release always returns rel4_swhid.
snapshot = Snapshot(
branches={
b"branch-v1.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel1_swhid.object_id
),
b"branch-v2.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel4_swhid.object_id
),
b"branch-v3.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel4_swhid.object_id
),
b"branch-v4.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel4_swhid.object_id
),
}
)
assert snapshot_get_latest(storage, origin) == snapshot
extids = storage.extid_get_from_target(
ObjectType.RELEASE,
[
rel1_swhid.object_id,
rel2_swhid.object_id,
rel3_swhid.object_id,
rel4_swhid.object_id,
],
)
assert set(extids) == {
# What we inserted at the beginning of the test:
ExtID("extid-type1", b"extid-of-v1.0", rel1_swhid),
ExtID("extid-type2", b"extid-of-v2.0", rel2_swhid),
# Added by the loader:
ExtID("extid-type1", b"extid-of-v2.0", rel4_swhid),
ExtID("extid-type2", b"extid-of-v3.0", rel4_swhid),
ExtID("extid-type2", b"extid-of-v4.0", rel4_swhid),
}
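The v1.0 to v4.0 comments above describe a per-version decision rule. A simplified, hypothetical sketch of that rule follows; the names are illustrative (the real logic lives in PackageLoader.load() and resolve_object_from_extids()), known_extids is assumed to map (extid_type, extid) pairs to target SWHIDs, and previous_snapshot_targets is assumed to be the set of object ids referenced by the last snapshot.

def should_skip_version(extid_type, extid, known_extids, previous_snapshot_targets):
    """Return True when a version can be reused from a previous visit."""
    target = known_extids.get((extid_type, extid))
    if target is None:
        # v2.0/v4.0 cases: no extid with this exact (type, payload) pair
        return False
    if target.object_id not in previous_snapshot_targets:
        # v3.0 case: an extid exists but points outside the last snapshot
        return False
    # v1.0 case: known extid whose release is in the last snapshot -> skip
    return True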
def test_load_upgrade_from_revision_extids(caplog):
"""Tests that, when loading incrementally based on a snapshot made by an old
version of the loader, the loader will convert revisions to releases
and add them to the storage.
Also checks that, if an extid exists pointing to a non-existent revision
(which should never happen, but you never know...), the release is loaded from
scratch."""
storage = get_storage("memory")
origin = "http://example.org"
dir1_swhid = CoreSWHID(object_type=ObjectType.DIRECTORY, object_id=b"d" * 20)
dir2_swhid = CoreSWHID(object_type=ObjectType.DIRECTORY, object_id=b"e" * 20)
date = TimestampWithTimezone.from_datetime(
datetime.datetime.now(tz=datetime.timezone.utc)
)
person = Person.from_fullname(b"Jane Doe <jdoe@example.org>")
rev1 = Revision(
message=b"blah",
author=person,
date=date,
committer=person,
committer_date=date,
directory=dir1_swhid.object_id,
type=RevisionType.TAR,
synthetic=True,
)
rel1 = Release(
name=b"v1.0",
message=b"blah\n",
author=person,
date=date,
target=dir1_swhid.object_id,
target_type=ModelObjectType.DIRECTORY,
synthetic=True,
)
rev1_swhid = rev1.swhid()
rel1_swhid = rel1.swhid()
rev2_swhid = CoreSWHID(object_type=ObjectType.REVISION, object_id=b"b" * 20)
rel2_swhid = CoreSWHID(object_type=ObjectType.RELEASE, object_id=b"c" * 20)
# Results of a previous load
storage.extid_add(
[
ExtID("extid-type1", b"extid-of-v1.0", rev1_swhid, 0),
ExtID("extid-type1", b"extid-of-v2.0", rev2_swhid, 0),
]
)
storage.revision_add([rev1])
last_snapshot = Snapshot(
branches={
b"v1.0": SnapshotBranch(
target_type=TargetType.REVISION, target=rev1_swhid.object_id
),
b"v2.0": SnapshotBranch(
target_type=TargetType.REVISION, target=rev2_swhid.object_id
),
}
)
storage.snapshot_add([last_snapshot])
date = datetime.datetime.now(tz=datetime.timezone.utc)
storage.origin_add([Origin(url=origin)])
storage.origin_visit_add(
[OriginVisit(origin="http://example.org", visit=1, date=date, type="tar")]
)
storage.origin_visit_status_add(
[
OriginVisitStatus(
origin=origin,
visit=1,
status="full",
date=date,
snapshot=last_snapshot.id,
)
]
)
loader = StubPackageLoader(storage, "http://example.org")
patch.object(
loader,
"_load_release",
return_value=(rel2_swhid.object_id, dir2_swhid.object_id),
autospec=True,
).start()
patch.object(
- loader, "get_versions", return_value=["v1.0", "v2.0", "v3.0"], autospec=True,
+ loader,
+ "get_versions",
+ return_value=["v1.0", "v2.0", "v3.0"],
+ autospec=True,
).start()
caplog.set_level(logging.ERROR)
loader.load()
assert len(caplog.records) == 1
(record,) = caplog.records
assert record.levelname == "ERROR"
assert "Failed to upgrade branch branch-v2.0" in record.message
assert loader._load_release.mock_calls == [
# v1.0: not loaded because there is already a revision matching it
# v2.0: loaded, as the revision is missing from the storage even though there
# is an extid
call(StubPackageInfo(origin, "example-v2.0.tar", "v2.0"), Origin(url=origin)),
# v3.0: loaded (did not exist yet)
call(StubPackageInfo(origin, "example-v3.0.tar", "v3.0"), Origin(url=origin)),
]
snapshot = Snapshot(
branches={
b"branch-v1.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel1_swhid.object_id
),
b"branch-v2.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel2_swhid.object_id
),
b"branch-v3.0": SnapshotBranch(
target_type=TargetType.RELEASE, target=rel2_swhid.object_id
),
}
)
assert snapshot_get_latest(storage, origin) == snapshot
extids = storage.extid_get_from_target(
- ObjectType.RELEASE, [rel1_swhid.object_id, rel2_swhid.object_id,],
+ ObjectType.RELEASE,
+ [
+ rel1_swhid.object_id,
+ rel2_swhid.object_id,
+ ],
)
assert set(extids) == {
ExtID("extid-type1", b"extid-of-v1.0", rel1_swhid),
ExtID("extid-type1", b"extid-of-v2.0", rel2_swhid),
ExtID("extid-type2", b"extid-of-v3.0", rel2_swhid),
}
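The docstring of this test summarises an upgrade path: synthetic revisions produced by old loader versions are rewritten as releases on the next visit, and a dangling extid (pointing to a missing revision) triggers a full reload. A rough, hypothetical sketch of that path, assuming the same model imports as this test module; the helper name is invented here, the real code lives inside PackageLoader.load().

from swh.model.model import ObjectType as ModelObjectType, Release


def upgrade_revision_to_release(storage, branch_name, rev_id):
    """Sketch only: re-emit an old synthetic revision as a release
    targeting the same directory, or signal a reload when it is missing."""
    rev = storage.revision_get([rev_id])[0]
    if rev is None:
        # the "Failed to upgrade branch ..." case: fall back to reloading
        return None
    rel = Release(
        name=branch_name,
        message=rev.message,
        author=rev.author,
        date=rev.date,
        target=rev.directory,
        target_type=ModelObjectType.DIRECTORY,
        synthetic=True,
    )
    storage.release_add([rel])
    return rel.swhid()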
def test_manifest_extid():
- """Compute primary key should return the right identity
-
- """
+ """Compute primary key should return the right identity"""
@attr.s
class TestPackageInfo(BasePackageInfo):
a = attr.ib()
b = attr.ib()
length = attr.ib()
filename = attr.ib()
MANIFEST_FORMAT = string.Template("$a $b")
p_info = TestPackageInfo(
url="http://example.org/",
a=1,
b=2,
length=221837,
filename="8sync-0.1.0.tar.gz",
version="0.1.0",
)
actual_id = p_info.extid()
assert actual_id == ("package-manifest-sha256", 0, hashlib.sha256(b"1 2").digest())
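The identity asserted above can be reproduced with nothing but the standard library; the substitution values mirror the a=1, b=2 fields of the ad-hoc TestPackageInfo.

import hashlib
import string

manifest = string.Template("$a $b").substitute({"a": 1, "b": 2}).encode()
assert manifest == b"1 2"
extid = ("package-manifest-sha256", 0, hashlib.sha256(manifest).digest())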
def test_no_env_swh_config_filename_raise(monkeypatch):
- """No SWH_CONFIG_FILENAME environment variable makes package loader init raise
-
- """
+ """No SWH_CONFIG_FILENAME environment variable makes package loader init raise"""
class DummyPackageLoader(PackageLoader):
"""A dummy package loader for test purpose"""
pass
monkeypatch.delenv("SWH_CONFIG_FILENAME", raising=False)
with pytest.raises(
AssertionError, match="SWH_CONFIG_FILENAME environment variable is undefined"
):
DummyPackageLoader.from_configfile(url="some-url")
diff --git a/swh/loader/package/tests/test_loader_metadata.py b/swh/loader/package/tests/test_loader_metadata.py
index 76c38aa..527a72d 100644
--- a/swh/loader/package/tests/test_loader_metadata.py
+++ b/swh/loader/package/tests/test_loader_metadata.py
@@ -1,215 +1,235 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
from typing import Iterator, List, Sequence, Tuple
import attr
from swh.loader.package import __version__
from swh.loader.package.loader import (
BasePackageInfo,
PackageLoader,
RawExtrinsicMetadataCore,
)
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
MetadataAuthority,
MetadataAuthorityType,
MetadataFetcher,
ObjectType,
Origin,
Person,
RawExtrinsicMetadata,
Release,
Sha1Git,
)
from swh.model.swhids import CoreSWHID, ExtendedSWHID
EMPTY_SNAPSHOT_ID = "1a8893e6a86f444e8be8e7bda6cb34fb1735a00e"
FULL_SNAPSHOT_ID = "4ac5730a9393f5099b63a35a17b6c33d36d70c3a"
AUTHORITY = MetadataAuthority(
- type=MetadataAuthorityType.FORGE, url="http://example.org/",
+ type=MetadataAuthorityType.FORGE,
+ url="http://example.org/",
)
ORIGIN_URL = "http://example.org/archive.tgz"
ORIGIN_SWHID = Origin(ORIGIN_URL).swhid()
REVISION_ID = hash_to_bytes("8ff44f081d43176474b267de5451f2c2e88089d0")
RELEASE_ID = hash_to_bytes("9477a708196b44e59efb4e47b7d979a4146bd428")
RELEASE_SWHID = CoreSWHID.from_string(f"swh:1:rel:{RELEASE_ID.hex()}")
DIRECTORY_ID = hash_to_bytes("aa" * 20)
DIRECTORY_SWHID = ExtendedSWHID.from_string(f"swh:1:dir:{DIRECTORY_ID.hex()}")
FETCHER = MetadataFetcher(
name="swh.loader.package.tests.test_loader_metadata.MetadataTestLoader",
version=__version__,
)
DISCOVERY_DATE = datetime.datetime.now(tz=datetime.timezone.utc)
DIRECTORY_METADATA = [
RawExtrinsicMetadata(
target=DIRECTORY_SWHID,
discovery_date=DISCOVERY_DATE,
authority=AUTHORITY,
fetcher=FETCHER,
format="test-format1",
metadata=b"foo bar",
origin=ORIGIN_URL,
release=RELEASE_SWHID,
),
RawExtrinsicMetadata(
target=DIRECTORY_SWHID,
discovery_date=DISCOVERY_DATE + datetime.timedelta(seconds=1),
authority=AUTHORITY,
fetcher=FETCHER,
format="test-format2",
metadata=b"bar baz",
origin=ORIGIN_URL,
release=RELEASE_SWHID,
),
]
ORIGIN_METADATA = [
RawExtrinsicMetadata(
target=ORIGIN_SWHID,
discovery_date=datetime.datetime.now(tz=datetime.timezone.utc),
authority=AUTHORITY,
fetcher=FETCHER,
format="test-format3",
metadata=b"baz qux",
),
]
class MetadataTestLoader(PackageLoader[BasePackageInfo]):
def get_versions(self) -> Sequence[str]:
return ["v1.0.0"]
def _load_directory(self, dl_artifacts, tmpdir):
class directory:
hash = DIRECTORY_ID
return (None, directory) # just enough for _load_release to work
def download_package(self, p_info: BasePackageInfo, tmpdir: str):
return [("path", {"artifact_key": "value", "length": 0})]
def build_release(
- self, p_info: BasePackageInfo, uncompressed_path: str, directory: Sha1Git,
+ self,
+ p_info: BasePackageInfo,
+ uncompressed_path: str,
+ directory: Sha1Git,
):
return Release(
name=p_info.version.encode(),
message=b"",
author=Person.from_fullname(b""),
date=None,
target=DIRECTORY_ID,
target_type=ObjectType.DIRECTORY,
synthetic=False,
)
def get_metadata_authority(self):
return attr.evolve(AUTHORITY, metadata={})
def get_package_info(self, version: str) -> Iterator[Tuple[str, BasePackageInfo]]:
m0 = DIRECTORY_METADATA[0]
m1 = DIRECTORY_METADATA[1]
p_info = BasePackageInfo(
url=ORIGIN_URL,
filename="archive.tgz",
version=version,
directory_extrinsic_metadata=[
RawExtrinsicMetadataCore(m0.format, m0.metadata, m0.discovery_date),
RawExtrinsicMetadataCore(m1.format, m1.metadata, m1.discovery_date),
],
)
yield (version, p_info)
def get_extrinsic_origin_metadata(self) -> List[RawExtrinsicMetadataCore]:
m = ORIGIN_METADATA[0]
return [RawExtrinsicMetadataCore(m.format, m.metadata, m.discovery_date)]
def test_load_artifact_metadata(swh_storage, caplog):
loader = MetadataTestLoader(swh_storage, ORIGIN_URL)
load_status = loader.load()
assert load_status == {
"status": "eventful",
"snapshot_id": FULL_SNAPSHOT_ID,
}
authority = MetadataAuthority(
- type=MetadataAuthorityType.REGISTRY, url="https://softwareheritage.org/",
+ type=MetadataAuthorityType.REGISTRY,
+ url="https://softwareheritage.org/",
)
- result = swh_storage.raw_extrinsic_metadata_get(DIRECTORY_SWHID, authority,)
+ result = swh_storage.raw_extrinsic_metadata_get(
+ DIRECTORY_SWHID,
+ authority,
+ )
assert result.next_page_token is None
assert len(result.results) == 1
assert result.results[0] == RawExtrinsicMetadata(
target=DIRECTORY_SWHID,
discovery_date=result.results[0].discovery_date,
authority=authority,
fetcher=FETCHER,
format="original-artifacts-json",
metadata=b'[{"artifact_key": "value", "length": 0}]',
origin=ORIGIN_URL,
release=RELEASE_SWHID,
)
def test_load_metadata(swh_storage, caplog):
loader = MetadataTestLoader(swh_storage, ORIGIN_URL)
load_status = loader.load()
assert load_status == {
"status": "eventful",
"snapshot_id": FULL_SNAPSHOT_ID,
}
- result = swh_storage.raw_extrinsic_metadata_get(DIRECTORY_SWHID, AUTHORITY,)
+ result = swh_storage.raw_extrinsic_metadata_get(
+ DIRECTORY_SWHID,
+ AUTHORITY,
+ )
assert result.next_page_token is None
assert result.results == DIRECTORY_METADATA
- result = swh_storage.raw_extrinsic_metadata_get(ORIGIN_SWHID, AUTHORITY,)
+ result = swh_storage.raw_extrinsic_metadata_get(
+ ORIGIN_SWHID,
+ AUTHORITY,
+ )
assert result.next_page_token is None
assert result.results == ORIGIN_METADATA
assert caplog.text == ""
def test_existing_authority(swh_storage, caplog):
loader = MetadataTestLoader(swh_storage, ORIGIN_URL)
load_status = loader.load()
assert load_status == {
"status": "eventful",
"snapshot_id": FULL_SNAPSHOT_ID,
}
- result = swh_storage.raw_extrinsic_metadata_get(DIRECTORY_SWHID, AUTHORITY,)
+ result = swh_storage.raw_extrinsic_metadata_get(
+ DIRECTORY_SWHID,
+ AUTHORITY,
+ )
assert result.next_page_token is None
assert result.results == DIRECTORY_METADATA
assert caplog.text == ""
def test_existing_fetcher(swh_storage, caplog):
loader = MetadataTestLoader(swh_storage, ORIGIN_URL)
load_status = loader.load()
assert load_status == {
"status": "eventful",
"snapshot_id": FULL_SNAPSHOT_ID,
}
- result = swh_storage.raw_extrinsic_metadata_get(DIRECTORY_SWHID, AUTHORITY,)
+ result = swh_storage.raw_extrinsic_metadata_get(
+ DIRECTORY_SWHID,
+ AUTHORITY,
+ )
assert result.next_page_token is None
assert result.results == DIRECTORY_METADATA
assert caplog.text == ""
diff --git a/swh/loader/package/tests/test_utils.py b/swh/loader/package/tests/test_utils.py
index 772379f..75373e7 100644
--- a/swh/loader/package/tests/test_utils.py
+++ b/swh/loader/package/tests/test_utils.py
@@ -1,238 +1,236 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import json
import os
from unittest.mock import MagicMock
from urllib.error import URLError
from urllib.parse import quote
import pytest
from swh.loader.exception import NotFound
import swh.loader.package
from swh.loader.package.utils import api_info, download, release_name
def test_version_generation():
assert (
swh.loader.package.__version__ != "devel"
), "Make sure swh.loader.core is installed (e.g. pip install -e .)"
@pytest.mark.fs
def test_download_fail_to_download(tmp_path, requests_mock):
url = "https://pypi.org/pypi/arrow/json"
status_code = 404
requests_mock.get(url, status_code=status_code)
with pytest.raises(ValueError) as e:
download(url, tmp_path)
assert e.value.args[0] == "Fail to query '%s'. Reason: %s" % (url, status_code)
_filename = "requests-0.0.1.tar.gz"
_data = "this is something"
def _check_download_ok(url, dest, filename=_filename, hashes={}):
actual_filepath, actual_hashes = download(url, dest, hashes=hashes)
actual_filename = os.path.basename(actual_filepath)
assert actual_filename == filename
assert actual_hashes["length"] == len(_data)
assert (
actual_hashes["checksums"]["sha1"] == "fdd1ce606a904b08c816ba84f3125f2af44d92b2"
)
assert (
actual_hashes["checksums"]["sha256"]
== "1d9224378d77925d612c9f926eb9fb92850e6551def8328011b6a972323298d5"
)
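The two expected checksums in this helper are simply the digests of _data; a quick sanity sketch (not part of the test suite) re-derives them with hashlib.

import hashlib

assert hashlib.sha1(_data.encode()).hexdigest() == (
    "fdd1ce606a904b08c816ba84f3125f2af44d92b2"
)
assert hashlib.sha256(_data.encode()).hexdigest() == (
    "1d9224378d77925d612c9f926eb9fb92850e6551def8328011b6a972323298d5"
)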
@pytest.mark.fs
def test_download_ok(tmp_path, requests_mock):
"""Download without issue should provide filename and hashes"""
url = f"https://pypi.org/pypi/requests/{_filename}"
requests_mock.get(url, text=_data, headers={"content-length": str(len(_data))})
_check_download_ok(url, dest=str(tmp_path))
@pytest.mark.fs
def test_download_ok_no_header(tmp_path, requests_mock):
"""Download without issue should provide filename and hashes"""
url = f"https://pypi.org/pypi/requests/{_filename}"
requests_mock.get(url, text=_data) # no header information
_check_download_ok(url, dest=str(tmp_path))
@pytest.mark.fs
def test_download_ok_with_hashes(tmp_path, requests_mock):
"""Download without issue should provide filename and hashes"""
url = f"https://pypi.org/pypi/requests/{_filename}"
requests_mock.get(url, text=_data, headers={"content-length": str(len(_data))})
# good hashes for such file
good = {
"sha1": "fdd1ce606a904b08c816ba84f3125f2af44d92b2",
"sha256": "1d9224378d77925d612c9f926eb9fb92850e6551def8328011b6a972323298d5", # noqa
}
_check_download_ok(url, dest=str(tmp_path), hashes=good)
@pytest.mark.fs
def test_download_fail_hashes_mismatch(tmp_path, requests_mock):
- """Mismatch hash after download should raise
-
- """
+ """Mismatch hash after download should raise"""
url = f"https://pypi.org/pypi/requests/{_filename}"
requests_mock.get(url, text=_data, headers={"content-length": str(len(_data))})
# good hashes for such file
good = {
"sha1": "fdd1ce606a904b08c816ba84f3125f2af44d92b2",
"sha256": "1d9224378d77925d612c9f926eb9fb92850e6551def8328011b6a972323298d5", # noqa
}
for hash_algo in good.keys():
wrong_hash = good[hash_algo].replace("1", "0")
expected_hashes = good.copy()
expected_hashes[hash_algo] = wrong_hash # set the wrong hash
expected_msg = "Failure when fetching %s. " "Checksum mismatched: %s != %s" % (
url,
wrong_hash,
good[hash_algo],
)
with pytest.raises(ValueError, match=expected_msg):
download(url, dest=str(tmp_path), hashes=expected_hashes)
@pytest.mark.fs
def test_ftp_download_ok(tmp_path, mocker):
"""Download without issue should provide filename and hashes"""
url = f"ftp://pypi.org/pypi/requests/{_filename}"
cm = MagicMock()
cm.getstatus.return_value = 200
cm.read.side_effect = [_data.encode(), b""]
cm.__enter__.return_value = cm
mocker.patch("swh.loader.package.utils.urlopen").return_value = cm
_check_download_ok(url, dest=str(tmp_path))
@pytest.mark.fs
def test_ftp_download_ko(tmp_path, mocker):
"""Download without issue should provide filename and hashes"""
filename = "requests-0.0.1.tar.gz"
url = "ftp://pypi.org/pypi/requests/%s" % filename
mocker.patch("swh.loader.package.utils.urlopen").side_effect = URLError("FTP error")
with pytest.raises(URLError):
download(url, dest=str(tmp_path))
@pytest.mark.fs
def test_download_with_redirection(tmp_path, requests_mock):
"""Download with redirection should use the targeted URL to extract filename"""
url = "https://example.org/project/requests/download"
redirection_url = f"https://example.org/project/requests/files/{_filename}"
requests_mock.get(url, status_code=302, headers={"location": redirection_url})
requests_mock.get(
redirection_url, text=_data, headers={"content-length": str(len(_data))}
)
_check_download_ok(url, dest=str(tmp_path))
def test_download_extracting_filename_from_url(tmp_path, requests_mock):
"""Extracting filename from url must sanitize the filename first"""
url = "https://example.org/project/requests-0.0.1.tar.gz?a=b&c=d&foo=bar"
requests_mock.get(
url, status_code=200, text=_data, headers={"content-length": str(len(_data))}
)
_check_download_ok(url, dest=str(tmp_path))
@pytest.mark.fs
@pytest.mark.parametrize(
"filename", [f'"{_filename}"', _filename, '"filename with spaces.tar.gz"']
)
def test_download_filename_from_content_disposition(tmp_path, requests_mock, filename):
"""Filename should be extracted from content-disposition request header
when available."""
url = "https://example.org/download/requests/tar.gz/v0.0.1"
requests_mock.get(
url,
text=_data,
headers={
"content-length": str(len(_data)),
"content-disposition": f"attachment; filename={filename}",
},
)
_check_download_ok(url, dest=str(tmp_path), filename=filename.strip('"'))
@pytest.mark.fs
@pytest.mark.parametrize("filename", ['"archive école.tar.gz"', "archive_école.tgz"])
def test_download_utf8_filename_from_content_disposition(
tmp_path, requests_mock, filename
):
"""Filename should be extracted from content-disposition request header
when available."""
url = "https://example.org/download/requests/tar.gz/v0.0.1"
data = "this is something"
requests_mock.get(
url,
text=data,
headers={
"content-length": str(len(data)),
"content-disposition": f"attachment; filename*=utf-8''{quote(filename)}",
},
)
_check_download_ok(url, dest=str(tmp_path), filename=filename.strip('"'))
def test_api_info_failure(requests_mock):
"""Failure to fetch info/release information should raise"""
url = "https://pypi.org/pypi/requests/json"
status_code = 400
requests_mock.get(url, status_code=status_code)
with pytest.raises(NotFound) as e0:
api_info(url)
assert e0.value.args[0] == "Fail to query '%s'. Reason: %s" % (url, status_code)
def test_api_info(requests_mock):
"""Fetching json info from pypi project should be ok"""
url = "https://pypi.org/pypi/requests/json"
requests_mock.get(url, text='{"version": "0.0.1"}')
actual_info = json.loads(api_info(url))
assert actual_info == {
"version": "0.0.1",
}
def test_release_name():
for version, filename, expected_release in [
("0.0.1", None, "releases/0.0.1"),
("0.0.2", "something", "releases/0.0.2/something"),
]:
assert release_name(version, filename) == expected_release
diff --git a/swh/loader/pytest_plugin.py b/swh/loader/pytest_plugin.py
index 7f90c40..e501f02 100644
--- a/swh/loader/pytest_plugin.py
+++ b/swh/loader/pytest_plugin.py
@@ -1,54 +1,54 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import os
from typing import Any, Dict
import pytest
import yaml
@pytest.fixture
def swh_storage_backend_config(swh_storage_postgresql) -> Dict[str, Any]:
return {
"cls": "retry",
"storage": {
"cls": "filter",
"storage": {
"cls": "buffer",
"storage": {
"cls": "postgresql",
"db": swh_storage_postgresql.dsn,
"objstorage": {"cls": "memory"},
},
},
},
}
@pytest.fixture
def swh_loader_config(swh_storage_backend_config) -> Dict[str, Any]:
return {
"storage": swh_storage_backend_config,
}
@pytest.fixture
def swh_config(swh_loader_config, monkeypatch, tmp_path) -> str:
conffile = os.path.join(str(tmp_path), "loader.yml")
with open(conffile, "w") as f:
f.write(yaml.dump(swh_loader_config))
monkeypatch.setenv("SWH_CONFIG_FILENAME", conffile)
return conffile
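With SWH_CONFIG_FILENAME exported by this fixture, a loader can be instantiated from the configuration file alone, the positive counterpart of test_no_env_swh_config_filename_raise earlier in this diff. A hypothetical usage sketch; PyPILoader is just one loader shipped by this package and the project URL is a placeholder.

import os

from swh.loader.package.pypi.loader import PyPILoader

assert "SWH_CONFIG_FILENAME" in os.environ  # set by the swh_config fixture
loader = PyPILoader.from_configfile(url="https://pypi.org/project/some-project/")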
@pytest.fixture(autouse=True, scope="session")
def swh_proxy():
"""Automatically inject this fixture in all tests to ensure no outside
- connection takes place.
+ connection takes place.
"""
os.environ["http_proxy"] = "http://localhost:999"
os.environ["https_proxy"] = "http://localhost:999"
diff --git a/swh/loader/tests/__init__.py b/swh/loader/tests/__init__.py
index f5970fe..32adfed 100644
--- a/swh/loader/tests/__init__.py
+++ b/swh/loader/tests/__init__.py
@@ -1,265 +1,263 @@
# Copyright (C) 2018-2020 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
from collections import defaultdict
import os
from pathlib import PosixPath
import subprocess
from typing import Dict, Iterable, List, Optional, Tuple, Union
from swh.model.hashutil import hash_to_bytes
from swh.model.model import OriginVisitStatus, Snapshot, TargetType
from swh.storage.algos.origin import origin_get_latest_visit_status
from swh.storage.algos.snapshot import snapshot_get_all_branches
from swh.storage.interface import StorageInterface
def assert_last_visit_matches(
storage,
url: str,
status: str,
type: Optional[str] = None,
snapshot: Optional[bytes] = None,
) -> OriginVisitStatus:
"""This retrieves the last visit and visit_status which are expected to exist.
This also checks that the {visit|visit_status} have their respective properties
correctly set.
This returns the last visit_status for that given origin.
Args:
url: Origin url
status: Check that the visit status has the given status
type: Check that the returned visit has the given type
snapshot: Check that the visit status points to the given snapshot
Raises:
AssertionError in case visit or visit status is not found, or any of the type,
status and snapshot mismatch
Returns:
the visit status for further check during the remaining part of the test.
"""
__tracebackhide__ = True # Hide from pytest tracebacks on failure
visit_status = origin_get_latest_visit_status(storage, url)
assert visit_status is not None, f"Origin {url} has no visits"
if type:
assert (
visit_status.type == type
), f"Visit has type {visit_status.type} instead of {type}"
assert (
visit_status.status == status
), f"Visit_status has status {visit_status.status} instead of {status}"
if snapshot is not None:
assert visit_status.snapshot is not None
assert visit_status.snapshot == snapshot, (
f"Visit_status points to snapshot {visit_status.snapshot.hex()} "
f"instead of {snapshot.hex()}"
)
return visit_status
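A typical call, mirroring how the package-loader tests elsewhere in this diff use the helper; this sketch assumes a storage that already holds a completed visit for that origin.

visit_status = assert_last_visit_matches(
    storage,
    "https://example.org/archive.tgz",
    status="full",
    type="tar",
)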
def prepare_repository_from_archive(
archive_path: str,
filename: Optional[str] = None,
tmp_path: Union[PosixPath, str] = "/tmp",
) -> str:
"""Given an existing archive_path, uncompress it.
Returns a file repo url which can be used as origin url.
This does not deal with the case where the archive passed along does not exist.
"""
if not isinstance(tmp_path, str):
tmp_path = str(tmp_path)
# uncompress folder/repositories/dump for the loader to ingest
subprocess.check_output(["tar", "xf", archive_path, "-C", tmp_path])
# build the origin url (or some derivative form)
_fname = filename if filename else os.path.basename(archive_path)
repo_url = f"file://{tmp_path}/{_fname}"
return repo_url
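Usage matching test_prepare_repository_from_archive later in this diff; the paths are placeholders.

repo_url = prepare_repository_from_archive(
    "/path/to/0805nexter-1.1.0.tar.gz",
    filename="0805nexter-1.1.0",
    tmp_path="/tmp/some-test-dir",
)
assert repo_url == "file:///tmp/some-test-dir/0805nexter-1.1.0"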
def encode_target(target: Dict) -> Dict:
- """Test helper to ease readability in test
-
- """
+ """Test helper to ease readability in test"""
if not target:
return target
target_type = target["target_type"]
target_data = target["target"]
if target_type == "alias" and isinstance(target_data, str):
encoded_target = target_data.encode("utf-8")
elif isinstance(target_data, str):
encoded_target = hash_to_bytes(target_data)
else:
encoded_target = target_data
return {"target": encoded_target, "target_type": target_type}
class InconsistentAliasBranchError(AssertionError):
"""When an alias branch targets an inexistent branch."""
pass
class InexistentObjectsError(AssertionError):
"""When a targeted branch reference does not exist in the storage"""
pass
def check_snapshot(
expected_snapshot: Snapshot,
storage: StorageInterface,
allowed_empty: Iterable[Tuple[TargetType, bytes]] = [],
) -> Snapshot:
"""Check that:
- snapshot exists in the storage and match
- each object reference up to the revision/release targets exists
Args:
expected_snapshot: full snapshot to check for existence and consistency
storage: storage to lookup information into
allowed_empty: Iterable of branches we allow to be empty (some edge-case loaders
allow this to happen; nixguix for example allows the branch "evaluation"
to target the nixpkgs git commit reference, which may not yet be resolvable at
loading time)
Returns:
the snapshot stored in the storage for further test assertion if any is
needed.
"""
__tracebackhide__ = True # Hide from pytest tracebacks on failure
if not isinstance(expected_snapshot, Snapshot):
raise AssertionError(
f"argument 'expected_snapshot' must be a snapshot: {expected_snapshot!r}"
)
snapshot = snapshot_get_all_branches(storage, expected_snapshot.id)
if snapshot is None:
raise AssertionError(f"Snapshot {expected_snapshot.id.hex()} is not found")
assert snapshot == expected_snapshot
objects_by_target_type = defaultdict(list)
object_to_branch = {}
for branch, target in expected_snapshot.branches.items():
if (target.target_type, branch) in allowed_empty:
# safe for those elements to not be checked for existence
continue
objects_by_target_type[target.target_type].append(target.target)
object_to_branch[target.target] = branch
# check that alias references target something that exists, otherwise raise
aliases: List[bytes] = objects_by_target_type.get(TargetType.ALIAS, [])
for alias in aliases:
if alias not in expected_snapshot.branches:
raise InconsistentAliasBranchError(
f"Alias branch {alias.decode('utf-8')} "
f"should be in {list(expected_snapshot.branches)}"
)
revs = objects_by_target_type.get(TargetType.REVISION)
if revs:
revisions = storage.revision_get(revs)
not_found = [rev_id for rev_id, rev in zip(revs, revisions) if rev is None]
if not_found:
missing_objs = ", ".join(
str((object_to_branch[rev], rev.hex())) for rev in not_found
)
raise InexistentObjectsError(
f"Branch/Revision(s) {missing_objs} should exist in storage"
)
# retrieve information from revision
for revision in revisions:
assert revision is not None
objects_by_target_type[TargetType.DIRECTORY].append(revision.directory)
object_to_branch[revision.directory] = revision.id
rels = objects_by_target_type.get(TargetType.RELEASE)
if rels:
not_found = list(storage.release_missing(rels))
if not_found:
missing_objs = ", ".join(
str((object_to_branch[rel], rel.hex())) for rel in not_found
)
raise InexistentObjectsError(
f"Branch/Release(s) {missing_objs} should exist in storage"
)
# first level dirs exist?
dirs = objects_by_target_type.get(TargetType.DIRECTORY)
if dirs:
not_found = list(storage.directory_missing(dirs))
if not_found:
missing_objs = ", ".join(
str((object_to_branch[dir_].hex(), dir_.hex())) for dir_ in not_found
)
raise InexistentObjectsError(
f"Missing directories {missing_objs}: "
"(revision exists, directory target does not)"
)
for dir_ in dirs: # retrieve new objects to check for existence
paths = storage.directory_ls(dir_, recursive=True)
for path in paths:
if path["type"] == "dir":
target_type = TargetType.DIRECTORY
else:
target_type = TargetType.CONTENT
target = path["target"]
objects_by_target_type[target_type].append(target)
object_to_branch[target] = dir_
# check nested directories
dirs = objects_by_target_type.get(TargetType.DIRECTORY)
if dirs:
not_found = list(storage.directory_missing(dirs))
if not_found:
missing_objs = ", ".join(
str((object_to_branch[dir_].hex(), dir_.hex())) for dir_ in not_found
)
raise InexistentObjectsError(
f"Missing directories {missing_objs}: "
"(revision exists, directory target does not)"
)
# check contents directories
cnts = objects_by_target_type.get(TargetType.CONTENT)
if cnts:
not_found = list(storage.content_missing_per_sha1_git(cnts))
if not_found:
missing_objs = ", ".join(
str((object_to_branch[cnt].hex(), cnt.hex())) for cnt in not_found
)
raise InexistentObjectsError(f"Missing contents {missing_objs}")
return snapshot
def get_stats(storage) -> Dict:
"""Adaptation utils to unify the stats counters across storage
- implementation.
+ implementation.
"""
storage.refresh_stat_counters()
stats = storage.stat_counters()
keys = [
"content",
"directory",
"origin",
"origin_visit",
"release",
"revision",
"skipped_content",
"snapshot",
]
return {k: stats.get(k) for k in keys}
diff --git a/swh/loader/tests/conftest.py b/swh/loader/tests/conftest.py
index 6639b5d..0494f2b 100644
--- a/swh/loader/tests/conftest.py
+++ b/swh/loader/tests/conftest.py
@@ -1,19 +1,24 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
from typing import Any, Dict
import pytest
@pytest.fixture
def swh_loader_config() -> Dict[str, Any]:
return {
- "storage": {"cls": "memory",},
+ "storage": {
+ "cls": "memory",
+ },
"deposit": {
"url": "https://deposit.softwareheritage.org/1/private",
- "auth": {"username": "user", "password": "pass",},
+ "auth": {
+ "username": "user",
+ "password": "pass",
+ },
},
}
diff --git a/swh/loader/tests/test_cli.py b/swh/loader/tests/test_cli.py
index 151d0ea..9f43735 100644
--- a/swh/loader/tests/test_cli.py
+++ b/swh/loader/tests/test_cli.py
@@ -1,152 +1,156 @@
# Copyright (C) 2019-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import os
from click.formatting import HelpFormatter
from click.testing import CliRunner
import pytest
import yaml
from swh.loader.cli import SUPPORTED_LOADERS, get_loader
from swh.loader.cli import loader as loader_cli
from swh.loader.package.loader import PackageLoader
def test_get_loader_wrong_input(swh_config):
- """Unsupported loader should raise
-
- """
+ """Unsupported loader should raise"""
loader_type = "unknown"
assert loader_type not in SUPPORTED_LOADERS
with pytest.raises(ValueError, match="Invalid loader"):
get_loader(loader_type, url="db-url")
def test_get_loader(swh_loader_config):
- """Instantiating a supported loader should be ok
-
- """
+ """Instantiating a supported loader should be ok"""
loader_input = {
"archive": {"url": "some-url", "artifacts": []},
- "debian": {"url": "some-url", "packages": [],},
- "npm": {"url": "https://www.npmjs.com/package/onepackage",},
- "pypi": {"url": "some-url",},
+ "debian": {
+ "url": "some-url",
+ "packages": [],
+ },
+ "npm": {
+ "url": "https://www.npmjs.com/package/onepackage",
+ },
+ "pypi": {
+ "url": "some-url",
+ },
}
for loader_type, kwargs in loader_input.items():
kwargs["storage"] = swh_loader_config["storage"]
loader = get_loader(loader_type, **kwargs)
assert isinstance(loader, PackageLoader)
def _write_usage(command, args, max_width=80):
hf = HelpFormatter(width=max_width)
hf.write_usage(command, args)
return hf.getvalue()[:-1]
def test_run_help(swh_config):
- """Usage message should contain list of available loaders
-
- """
+ """Usage message should contain list of available loaders"""
runner = CliRunner()
result = runner.invoke(loader_cli, ["run", "-h"])
assert result.exit_code == 0
# Syntax depends on dependencies' versions
supported_loaders = "|".join(SUPPORTED_LOADERS)
usage_prefix = _write_usage("loader run", "[OPTIONS] [%s]\n" % supported_loaders)
usage_prefix2 = _write_usage("loader run", "[OPTIONS] {%s}\n" % supported_loaders)
assert result.output.startswith((usage_prefix, usage_prefix2))
def test_run_with_configuration_failure(tmp_path):
- """Triggering a load should fail since configuration is incomplete
-
- """
+ """Triggering a load should fail since configuration is incomplete"""
runner = CliRunner()
conf_path = os.path.join(str(tmp_path), "cli.yml")
with open(conf_path, "w") as f:
f.write(yaml.dump({}))
with pytest.raises(ValueError, match="Missing storage"):
runner.invoke(
loader_cli,
- ["-C", conf_path, "run", "pypi", "url=https://some-url",],
+ [
+ "-C",
+ conf_path,
+ "run",
+ "pypi",
+ "url=https://some-url",
+ ],
catch_exceptions=False,
)
def test_run_pypi(mocker, swh_config):
- """Triggering a load should be ok
-
- """
+ """Triggering a load should be ok"""
mock_loader = mocker.patch("swh.loader.package.pypi.loader.PyPILoader.load")
runner = CliRunner()
result = runner.invoke(
- loader_cli, ["-C", swh_config, "run", "pypi", "url=https://some-url",]
+ loader_cli,
+ [
+ "-C",
+ swh_config,
+ "run",
+ "pypi",
+ "url=https://some-url",
+ ],
)
assert result.exit_code == 0
mock_loader.assert_called_once_with()
def test_run_with_visit_date(mocker, swh_config):
- """iso visit_date parameter should be parsed as datetime
-
- """
+ """iso visit_date parameter should be parsed as datetime"""
mock_loader = mocker.patch("swh.loader.cli.get_loader")
runner = CliRunner()
input_date = "2016-05-03 15:16:32+00"
result = runner.invoke(
loader_cli, ["run", "npm", "https://some-url", f"visit_date='{input_date}'"]
)
assert result.exit_code == 0
expected_parsed_date = datetime.datetime(
2016, 5, 3, 15, 16, 32, tzinfo=datetime.timezone.utc
)
mock_loader.assert_called_once_with(
"npm",
storage={"cls": "memory"},
url="https://some-url",
visit_date=expected_parsed_date,
)
def test_list_help(mocker, swh_config):
- """Usage message should contain list of available loaders
-
- """
+ """Usage message should contain list of available loaders"""
runner = CliRunner()
result = runner.invoke(loader_cli, ["list", "--help"])
assert result.exit_code == 0
usage_prefix = _write_usage(
"loader list", f"[OPTIONS] [[{'|'.join(['all'] + SUPPORTED_LOADERS)}]]"
)
expected_help_msg = f"""{usage_prefix}
List supported loaders and optionally their arguments
Options:
-h, --help Show this message and exit.
"""
assert result.output.startswith(expected_help_msg)
def test_list_help_npm(mocker, swh_config):
- """Triggering a load should be ok
-
- """
+ """Triggering a load should be ok"""
runner = CliRunner()
result = runner.invoke(loader_cli, ["list", "npm"])
assert result.exit_code == 0
expected_help_msg = """
Loader: Load npm origin's artifact releases into swh archive.
"""
assert result.output.startswith(expected_help_msg[1:])
diff --git a/swh/loader/tests/test_init.py b/swh/loader/tests/test_init.py
index bb1fed9..64655c7 100644
--- a/swh/loader/tests/test_init.py
+++ b/swh/loader/tests/test_init.py
@@ -1,510 +1,533 @@
# Copyright (C) 2020-2021 The Software Heritage developers
# See the AUTHORS file at the top-level directory of this distribution
# License: GNU General Public License version 3, or any later version
# See top-level LICENSE file for more information
import datetime
import os
import subprocess
import attr
import pytest
from swh.loader.tests import (
InconsistentAliasBranchError,
InexistentObjectsError,
assert_last_visit_matches,
check_snapshot,
encode_target,
prepare_repository_from_archive,
)
from swh.model.from_disk import DentryPerms
from swh.model.hashutil import hash_to_bytes
from swh.model.model import (
Content,
Directory,
DirectoryEntry,
ObjectType,
OriginVisit,
OriginVisitStatus,
Person,
Release,
Revision,
RevisionType,
Snapshot,
SnapshotBranch,
TargetType,
Timestamp,
TimestampWithTimezone,
)
hash_hex = "43e45d56f88993aae6a0198013efa80716fd8920"
ORIGIN_VISIT = OriginVisit(
origin="some-url",
visit=1,
date=datetime.datetime.now(tz=datetime.timezone.utc),
type="archive",
)
ORIGIN_VISIT_STATUS = OriginVisitStatus(
origin="some-url",
visit=1,
type="archive",
date=datetime.datetime.now(tz=datetime.timezone.utc),
status="full",
snapshot=hash_to_bytes("d81cc0710eb6cf9efd5b920a8453e1e07157b6cd"),
metadata=None,
)
CONTENT = Content(
data=b"42\n",
length=3,
sha1=hash_to_bytes("34973274ccef6ab4dfaaf86599792fa9c3fe4689"),
sha1_git=hash_to_bytes("d81cc0710eb6cf9efd5b920a8453e1e07157b6cd"),
sha256=hash_to_bytes(
"673650f936cb3b0a2f93ce09d81be10748b1b203c19e8176b4eefc1964a0cf3a"
),
blake2s256=hash_to_bytes(
"d5fe1939576527e42cfd76a9455a2432fe7f56669564577dd93c4280e76d661d"
),
status="visible",
)
DIRECTORY = Directory(
id=hash_to_bytes("34f335a750111ca0a8b64d8034faec9eedc396be"),
entries=tuple(
[
DirectoryEntry(
name=b"foo",
type="file",
target=CONTENT.sha1_git,
perms=DentryPerms.content,
)
]
),
)
REVISION = Revision(
id=hash_to_bytes("066b1b62dbfa033362092af468bf6cfabec230e7"),
message=b"hello",
author=Person(
name=b"Nicolas Dandrimont",
email=b"nicolas@example.com",
fullname=b"Nicolas Dandrimont <nicolas@example.com> ",
),
date=TimestampWithTimezone(Timestamp(1234567890, 0), offset_bytes=b"+0200"),
committer=Person(
name=b"St\xc3fano Zacchiroli",
email=b"stefano@example.com",
fullname=b"St\xc3fano Zacchiroli <stefano@example.com>",
),
committer_date=TimestampWithTimezone(
Timestamp(1123456789, 0), offset_bytes=b"-0000"
),
parents=(),
type=RevisionType.GIT,
directory=DIRECTORY.id,
metadata={
- "checksums": {"sha1": "tarball-sha1", "sha256": "tarball-sha256",},
+ "checksums": {
+ "sha1": "tarball-sha1",
+ "sha256": "tarball-sha256",
+ },
"signed-off-by": "some-dude",
},
extra_headers=(
(b"gpgsig", b"test123"),
(b"mergetag", b"foo\\bar"),
(b"mergetag", b"\x22\xaf\x89\x80\x01\x00"),
),
synthetic=True,
)
RELEASE = Release(
id=hash_to_bytes("3e9050196aa288264f2a9d279d6abab8b158448b"),
name=b"v0.0.2",
author=Person(
- name=b"tony", email=b"tony@ardumont.fr", fullname=b"tony <tony@ardumont.fr>",
+ name=b"tony",
+ email=b"tony@ardumont.fr",
+ fullname=b"tony <tony@ardumont.fr>",
),
date=TimestampWithTimezone.from_datetime(
datetime.datetime(2021, 10, 15, 22, 26, 53, tzinfo=datetime.timezone.utc)
),
target=REVISION.id,
target_type=ObjectType.REVISION,
message=b"yet another synthetic release",
synthetic=True,
)
SNAPSHOT = Snapshot(
id=hash_to_bytes("2498dbf535f882bc7f9a18fb16c9ad27fda7bab7"),
branches={
b"release/0.1.0": SnapshotBranch(
- target=RELEASE.id, target_type=TargetType.RELEASE,
+ target=RELEASE.id,
+ target_type=TargetType.RELEASE,
+ ),
+ b"HEAD": SnapshotBranch(
+ target=REVISION.id,
+ target_type=TargetType.REVISION,
+ ),
+ b"alias": SnapshotBranch(
+ target=b"HEAD",
+ target_type=TargetType.ALIAS,
),
- b"HEAD": SnapshotBranch(target=REVISION.id, target_type=TargetType.REVISION,),
- b"alias": SnapshotBranch(target=b"HEAD", target_type=TargetType.ALIAS,),
b"evaluation": SnapshotBranch( # branch dedicated to not exist in storage
target=hash_to_bytes("cc4e04c26672dd74e5fd0fecb78b435fb55368f7"),
target_type=TargetType.REVISION,
),
},
)
@pytest.fixture
def swh_storage_backend_config(swh_storage_postgresql):
return {
"cls": "postgresql",
"db": swh_storage_postgresql.dsn,
"objstorage": {"cls": "memory"},
}
@pytest.fixture
def mock_storage(mocker):
mock_storage = mocker.patch("swh.loader.tests.origin_get_latest_visit_status")
mock_storage.return_value = ORIGIN_VISIT_STATUS
return mock_storage
def test_assert_last_visit_matches_raise(mock_storage, mocker):
- """Not finding origin visit_and_statu should raise
-
- """
+ """Not finding origin visit_and_statu should raise"""
# overwrite so we raise because we do not find the right visit
mock_storage.return_value = None
with pytest.raises(AssertionError, match="Origin url has no visits"):
assert_last_visit_matches(mock_storage, "url", status="full")
assert mock_storage.called is True
def test_assert_last_visit_matches_wrong_status(mock_storage, mocker):
- """Wrong visit detected should raise AssertionError
-
- """
+ """Wrong visit detected should raise AssertionError"""
expected_status = "partial"
assert ORIGIN_VISIT_STATUS.status != expected_status
with pytest.raises(AssertionError, match="Visit_status has status"):
assert_last_visit_matches(mock_storage, "url", status=expected_status)
assert mock_storage.called is True
def test_assert_last_visit_matches_wrong_type(mock_storage, mocker):
- """Wrong visit detected should raise AssertionError
-
- """
+ """Wrong visit detected should raise AssertionError"""
expected_type = "git"
assert ORIGIN_VISIT.type != expected_type
with pytest.raises(AssertionError, match="Visit has type"):
assert_last_visit_matches(
mock_storage,
"url",
status=ORIGIN_VISIT_STATUS.status,
type=expected_type, # mismatched type will raise
)
assert mock_storage.called is True
def test_assert_last_visit_matches_wrong_snapshot(mock_storage, mocker):
- """Wrong visit detected should raise AssertionError
-
- """
+ """Wrong visit detected should raise AssertionError"""
expected_snapshot_id = hash_to_bytes("e92cc0710eb6cf9efd5b920a8453e1e07157b6cd")
assert ORIGIN_VISIT_STATUS.snapshot != expected_snapshot_id
with pytest.raises(AssertionError, match="Visit_status points to snapshot"):
assert_last_visit_matches(
mock_storage,
"url",
status=ORIGIN_VISIT_STATUS.status,
snapshot=expected_snapshot_id, # mismatched snapshot will raise
)
assert mock_storage.called is True
def test_assert_last_visit_matches(mock_storage, mocker):
- """Correct visit detected should return the visit_status
-
- """
+ """Correct visit detected should return the visit_status"""
visit_type = ORIGIN_VISIT.type
visit_status = ORIGIN_VISIT_STATUS.status
visit_snapshot = ORIGIN_VISIT_STATUS.snapshot
actual_visit_status = assert_last_visit_matches(
mock_storage,
"url",
type=visit_type,
status=visit_status,
snapshot=visit_snapshot,
)
assert actual_visit_status == ORIGIN_VISIT_STATUS
assert mock_storage.called is True
def test_prepare_repository_from_archive_failure():
# does not deal with inexistent archive so raise
assert os.path.exists("unknown-archive") is False
with pytest.raises(subprocess.CalledProcessError, match="exit status 2"):
prepare_repository_from_archive("unknown-archive")
def test_prepare_repository_from_archive(datadir, tmp_path):
archive_name = "0805nexter-1.1.0"
archive_path = os.path.join(str(datadir), f"{archive_name}.tar.gz")
assert os.path.exists(archive_path) is True
tmp_path = str(tmp_path) # deals with path string
repo_url = prepare_repository_from_archive(
archive_path, filename=archive_name, tmp_path=tmp_path
)
expected_uncompressed_archive_path = os.path.join(tmp_path, archive_name)
assert repo_url == f"file://{expected_uncompressed_archive_path}"
assert os.path.exists(expected_uncompressed_archive_path)
def test_prepare_repository_from_archive_no_filename(datadir, tmp_path):
archive_name = "0805nexter-1.1.0"
archive_path = os.path.join(str(datadir), f"{archive_name}.tar.gz")
assert os.path.exists(archive_path) is True
# deals with path as posix path (for tmp_path)
repo_url = prepare_repository_from_archive(archive_path, tmp_path=tmp_path)
tmp_path = str(tmp_path)
expected_uncompressed_archive_path = os.path.join(tmp_path, archive_name)
expected_repo_url = os.path.join(tmp_path, f"{archive_name}.tar.gz")
assert repo_url == f"file://{expected_repo_url}"
# passing along the filename does not influence the on-disk extraction
# just the repo-url computation
assert os.path.exists(expected_uncompressed_archive_path)
def test_encode_target():
assert encode_target(None) is None
for target_alias in ["something", b"something"]:
target = {
"target_type": "alias",
"target": target_alias,
}
actual_alias_encode_target = encode_target(target)
assert actual_alias_encode_target == {
"target_type": "alias",
"target": b"something",
}
for hash_ in [hash_hex, hash_to_bytes(hash_hex)]:
target = {"target_type": "revision", "target": hash_}
actual_encode_target = encode_target(target)
assert actual_encode_target == {
"target_type": "revision",
"target": hash_to_bytes(hash_hex),
}
def test_check_snapshot(swh_storage):
"""Everything should be fine when snapshot is found and the snapshot reference up to the
revision exist in the storage.
"""
# Create a consistent snapshot arborescence tree in storage
found = False
for entry in DIRECTORY.entries:
if entry.target == CONTENT.sha1_git:
found = True
break
assert found is True
assert REVISION.directory == DIRECTORY.id
assert RELEASE.target == REVISION.id
for branch, target in SNAPSHOT.branches.items():
if branch == b"alias":
assert target.target in SNAPSHOT.branches
elif branch == b"evaluation":
# this one does not exist and we are safelisting its check below
continue
else:
assert target.target in [REVISION.id, RELEASE.id]
swh_storage.content_add([CONTENT])
swh_storage.directory_add([DIRECTORY])
swh_storage.revision_add([REVISION])
swh_storage.release_add([RELEASE])
s = swh_storage.snapshot_add([SNAPSHOT])
assert s == {
"snapshot:add": 1,
}
# all should be fine!
check_snapshot(
SNAPSHOT, swh_storage, allowed_empty=[(TargetType.REVISION, b"evaluation")]
)
def test_check_snapshot_failures(swh_storage):
"""Failure scenarios:
0. snapshot parameter is not a snapshot
1. snapshot id is correct but branches mismatched
2. snapshot id is not correct, it's not found in the storage
3. snapshot reference an alias which does not exist
4. snapshot is found in storage, targeted revision does not exist
5. snapshot is found in storage, targeted revision exists but the directory the
revision targets does not exist
6. snapshot is found in storage, target revision exists, targeted directory by the
revision exist. Content targeted by the directory does not exist.
7. snapshot is found in storage, targeted release does not exist
"""
snap_id_hex = "2498dbf535f882bc7f9a18fb16c9ad27fda7bab7"
snapshot = Snapshot(
id=hash_to_bytes(snap_id_hex),
branches={
b"master": SnapshotBranch(
- target=hash_to_bytes(hash_hex), target_type=TargetType.REVISION,
+ target=hash_to_bytes(hash_hex),
+ target_type=TargetType.REVISION,
),
},
)
s = swh_storage.snapshot_add([snapshot])
assert s == {
"snapshot:add": 1,
}
unexpected_snapshot = Snapshot(
branches={
b"tip": SnapshotBranch( # wrong branch
target=hash_to_bytes(hash_hex), target_type=TargetType.RELEASE
)
},
)
# 0. not a Snapshot object, raise!
with pytest.raises(
AssertionError, match="argument 'expected_snapshot' must be a snapshot"
):
check_snapshot(ORIGIN_VISIT, swh_storage)
# 1. snapshot id is correct but branches mismatched
with pytest.raises(AssertionError): # sadly debian build raises only assertion
check_snapshot(attr.evolve(unexpected_snapshot, id=snapshot.id), swh_storage)
# 2. snapshot id is not correct, it's not found in the storage
wrong_snap_id = hash_to_bytes("999666f535f882bc7f9a18fb16c9ad27fda7bab7")
with pytest.raises(AssertionError, match="is not found"):
check_snapshot(attr.evolve(unexpected_snapshot, id=wrong_snap_id), swh_storage)
# 3. snapshot references an inexistent alias
snapshot0 = Snapshot(
id=hash_to_bytes("123666f535f882bc7f9a18fb16c9ad27fda7bab7"),
branches={
- b"alias": SnapshotBranch(target=b"HEAD", target_type=TargetType.ALIAS,),
+ b"alias": SnapshotBranch(
+ target=b"HEAD",
+ target_type=TargetType.ALIAS,
+ ),
},
)
swh_storage.snapshot_add([snapshot0])
with pytest.raises(InconsistentAliasBranchError, match="Alias branch HEAD"):
check_snapshot(snapshot0, swh_storage)
# 4. snapshot is found in storage, targeted revision does not exist
rev_not_found = list(swh_storage.revision_missing([REVISION.id]))
assert len(rev_not_found) == 1
snapshot1 = Snapshot(
id=hash_to_bytes("456666f535f882bc7f9a18fb16c9ad27fda7bab7"),
branches={
- b"alias": SnapshotBranch(target=b"HEAD", target_type=TargetType.ALIAS,),
+ b"alias": SnapshotBranch(
+ target=b"HEAD",
+ target_type=TargetType.ALIAS,
+ ),
b"HEAD": SnapshotBranch(
- target=REVISION.id, target_type=TargetType.REVISION,
+ target=REVISION.id,
+ target_type=TargetType.REVISION,
),
},
)
swh_storage.snapshot_add([snapshot1])
with pytest.raises(InexistentObjectsError, match="Branch/Revision"):
check_snapshot(snapshot1, swh_storage)
# 5. snapshot is found in storage, targeted revision exists but the directory the
# revision targets does not exist
swh_storage.revision_add([REVISION])
dir_not_found = list(swh_storage.directory_missing([REVISION.directory]))
assert len(dir_not_found) == 1
snapshot2 = Snapshot(
id=hash_to_bytes("987123f535f882bc7f9a18fb16c9ad27fda7bab7"),
branches={
- b"alias": SnapshotBranch(target=b"HEAD", target_type=TargetType.ALIAS,),
+ b"alias": SnapshotBranch(
+ target=b"HEAD",
+ target_type=TargetType.ALIAS,
+ ),
b"HEAD": SnapshotBranch(
- target=REVISION.id, target_type=TargetType.REVISION,
+ target=REVISION.id,
+ target_type=TargetType.REVISION,
),
},
)
swh_storage.snapshot_add([snapshot2])
with pytest.raises(InexistentObjectsError, match="Missing directories"):
check_snapshot(snapshot2, swh_storage)
assert DIRECTORY.id == REVISION.directory
swh_storage.directory_add([DIRECTORY])
# 6. snapshot is found in storage, target revision exists, targeted directory by the
# revision exist. Content targeted by the directory does not exist.
assert DIRECTORY.entries[0].target == CONTENT.sha1_git
not_found = list(swh_storage.content_missing_per_sha1_git([CONTENT.sha1_git]))
assert len(not_found) == 1
swh_storage.directory_add([DIRECTORY])
snapshot3 = Snapshot(
id=hash_to_bytes("091456f535f882bc7f9a18fb16c9ad27fda7bab7"),
branches={
- b"alias": SnapshotBranch(target=b"HEAD", target_type=TargetType.ALIAS,),
+ b"alias": SnapshotBranch(
+ target=b"HEAD",
+ target_type=TargetType.ALIAS,
+ ),
b"HEAD": SnapshotBranch(
- target=REVISION.id, target_type=TargetType.REVISION,
+ target=REVISION.id,
+ target_type=TargetType.REVISION,
),
},
)
swh_storage.snapshot_add([snapshot3])
with pytest.raises(InexistentObjectsError, match="Missing content(s)"):
check_snapshot(snapshot3, swh_storage)
# 7. snapshot is found in storage, targeted release does not exist
# release targets the revisions which exists
assert RELEASE.target == REVISION.id
snapshot4 = Snapshot(
id=hash_to_bytes("789666f535f882bc7f9a18fb16c9ad27fda7bab7"),
branches={
- b"alias": SnapshotBranch(target=b"HEAD", target_type=TargetType.ALIAS,),
+ b"alias": SnapshotBranch(
+ target=b"HEAD",
+ target_type=TargetType.ALIAS,
+ ),
b"HEAD": SnapshotBranch(
- target=REVISION.id, target_type=TargetType.REVISION,
+ target=REVISION.id,
+ target_type=TargetType.REVISION,
),
b"release/0.1.0": SnapshotBranch(
- target=RELEASE.id, target_type=TargetType.RELEASE,
+ target=RELEASE.id,
+ target_type=TargetType.RELEASE,
),
},
)
swh_storage.snapshot_add([snapshot4])
with pytest.raises(InexistentObjectsError, match="Branch/Release"):
check_snapshot(snapshot4, swh_storage)
diff --git a/tox.ini b/tox.ini
index d5da68d..0bb69c2 100644
--- a/tox.ini
+++ b/tox.ini
@@ -1,76 +1,77 @@
[tox]
envlist=black,flake8,mypy,py3
[testenv]
extras =
testing
deps =
swh.core[testing]
swh.storage[testing]
swh.scheduler[testing] >= 0.5.0
pytest-cov
dev: pdbpp
commands =
pytest \
!dev: --cov={envsitepackagesdir}/swh/loader/ --cov-branch \
{envsitepackagesdir}/swh/loader/ {posargs}
[testenv:black]
skip_install = true
deps =
- black==19.10b0
+ black==22.3.0
commands =
{envpython} -m black --check swh
[testenv:flake8]
skip_install = true
deps =
- flake8
+ flake8==4.0.1
+ flake8-bugbear==22.3.23
commands =
{envpython} -m flake8
[testenv:mypy]
extras =
testing
deps =
mypy==0.920
commands =
mypy swh
# build documentation outside swh-environment using the current
# git HEAD of swh-docs, is executed on CI for each diff to prevent
# breaking doc build
[testenv:sphinx]
whitelist_externals = make
usedevelop = true
extras =
testing
deps =
# fetch and install swh-docs in develop mode
-e git+https://forge.softwareheritage.org/source/swh-docs#egg=swh.docs
setenv =
SWH_PACKAGE_DOC_TOX_BUILD = 1
# turn warnings into errors
SPHINXOPTS = -W
commands =
make -I ../.tox/sphinx/src/swh-docs/swh/ -C docs
# build documentation only inside swh-environment using local state
# of swh-docs package
[testenv:sphinx-dev]
whitelist_externals = make
usedevelop = true
extras =
testing
deps =
# install swh-docs in develop mode
-e ../swh-docs
setenv =
SWH_PACKAGE_DOC_TOX_BUILD = 1
# turn warnings into errors
SPHINXOPTS = -W
commands =
make -I ../.tox/sphinx-dev/src/swh-docs/swh/ -C docs
