diff --git a/PKG-INFO b/PKG-INFO index c54f91a9..6befffb5 100644 --- a/PKG-INFO +++ b/PKG-INFO @@ -1,10 +1,10 @@ Metadata-Version: 1.0 Name: swh.web.ui -Version: 0.0.73 +Version: 0.0.74 Summary: Software Heritage Web UI Home-page: https://forge.softwareheritage.org/diffusion/DWUI/ Author: Software Heritage developers Author-email: swh-devel@inria.fr License: UNKNOWN Description: UNKNOWN Platform: UNKNOWN diff --git a/debian/changelog b/debian/changelog index f0b17049..dbc2c642 100644 --- a/debian/changelog +++ b/debian/changelog @@ -1,638 +1,640 @@ -swh-web (0.0.73-1~swh1~bpo9+1) stretch-swh; urgency=medium +swh-web (0.0.74-1~swh1) unstable-swh; urgency=medium - * Rebuild for stretch-backports. + * Release swh.web.ui v0.0.74 + * Various interface cleanups for API documentation + * Return Error types in API error return values - -- Nicolas Dandrimont Wed, 01 Feb 2017 22:44:10 +0100 + -- Nicolas Dandrimont Thu, 02 Feb 2017 11:03:56 +0100 swh-web (0.0.73-1~swh1) unstable-swh; urgency=medium * Deploy swh.web.ui v0.0.73 * Add a bazillion of style fixes. -- Nicolas Dandrimont Wed, 01 Feb 2017 22:44:10 +0100 swh-web (0.0.72-1~swh1) unstable-swh; urgency=medium * v0.0.72 * apidoc rendering: Improvments * apidoc: add usual copyright/license/contact footer * apidoc: show status code if != 200 * apidoc: hide /content/known/ from the doc * apidoc: document upcoming v. available in endpoint index * apidoc: vertically distantiate jquery search box and preceding text -- Antoine R. Dumont (@ardumont) Wed, 01 Feb 2017 18:34:56 +0100 swh-web (0.0.71-1~swh1) unstable-swh; urgency=medium * v0.0.71 * add static/robots.txt, disabling crawling of /api/ * re-root content-specific endpoints under /api/1/content/ * fix not converted empty bytes string * /revision/origin/: Make the timestamp default to the most recent visit * api: simplify HTML layout by dropping redundant nav and about page * apidoc: document correctly endpoints /content/known/, * /revision/{origin,origin/log}/ and /stat/counters/ -- Antoine R. Dumont (@ardumont) Wed, 01 Feb 2017 16:23:56 +0100 swh-web (0.0.70-1~swh1) unstable-swh; urgency=medium * v0.0.70 * apidoc: Review documentation for * endpoints (person/release/revision/visit-related/upcoming methods) * apidoc: List only method docstring's first paragraph in endpoint index * apidoc: Render type annotation for optional parameter * apidoc: Improve rendering issues * api: Fix problem in origin visit by type and url lookup -- Antoine R. Dumont (@ardumont) Wed, 01 Feb 2017 11:28:32 +0100 swh-web (0.0.69-1~swh1) unstable-swh; urgency=medium * v0.0.69 * Improve documentation information and rendering -- Antoine R. Dumont (@ardumont) Tue, 31 Jan 2017 14:31:19 +0100 swh-web (0.0.68-1~swh1) unstable-swh; urgency=medium * v0.0.68 * Improve ui with last nitpicks * Remove endpoints not supposed to be displayed -- Antoine R. Dumont (@ardumont) Wed, 25 Jan 2017 13:29:49 +0100 swh-web (0.0.67-1~swh1) unstable-swh; urgency=medium * v0.0.67 * Improve rendering style - pass 4 -- Antoine R. Dumont (@ardumont) Tue, 24 Jan 2017 15:30:58 +0100 swh-web (0.0.66-1~swh1) unstable-swh; urgency=medium * v0.0.66 * Improve rendering style - pass 4 -- Antoine R. Dumont (@ardumont) Tue, 24 Jan 2017 15:24:05 +0100 swh-web (0.0.65-1~swh1) unstable-swh; urgency=medium * v0.0.65 * Unify rendering style with www.s.o - pass 3 -- Antoine R. Dumont (@ardumont) Mon, 23 Jan 2017 19:58:19 +0100 swh-web (0.0.64-1~swh1) unstable-swh; urgency=medium * v0.0.64 * Unify rendering style with www.s.o - pass 2 -- Antoine R. 
Dumont (@ardumont) Mon, 23 Jan 2017 19:28:31 +0100 swh-web (0.0.63-1~swh1) unstable-swh; urgency=medium * v0.0.63 * Unify rendering style with www.s.o - pass 1 -- Antoine R. Dumont (@ardumont) Mon, 23 Jan 2017 16:06:30 +0100 swh-web (0.0.62-1~swh1) unstable-swh; urgency=medium * Release swh-web-ui v0.0.62 * Add flask-limiter to dependencies and wire it in -- Nicolas Dandrimont Fri, 20 Jan 2017 16:29:48 +0100 swh-web (0.0.61-1~swh1) unstable-swh; urgency=medium * v0.0.61 * Fix revision's metadata field limitation -- Antoine R. Dumont (@ardumont) Fri, 20 Jan 2017 15:26:37 +0100 swh-web (0.0.60-1~swh1) unstable-swh; urgency=medium * v0.0.60 * Improve escaping data -- Antoine R. Dumont (@ardumont) Fri, 20 Jan 2017 12:21:22 +0100 swh-web (0.0.59-1~swh1) unstable-swh; urgency=medium * v0.0.59 * Unify pagination on /revision/log/ and /revision/origin/log/ endpoints -- Antoine R. Dumont (@ardumont) Thu, 19 Jan 2017 15:59:06 +0100 swh-web (0.0.58-1~swh1) unstable-swh; urgency=medium * v0.0.58 * Pagination on /api/1/origin/visits/ endpoint -- Antoine R. Dumont (@ardumont) Thu, 19 Jan 2017 14:48:57 +0100 swh-web (0.0.57-1~swh1) unstable-swh; urgency=medium * v0.0.57 * Improve documentation information on api endpoints -- Antoine R. Dumont (@ardumont) Thu, 19 Jan 2017 13:32:56 +0100 swh-web (0.0.56-1~swh1) unstable-swh; urgency=medium * v0.0.56 * Add abilities to display multiple examples on each doc endpoint. -- Antoine R. Dumont (@ardumont) Wed, 18 Jan 2017 14:43:58 +0100 swh-web (0.0.55-1~swh1) unstable-swh; urgency=medium * v0.0.55 * api /content/search/ to /content/known/ * Adapt return values to empty list/dict instead of null * Remove empty values when mono-values are null * Fix broken entity endpoint * Update upcoming endpoints * apidoc: Remove hard-coded example and provide links to follow -- Antoine R. Dumont (@ardumont) Wed, 18 Jan 2017 11:27:45 +0100 swh-web (0.0.54-1~swh1) unstable-swh; urgency=medium * v0.0.54 * Improve documentation description and browsability * Fix css style -- Antoine R. Dumont (@ardumont) Mon, 16 Jan 2017 17:18:21 +0100 swh-web (0.0.53-1~swh1) unstable-swh; urgency=medium * v0.0.53 * apidoc: Update upcoming and hidden endpoints information * apidoc: Enrich route information with tags * apidoc: /api/1/revision/origin/log/: Add pagination explanation * apidoc: /api/1/revision/log/: Add pagination explanation * api: Fix filtering fields to work in depth -- Antoine R. Dumont (@ardumont) Fri, 13 Jan 2017 17:33:01 +0100 swh-web (0.0.52-1~swh1) unstable-swh; urgency=medium * v0.0.52 * Fix doc generation regarding arg and exception * Fix broken examples * Add missing documentation on not found origin visit -- Antoine R. Dumont (@ardumont) Thu, 12 Jan 2017 17:38:59 +0100 swh-web (0.0.51-1~swh1) unstable-swh; urgency=medium * v0.0.51 * Update configuration file from ini to yml -- Antoine R. Dumont (@ardumont) Fri, 16 Dec 2016 13:27:08 +0100 swh-web (0.0.50-1~swh1) unstable-swh; urgency=medium * v0.0.50 * Fix issue regarding data structure change in ctags' reading api endpoint -- Antoine R. Dumont (@ardumont) Tue, 06 Dec 2016 16:08:01 +0100 swh-web (0.0.49-1~swh1) unstable-swh; urgency=medium * v0.0.49 * Rendering improvments -- Antoine R. Dumont (@ardumont) Thu, 01 Dec 2016 16:29:31 +0100 swh-web (0.0.48-1~swh1) unstable-swh; urgency=medium * v0.0.48 * Fix api doc example to actual existing data * Improve search symbol view experience -- Antoine R. 
Dumont (@ardumont) Thu, 01 Dec 2016 15:32:44 +0100 swh-web (0.0.47-1~swh1) unstable-swh; urgency=medium * v0.0.47 * Improve search content ui (add datatable) * Improve search symbol ui (add datatable without pagination, with * multi-field search) * Split those views to improve readability -- Antoine R. Dumont (@ardumont) Thu, 01 Dec 2016 11:57:16 +0100 swh-web (0.0.46-1~swh1) unstable-swh; urgency=medium * v0.0.46 * Improve search output view on symbols -- Antoine R. Dumont (@ardumont) Wed, 30 Nov 2016 17:45:40 +0100 swh-web (0.0.45-1~swh1) unstable-swh; urgency=medium * v0.0.45 * Migrate search symbol api endpoint to strict equality search * Improve search symbol view result (based on that api) to navigate * through result * Permit to slice result per page with per page flag (limited to 100) * Unify behavior in renderer regarding pagination computation -- Antoine R. Dumont (@ardumont) Wed, 30 Nov 2016 11:00:49 +0100 swh-web (0.0.44-1~swh1) unstable-swh; urgency=medium * v0.0.44 * Rename appropriately /api/1/symbol to /api/1/content/symbol/ * Improve documentation on /api/1/content/symbol/ api endpoint -- Antoine R. Dumont (@ardumont) Tue, 29 Nov 2016 15:00:14 +0100 swh-web (0.0.43-1~swh1) unstable-swh; urgency=medium * v0.0.43 * Improve edge case when looking for ctags symbols * Add a lookup ui to search through symbols -- Antoine R. Dumont (@ardumont) Mon, 28 Nov 2016 16:42:33 +0100 swh-web (0.0.42-1~swh1) unstable-swh; urgency=medium * v0.0.42 * List ctags line as link to content in /browse/content/ view -- Antoine R. Dumont (@ardumont) Fri, 25 Nov 2016 16:21:12 +0100 swh-web (0.0.41-1~swh1) unstable-swh; urgency=medium * v0.0.41 * Improve browse content view by: * adding new information (license, mimetype, language) * highlighting source code -- Antoine R. Dumont (@ardumont) Fri, 25 Nov 2016 14:52:34 +0100 swh-web (0.0.40-1~swh1) unstable-swh; urgency=medium * v0.0.40 * Add pagination to symbol search endpoint -- Antoine R. Dumont (@ardumont) Thu, 24 Nov 2016 14:23:45 +0100 swh-web (0.0.39-1~swh1) unstable-swh; urgency=medium * v0.0.39 * Open /api/1/symbol// * Fix api breaking on /api/1/content/search/ -- Antoine R. Dumont (@ardumont) Thu, 24 Nov 2016 10:28:42 +0100 swh-web (0.0.38-1~swh1) unstable-swh; urgency=medium * v0.0.38 * Minor refactoring * Remove one commit which breaks production -- Antoine R. Dumont (@ardumont) Tue, 22 Nov 2016 16:26:03 +0100 swh-web (0.0.37-1~swh1) unstable-swh; urgency=medium * v0.0.37 * api: Open new endpoints on license, language, filetype * api: Update content endpoint to add url on new endpoints -- Antoine R. Dumont (@ardumont) Tue, 22 Nov 2016 15:04:07 +0100 swh-web (0.0.36-1~swh1) unstable-swh; urgency=medium * v0.0.36 * Adapt to latest origin_visit format -- Antoine R. Dumont (@ardumont) Thu, 08 Sep 2016 15:24:33 +0200 swh-web (0.0.35-1~swh1) unstable-swh; urgency=medium * v0.0.35 * Open /api/1/provenance// api endpoint * Open /api/1/origin//visits/() api endpoint * View: Fix redirection url issue -- Antoine R. Dumont (@ardumont) Mon, 05 Sep 2016 14:28:33 +0200 swh-web (0.0.34-1~swh1) unstable-swh; urgency=medium * v0.0.34 * Improve global ui navigation * Fix apidoc rendering issue * Open /api/1/provenance/ about content provenant information -- Antoine R. 
Dumont (@ardumont) Fri, 02 Sep 2016 11:42:04 +0200 swh-web (0.0.33-1~swh1) unstable-swh; urgency=medium * Release swh.web.ui v0.0.33 * New declarative API documentation mechanisms -- Nicolas Dandrimont Wed, 24 Aug 2016 16:25:24 +0200 swh-web (0.0.32-1~swh1) unstable-swh; urgency=medium * v0.0.32 * Activate tests during debian packaging * Fix issues on debian packaging * Fix useless jquery loading url * Improve date time parsing -- Antoine R. Dumont (@ardumont) Wed, 20 Jul 2016 12:35:09 +0200 swh-web (0.0.31-1~swh1) unstable-swh; urgency=medium * v0.0.31 * Unify jquery-flot library names with .min -- Antoine R. Dumont (@ardumont) Mon, 18 Jul 2016 11:11:59 +0200 swh-web (0.0.30-1~swh1) unstable-swh; urgency=medium * v0.0.30 * View: Open calendar ui view on origin * API: open /api/1/stat/visits// -- Antoine R. Dumont (@ardumont) Wed, 13 Jul 2016 18:42:40 +0200 swh-web (0.0.29-1~swh1) unstable-swh; urgency=medium * Release swh.web.ui v0.0.29 * All around enhancements of the web ui * Package now tested when building -- Nicolas Dandrimont Tue, 14 Jun 2016 17:58:42 +0200 swh-web (0.0.28-1~swh1) unstable-swh; urgency=medium * v0.0.28 * Fix packaging issues -- Antoine R. Dumont (@ardumont) Mon, 09 May 2016 16:21:04 +0200 swh-web (0.0.27-1~swh1) unstable-swh; urgency=medium * v0.0.27 * Fix packaging issue -- Antoine R. Dumont (@ardumont) Tue, 03 May 2016 16:52:40 +0200 swh-web (0.0.24-1~swh1) unstable-swh; urgency=medium * Release swh.web.ui v0.0.24 * New swh.storage API for timestamps -- Nicolas Dandrimont Fri, 05 Feb 2016 12:07:33 +0100 swh-web (0.0.23-1~swh1) unstable-swh; urgency=medium * v0.0.23 * Bump dependency requirements to latest swh.storage * Returns person's identifier on api + Hide person's emails in views endpoint * Try to decode the content's raw data and fail gracefully * Unify /directory api to Display content's raw data when path resolves to a file * Expose unconditionally the link to download the content's raw data * Download link data redirects to the api ones -- Antoine R. Dumont (@ardumont) Fri, 29 Jan 2016 17:50:31 +0100 swh-web (0.0.22-1~swh1) unstable-swh; urgency=medium * v0.0.22 * Open /browse/revision/origin/[/branch/][/ts/] /history// view * Open /browse/revision/origin/[/branch/][/ts/] / view * Open /browse/revision//history//directory/[] view * Open /browse/revision/origin/[/branch/][/ts/] /history//directory/[] view * Open /browse/revision/origin/[/branch/][/ts/] /directory/[] view * Open /browse/revision//directory// view * Open /browse/revision//history// view * Open /browse/revision//log/ view * Open /browse/entity// view * Release can point to other objects than revision * Fix misbehavior when retrieving git log * Fix another edge case when listing a directory that does not exist * Fix edge case when listing is empty * Fix person_get call * Update documentation about possible error codes -- Antoine R. Dumont (@ardumont) Tue, 26 Jan 2016 15:14:35 +0100 swh-web (0.0.21-1~swh1) unstable-swh; urgency=medium * v0.0.21 * Deal nicely with communication downtime with storage * Update to latest swh.storage api -- Antoine R. Dumont (@ardumont) Wed, 20 Jan 2016 16:31:34 +0100 swh-web (0.0.20-1~swh1) unstable-swh; urgency=medium * v0.0.20 * Open /api/1/entity// -- Antoine R. Dumont (@ardumont) Fri, 15 Jan 2016 16:40:56 +0100 swh-web (0.0.19-1~swh1) unstable-swh; urgency=medium * v0.0.19 * Improve directory_get_by_path integration with storage * Refactor - Only lookup sha1_git_root if needed + factorize service behavior -- Antoine R. 
Dumont (@ardumont) Fri, 15 Jan 2016 12:47:39 +0100 swh-web (0.0.18-1~swh1) unstable-swh; urgency=medium * v0.0.18 * Open /api/1/revision/origin/[/branch/][/ts/]/ history//directory/[] * origin/master Open /api/1/revision/origin/[/branch/][/ts/]/ history// * Open /api/1/revision/origin/[/branch/][/ts/]/ directory/[] * Open /api/1/revision/origin//branch//ts// * /directory/ apis can now point to files too. * Bump dependency requirement on latest swh.storage * Deactivate api querying occurrences for now * Improve function documentation -- Antoine R. Dumont (@ardumont) Wed, 13 Jan 2016 12:54:54 +0100 swh-web (0.0.17-1~swh1) unstable-swh; urgency=medium * v0.0.17 * Open /api/1/revision//directory/' * Open /api/1/revision//history//directory/ / * Enrich directory listing with url to next subdir * Improve testing coverage * Open 'limit' get query parameter to revision_log and revision_history api -- Antoine R. Dumont (@ardumont) Fri, 08 Jan 2016 11:36:55 +0100 swh-web (0.0.16-1~swh1) unstable-swh; urgency=medium * v0.0.16 * service.lookup_revision_log: Add a limit to the number of commits * Fix docstring rendering -- Antoine R. Dumont (@ardumont) Wed, 06 Jan 2016 15:37:21 +0100 swh-web (0.0.15-1~swh1) unstable-swh; urgency=medium * v0.0.15 * Improve browsable api rendering style * Fix typo in jquery.min.js link * Fix docstring typos * packaging: * add python3-flask-api as package dependency -- Antoine R. Dumont (@ardumont) Wed, 06 Jan 2016 15:12:04 +0100 swh-web (0.0.14-1~swh1) unstable-swh; urgency=medium * v0.0.14 * Open /revision//history// * Add links to api * Improve browsable api rendering -> when api links exists, actual html links will be displayed * Fix production bugs (regarding browsable api) -- Antoine R. Dumont (@ardumont) Wed, 06 Jan 2016 11:42:18 +0100 swh-web (0.0.13-1~swh1) unstable-swh; urgency=medium * v0.0.13 * Open /browse/person/ view * Open /browse/origin/ view * Open /browse/release/ view * Open /browse/revision/ view * Deactivate temporarily /browse/content/ * Add default sha1 * Automatic doc endpoint on base path -- Antoine R. Dumont (@ardumont) Tue, 15 Dec 2015 17:01:27 +0100 swh-web (0.0.12-1~swh1) unstable-swh; urgency=medium * v0.0.12 * Update /api/1/release/ with latest internal standard * Update /api/1/revision/ with latest internal standard * Add global filtering on 'fields' parameter * Update /api/1/content/ with links to raw resource * Improve documentations * Open /api/1/revision//log/ * Open /browse/directory/ to list directory content * Open /browse/content// to show the content * Open /browse/content//raw to show the content * Open /api/1/person/ * Implementation detail * Add Flask API dependency * Split controller in api and views module * Unify internal apis' behavior -- Antoine R. Dumont (@ardumont) Mon, 07 Dec 2015 16:44:43 +0100 swh-web (0.0.11-1~swh1) unstable-swh; urgency=medium * v0.0.11 * Open /1/api/content// * Open /api/1/revision/ * Open /api/1/release/ * Open /api/1/uploadnsearch/ (POST) * Open /api/1/origin/ * Unify 404 and 400 responses on api * Increase code coverage -- Antoine R. 
Dumont (@ardumont) Thu, 19 Nov 2015 11:24:46 +0100 swh-web (0.0.10-1~swh1) unstable-swh; urgency=medium * v0.0.10 * set document.domain to parent domain softwareheritage.org * improve HTML templates to be (more) valid * cosmetic change in Content-Type JSON header -- Stefano Zacchiroli Mon, 02 Nov 2015 13:59:45 +0100 swh-web (0.0.9-1~swh1) unstable-swh; urgency=medium * v0.0.9 * Remove query entry in api response * Deal with bad request properly with api calls * Improve coverage * Improve dev starting up app * Fix duplicated print statement in dev app startup -- Antoine R. Dumont (@ardumont) Fri, 30 Oct 2015 17:24:15 +0100 swh-web (0.0.8-1~swh1) unstable-swh; urgency=medium * version 0.0.8 -- Stefano Zacchiroli Wed, 28 Oct 2015 20:59:40 +0100 swh-web (0.0.7-1~swh1) unstable-swh; urgency=medium * v0.0.7 * Add @jsonp abilities to /api/1/stat/counters endpoint -- Antoine R. Dumont (@ardumont) Mon, 19 Oct 2015 14:01:40 +0200 swh-web (0.0.4-1~swh1) unstable-swh; urgency=medium * Prepare swh.web.ui v0.0.4 deployment -- Nicolas Dandrimont Fri, 16 Oct 2015 15:38:44 +0200 swh-web (0.0.3-1~swh1) unstable-swh; urgency=medium * Prepare deployment of swh-web-ui v0.0.3 -- Nicolas Dandrimont Wed, 14 Oct 2015 11:09:33 +0200 swh-web (0.0.2-1~swh1) unstable-swh; urgency=medium * Prepare swh.web.ui v0.0.2 deployment -- Nicolas Dandrimont Tue, 13 Oct 2015 16:25:46 +0200 swh-web (0.0.1-1~swh1) unstable-swh; urgency=medium * Initial release * v0.0.1 * Hash lookup to check existence in swh's backend * Hash lookup to detail a content -- Antoine R. Dumont (@ardumont) Thu, 01 Oct 2015 10:01:29 +0200 diff --git a/swh.web.ui.egg-info/PKG-INFO b/swh.web.ui.egg-info/PKG-INFO index c54f91a9..6befffb5 100644 --- a/swh.web.ui.egg-info/PKG-INFO +++ b/swh.web.ui.egg-info/PKG-INFO @@ -1,10 +1,10 @@ Metadata-Version: 1.0 Name: swh.web.ui -Version: 0.0.73 +Version: 0.0.74 Summary: Software Heritage Web UI Home-page: https://forge.softwareheritage.org/diffusion/DWUI/ Author: Software Heritage developers Author-email: swh-devel@inria.fr License: UNKNOWN Description: UNKNOWN Platform: UNKNOWN diff --git a/swh/web/ui/renderers.py b/swh/web/ui/renderers.py index aee3cb46..406cd071 100644 --- a/swh/web/ui/renderers.py +++ b/swh/web/ui/renderers.py @@ -1,286 +1,289 @@ # Copyright (C) 2015-2017 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information import re import yaml import json from docutils.core import publish_parts from docutils.writers.html4css1 import Writer, HTMLTranslator from inspect import cleandoc from jinja2 import escape, Markup from flask import request, Response, render_template from flask import g from pygments import highlight from pygments.lexers import guess_lexer from pygments.formatters import HtmlFormatter from swh.web.ui import utils class SWHFilterEnricher(): """Global filter on fields. """ @classmethod def filter_by_fields(cls, data): """Extract a request parameter 'fields' if it exists to permit the filtering on the data dict's keys. If such field is not provided, returns the data as is. """ fields = request.args.get('fields') if fields: fields = set(fields.split(',')) data = utils.filter_field_keys(data, fields) return data class SWHComputeLinkHeader: """Add link header to response. 
Mixin intended to be used for example in SWHMultiResponse """ @classmethod def compute_link_header(cls, rv, options): """Add Link header in returned value results. Expects rv to be a dict with 'results' and 'headers' key: 'results': the returned value expected to be shown 'headers': dictionary with link-next and link-prev Args: rv (dict): with keys: - 'headers': potential headers with 'link-next' and 'link-prev' keys - 'results': containing the result to return options (dict): the initial dict to update with result if any Returns: Dict with optional keys 'link-next' and 'link-prev'. """ link_headers = [] if 'headers' not in rv: return {} rv_headers = rv['headers'] if 'link-next' in rv_headers: link_headers.append('<%s>; rel="next"' % ( rv_headers['link-next'])) if 'link-prev' in rv_headers: link_headers.append('<%s>; rel="previous"' % ( rv_headers['link-prev'])) if link_headers: link_header_str = ','.join(link_headers) headers = options.get('headers', {}) headers.update({ 'Link': link_header_str }) return headers return {} class SWHTransformProcessor: """Transform an eventual returned value with multiple layer of information with only what's necessary. If the returned value rv contains the 'results' key, this is the associated value which is returned. Otherwise, return the initial dict without the potential 'headers' key. """ @classmethod def transform(cls, rv): if 'results' in rv: return rv['results'] if 'headers' in rv: rv.pop('headers') return rv class SWHMultiResponse(Response, SWHFilterEnricher, SWHComputeLinkHeader, SWHTransformProcessor): """ A Flask Response subclass. Override force_type to transform dict/list responses into callable Flask response objects whose mimetype matches the request's Accept header: HTML template render, YAML dump or default to a JSON dump. """ @classmethod def make_response_from_mimetype(cls, rv, options={}): options = options.copy() if not (isinstance(rv, list) or isinstance(rv, dict)): return rv def wants_html(best_match): return best_match == 'text/html' and \ request.accept_mimetypes[best_match] > \ request.accept_mimetypes['application/json'] def wants_yaml(best_match): return best_match == 'application/yaml' and \ request.accept_mimetypes[best_match] > \ request.accept_mimetypes['application/json'] acc_mime = ['application/json', 'application/yaml', 'text/html'] best_match = request.accept_mimetypes.best_match(acc_mime) options['headers'] = cls.compute_link_header(rv, options) rv = cls.transform(rv) rv = cls.filter_by_fields(rv) if wants_html(best_match): data = json.dumps(rv, sort_keys=True, indent=4, separators=(',', ': ')) env = g.get('doc_env', {}) env['response_data'] = data env['headers_data'] = None if options and 'headers' in options: env['headers_data'] = options['headers'] env['request'] = request env['heading'] = utils.shorten_path(str(request.path)) env['status_code'] = options.get('status', 200) rv = Response(render_template('apidoc.html', **env), content_type='text/html', **options) elif wants_yaml(best_match): rv = Response( yaml.dump(rv), content_type='application/yaml', **options) else: # jsonify is unhappy with lists in Flask 0.10.1, use json.dumps rv = Response( json.dumps(rv), content_type='application/json', **options) return rv @classmethod def force_type(cls, rv, environ=None): if isinstance(rv, dict) or isinstance(rv, list): rv = cls.make_response_from_mimetype(rv) return super().force_type(rv, environ) def error_response(error_code, error): """Private function to create a custom error response. 
""" error_opts = {'status': error_code} - error_data = {'error': str(error)} + error_data = { + 'exception': error.__class__.__name__, + 'reason': str(error), + } return SWHMultiResponse.make_response_from_mimetype(error_data, options=error_opts) def urlize_api_links(text): """Utility function for decorating api links in browsable api. Args: text: whose content matching links should be transformed into contextual API or Browse html links. Returns The text transformed if any link is found. The text as is otherwise. """ return re.sub(r'(/api/.*/|/browse/.*/)', r'\1', str(escape(text))) def urlize_header_links(text): """Utility function for decorating headers links in browsable api. Args text: Text whose content contains Link header value Returns: The text transformed with html link if any link is found. The text as is otherwise. """ return re.sub(r'<(/api/.*|/browse/.*)>', r'<\1>', text) class NoHeaderHTMLTranslator(HTMLTranslator): """ Docutils translator subclass to customize the generation of HTML from reST-formatted docstrings """ def __init__(self, document): super().__init__(document) self.body_prefix = [] self.body_suffix = [] def visit_bullet_list(self, node): self.context.append((self.compact_simple, self.compact_p)) self.compact_p = None self.compact_simple = self.is_compactable(node) self.body.append(self.starttag(node, 'ul', CLASS='docstring')) DOCSTRING_WRITER = Writer() DOCSTRING_WRITER.translator_class = NoHeaderHTMLTranslator def safe_docstring_display(docstring): """ Utility function to htmlize reST-formatted documentation in browsable api. """ docstring = cleandoc(docstring) return publish_parts(docstring, writer=DOCSTRING_WRITER)['html_body'] def revision_id_from_url(url): """Utility function to obtain a revision's ID from its browsing URL.""" return re.sub(r'/browse/revision/([0-9a-f]{40}|[0-9a-f]{64})/.*', r'\1', url) def highlight_source(source_code_as_text): """Leverage pygments to guess and highlight source code. Args source_code_as_text (str): source code in plain text Returns: Highlighted text if possible or plain text otherwise """ try: maybe_lexer = guess_lexer(source_code_as_text) if maybe_lexer: r = highlight( source_code_as_text, maybe_lexer, HtmlFormatter(linenos=True, lineanchors='l', anchorlinenos=True)) else: r = '
<pre>%s</pre>' % source_code_as_text except: r = '<pre>%s</pre>
' % source_code_as_text return Markup(r) diff --git a/swh/web/ui/templates/api-endpoints.html b/swh/web/ui/templates/api-endpoints.html index f81fd2ce..8a84fd7f 100644 --- a/swh/web/ui/templates/api-endpoints.html +++ b/swh/web/ui/templates/api-endpoints.html @@ -1,69 +1,69 @@ {% extends "layout.html" %} {% block title %} Endpoints – Software Heritage API {% endblock %} {% block content %}

Below you can find a list of the available endpoints for version 1 of the Software Heritage API. For a more general introduction please refer to the API overview.

Endpoints marked "available" are considered stable for the current version of the API; endpoints marked "upcoming" are work in progress that will be - stabilized in the near feature. + stabilized in the near future.

{% for route, doc in doc_routes %} {% if 'tags' in doc and doc['tags'] is not none %} {% else %} {% endif %} {% set doc_intro = doc['docstring'].split('\n\n')[0] %} {% endfor %}
Endpoint Status Description
{{ route }} {{ ', '.join(doc['tags']) }}{{ route }} available{{ doc_intro | safe_docstring_display | safe }}
{% endblock %} diff --git a/swh/web/ui/templates/apidoc.html b/swh/web/ui/templates/apidoc.html index 5688ca1a..b5310331 100644 --- a/swh/web/ui/templates/apidoc.html +++ b/swh/web/ui/templates/apidoc.html @@ -1,128 +1,136 @@ {% extends "layout.html" %} {% block title %}{{ heading }} – Software Heritage API {% endblock %} {% block content %} {% if docstring %}

Description

{{ docstring | safe_docstring_display | safe }}
{% endif %} {% if response_data and response_data is not none %}

Request

{{ request.method }} {{ request.url }}
+

Response

{% if status_code != 200 %}

Status Code

{{ status_code }}
{% endif %} {% if headers_data and headers_data is not none %}

Headers

{% for header_name, header_value in headers_data.items() %}
{{ header_name }} {{ header_value | urlize_header_links | safe }}
{% endfor %} {% endif %}

Body

{{ response_data | urlize_api_links | safe }}
{% endif %}
+{% if urls and urls|length > 0 %}
{% for url in urls %} {% endfor %}
URL Allowed Methods
{{ url['rule'] }} {{ url['methods'] | sort | join(', ') }}

+{% endif %} {% if args and args|length > 0 %}
-

Args

+

Arguments

{% for arg in args %}
{{ arg['name'] }}: {{ arg['type'] }}
{{ arg['doc'] | safe_docstring_display | safe }}
{% endfor %}
-
+ +
{% endif %} {% if params and params|length > 0 %}
-

Params

+

Parameters

{% for param in params %}
{{ param['name'] }}: {{ param['type'] }}
{{ param['doc'] | safe_docstring_display | safe }}
{% endfor %}
-
-{% endif %} -{% if excs and excs|length > 0 %} -
-

Raises

-
- {% for exc in excs %} -
{{ exc['exc'] }}
-
{{ exc['doc'] | safe_docstring_display | safe }}
- {% endfor %} -
+
{% endif %} {% if headers %}

Headers

{% for header in headers %}
{{ header['name'] }}: string
{{ header['doc'] | safe_docstring_display | safe }}
{% endfor %}
+
{% endif %} {% if return %}

Returns

{{ return['type'] }}
{{ return['doc'] | safe_docstring_display | safe }}
+
+{% endif %} +{% if excs and excs|length > 0 %} +
+

Errors

+
+ {% for exc in excs %} +
{{ exc['exc'] }}
+
{{ exc['doc'] | safe_docstring_display | safe }}
+ {% endfor %} +
+
+
{% endif %} {% if examples %}

Examples

{% for example in examples %}
{{ example }}
{% endfor %}
{% endif %} {% endblock %} diff --git a/swh/web/ui/templates/includes/apidoc-header.html b/swh/web/ui/templates/includes/apidoc-header.html index a76d8064..2668dbb7 100644 --- a/swh/web/ui/templates/includes/apidoc-header.html +++ b/swh/web/ui/templates/includes/apidoc-header.html @@ -1,104 +1,105 @@

This document describes the Software Heritage Web API.

Endpoint index

You can jump directly to the endpoint index, which lists all available API functionalities, or read on for more general information about the API.

Data model

The Software Heritage project harvests publicly available source code by tracking software distribution channels such as version control systems, tarball releases, and distribution packages.

All retrieved source code and related metadata are stored in the Software Heritage archive, that is conceptually a Merkle DAG. All nodes in the graph are content-addressable, i.e., their node identifiers are computed by hashing their content and, transitively, that of all nodes reachable from them; and no node or edge is ever removed from the graph: the Software Heritage archive is an append-only data structure.
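As an illustration of content-addressing, here is a minimal sketch (not part of the archive code; the helper name is hypothetical) of how a content object's Git-like "salted" SHA1 can be derived from its raw bytes:

```python
import hashlib

def git_like_sha1(data: bytes) -> str:
    # Hypothetical illustration helper: hash a "blob <length>\0" header
    # followed by the raw bytes; this is what makes content objects
    # content-addressable.
    header = b'blob %d\x00' % len(data)
    return hashlib.sha1(header + data).hexdigest()

# Same identifier that `git hash-object` computes for this blob.
print(git_like_sha1(b'hello world\n'))
# 3b18e512dba79e4c8300dd08aeb37f8e728b8dad
```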

The following types of objects (i.e., graph nodes) can be found in the Software Heritage archive (for more information see the Software Heritage glossary):

  • Content: a specific version of a file stored in the archive, identified by its cryptographic hashes (currently: SHA1, Git-like "salted" SHA1, SHA256). Note that content objects are nameless; their names are context-dependent and stored as part of directory entries (see below).
    Also known as: "blob"
  • Directory: a list of directory entries, where each entry can point to content objects ("file entries"), revisions ("revision entries"), or transitively to other directories ("directory entries"). All entries are associated to the local name of the entry (i.e., a relative path without any path separator) and permission metadata (e.g., chmod value or equivalent).
  • Revision: a point in time snapshot of the content of a directory, together with associated development metadata (e.g., author, timestamp, log message, etc).
    Also known as: "commit".
  • Release: a revision that has been marked as noteworthy with a specific name (e.g., a version number), together with associated development metadata (e.g., author, timestamp, etc).
    Also known as: "tag"
  • Origin: an Internet-based location from which a coherent set of objects (contents, revisions, releases, etc.) archived by Software Heritage has been obtained. Origins are currently identified by URLs.
  • Visit: the passage of Software Heritage on a given origin, to retrieve all source code and metadata available there at the time. A visit object stores the state of all visible branches (if any) available at the origin at visit time; each of them points to a revision object in the archive. Future visits of the same origin will create new visit objects, without removing previous ones.
  • Person: an entity referenced by a revision as either the author or the committer of the corresponding change. A person is associated to a full name and/or an email address.

Version

The current version of the API is v1.

Schema

API access is over HTTPS.

All API endpoints are rooted at https://archive.softwareheritage.org/api/1/.

Data is sent and received as JSON by default.

Example:

  • from the command line:

    curl -i https://archive.softwareheritage.org/api/1/stat/counters/

Response format override

The response format can be overridden using the Accept request header. In particular, Accept: text/html (that web browsers send by default) requests HTML pretty-printing, whereas Accept: application/yaml requests YAML-encoded responses.

Example:

  • /api/1/stat/counters/
  • from the command line:

    curl -i -H 'Accept: application/yaml' https://archive.softwareheritage.org/api/1/stat/counters/
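The same override can be exercised programmatically; a minimal sketch, assuming the third-party `requests` package:

```python
import requests

base = 'https://archive.softwareheritage.org/api/1'

# Default representation: JSON
r = requests.get(base + '/stat/counters/')
print(r.headers.get('Content-Type'))   # application/json
print(r.json())

# YAML, requested via the Accept header
r = requests.get(base + '/stat/counters/',
                 headers={'Accept': 'application/yaml'})
print(r.headers.get('Content-Type'))   # application/yaml
print(r.text)
```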

Parameters

Some API endpoints can be tweaked by passing optional parameters. For GET requests, optional parameters can be passed as an HTTP query string.

The optional parameter fields is accepted by all endpoints that return dictionaries and can be used to restrict the list of fields returned by the API, in case you are not interested in all of them. By default, all available fields are returned.

Example:

  • /api/1/stat/counters/?fields=content,directory,revision
  • from the command line:

    curl https://archive.softwareheritage.org/api/1/stat/counters/?fields=content,directory,revision

Errors

While API endpoints will return different kinds of errors depending on their own semantics, some error patterns are common across all endpoints.

Sending malformed data, including syntactically incorrect object identifiers, will result in a 400 Bad Request HTTP response. Example:

  • /api/1/content/deadbeef/ (client error: "deadbeef" is too short to be a syntactically valid object identifier)
  • from the command line:

    curl -i https://archive.softwareheritage.org/api/1/content/deadbeef/

Requesting non-existent resources will result in a 404 Not Found HTTP response. Example:

  • /api/1/content/0123456789abcdef0123456789abcdef01234567/ (error: no object with that identifier is available [yet?])
  • from the command line:

    curl -i https://archive.softwareheritage.org/api/1/content/04740277a81c5be6c16f6c9da488ca073b770d7f/

+

Unavailability of the underlying storage backend will result in a 503 Service Unavailable HTTP response.
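As of this release, error bodies carry the exception type alongside the reason (instead of a single error string). A minimal client-side sketch, assuming the `requests` package; the exception name shown is only an example:

```python
import requests

r = requests.get('https://archive.softwareheritage.org/api/1/content/deadbeef/')
print(r.status_code)       # 400: "deadbeef" is not a valid identifier
body = r.json()
print(body['exception'])   # e.g. 'BadInputExc'
print(body['reason'])      # human-readable explanation of the failure
```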

Pagination

Requests that might potentially return many items will be paginated.

Page size is set to a default (usually: 10 items), but might be overridden with the per_page query parameter up to a maximum (usually: 50 items). Example:

curl https://archive.softwareheritage.org/api/1/origin/1/visits/?per_page=2

To navigate through paginated results, a Link HTTP response header is available to link the current result page to the next one. Example:

curl -i https://archive.softwareheritage.org/api/1/origin/1/visits/?per_page=2 | grep ^Link:
 Link: </api/1/origin/1/visits/?last_visit=2&per_page=2>; rel="next",
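To follow such links from a client, a minimal sketch (assuming the `requests` package; note the Link target may be relative to the site root):

```python
import requests
from urllib.parse import urljoin

url = 'https://archive.softwareheritage.org/api/1/origin/1/visits/?per_page=2'
while url:
    r = requests.get(url)
    r.raise_for_status()
    for visit in r.json():
        print(visit['visit'], visit['date'])
    # requests parses the Link header; 'next' is absent on the last page.
    nxt = r.links.get('next', {}).get('url')
    url = urljoin(url, nxt) if nxt else None
```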

Rate limiting

Due to limited resource availability on the back end side, API usage is currently rate limited. Furthermore, as API usage is currently entirely anonymous (i.e., without any authentication), API "users" are currently identified by their origin IP address.

Three HTTP response fields will inform you about the current state of limits that apply to your current rate limiting bucket:

  • X-RateLimit-Limit: maximum number of permitted requests per hour
  • X-RateLimit-Remaining: number of permitted requests remaining before the next reset
  • X-RateLimit-Reset: the time (expressed in Unix time seconds) at which the current rate limiting will expire, resetting to a fresh X-RateLimit-Limit

Example:

curl -i https://archive.softwareheritage.org/api/1/stat/counters/ | grep ^X-RateLimit
 X-RateLimit-Limit: 60
 X-RateLimit-Remaining: 54
 X-RateLimit-Reset: 1485794532
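A minimal sketch (assuming the `requests` package) of honouring these headers by pausing until the bucket resets:

```python
import time
import requests

r = requests.get('https://archive.softwareheritage.org/api/1/stat/counters/')
remaining = int(r.headers.get('X-RateLimit-Remaining', '1'))
reset_at = int(r.headers.get('X-RateLimit-Reset', '0'))
if remaining == 0:
    # The reset time is expressed in Unix seconds.
    time.sleep(max(0, reset_at - int(time.time())))
```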
diff --git a/swh/web/ui/templates/includes/apidoc-header.md b/swh/web/ui/templates/includes/apidoc-header.md index c5e833d9..a9c30b4f 100644 --- a/swh/web/ui/templates/includes/apidoc-header.md +++ b/swh/web/ui/templates/includes/apidoc-header.md @@ -1,192 +1,195 @@ This document describes the Software Heritage Web API. ### Endpoint index You can jump directly to the endpoint index, which lists all available API functionalities, or read on for more general information about the API. ### Data model The [Software Heritage](https://www.softwareheritage.org/) project harvests publicly available source code by tracking software distribution channels such as version control systems, tarball releases, and distribution packages. All retrieved source code and related metadata are stored in the Software Heritage archive, that is conceptually a [Merkle DAG](https://en.wikipedia.org/wiki/Merkle_tree). All nodes in the graph are content-addressable, i.e., their node identifiers are computed by hashing their content and, transitively, that of all nodes reachable from them; and no node or edge is ever removed from the graph: the Software Heritage archive is an append-only data structure. The following types of objects (i.e., graph nodes) can be found in the Software Heritage archive (for more information see the [Software Heritage glossary](https://wiki.softwareheritage.org/index.php?title=Glossary)): - **Content**: a specific version of a file stored in the archive, identified by its cryptographic hashes (currently: SHA1, Git-like "salted" SHA1, SHA256). Note that content objects are nameless; their names are context-dependent and stored as part of directory entries (see below).
*Also known as:* "blob" - **Directory**: a list of directory entries, where each entry can point to content objects ("file entries"), revisions ("revision entries"), or transitively to other directories ("directory entries"). All entries are associated to the local name of the entry (i.e., a relative path without any path separator) and permission metadata (e.g., chmod value or equivalent). - **Revision**: a point in time snapshot of the content of a directory, together with associated development metadata (e.g., author, timestamp, log message, etc).
*Also known as:* "commit". - **Release**: a revision that has been marked as noteworthy with a specific name (e.g., a version number), together with associated development metadata (e.g., author, timestamp, etc).
*Also known as:* "tag" - **Origin**: an Internet-based location from which a coherent set of objects (contents, revisions, releases, etc.) archived by Software Heritage has been obtained. Origins are currently identified by URLs. - **Visit**: the passage of Software Heritage on a given origin, to retrieve all source code and metadata available there at the time. A visit object stores the state of all visible branches (if any) available at the origin at visit time; each of them points to a revision object in the archive. Future visits of the same origin will create new visit objects, without removing previous ones. - **Person**: an entity referenced by a revision as either the author or the committer of the corresponding change. A person is associated to a full name and/or an email address. ### Version The current version of the API is **v1**. ### Schema API access is over HTTPS. All API endpoints are rooted at . Data is sent and received as JSON by default. Example: - from the command line: ``` shell curl -i https://archive.softwareheritage.org/api/1/stat/counters/ ``` #### Response format override The response format can be overridden using the `Accept` request header. In particular, `Accept: text/html` (that web browsers send by default) requests HTML pretty-printing, whereas `Accept: application/yaml` requests YAML-encoded responses. Example: - [/api/1/stat/counters/](/api/1/stat/counters/) - from the command line: ``` shell curl -i -H 'Accept: application/yaml' https://archive.softwareheritage.org/api/1/stat/counters/ ``` ### Parameters Some API endpoints can be tweaked by passing optional parameters. For GET requests, optional parameters can be passed as an HTTP query string. The optional parameter `fields` is accepted by all endpoints that return dictionaries and can be used to restrict the list of fields returned by the API, in case you are not interested in all of them. By default, all available fields are returned. Example: - [/api/1/stat/counters/\?fields\=content,directory,revision](/api/1/stat/counters/?fields=content,directory,revision) - from the command line: ``` shell curl https://archive.softwareheritage.org/api/1/stat/counters/?fields=content,directory,revision ``` ### Errors While API endpoints will return different kinds of errors depending on their own semantics, some error patterns are common across all endpoints. Sending malformed data, including syntactically incorrect object identifiers, will result in a `400 Bad Request` HTTP response. Example: - [/api/1/content/deadbeef/](/api/1/content/deadbeef/) (client error: "deadbeef" is too short to be a syntactically valid object identifier) - from the command line: ``` shell curl -i https://archive.softwareheritage.org/api/1/content/deadbeef/ ``` Requesting non existent resources will result in a `404 Not Found` HTTP response. Example: - [/api/1/content/0123456789abcdef0123456789abcdef01234567/](/api/1/content/0123456789abcdef0123456789abcdef01234567/) (error: no object with that identifier is available [yet?]) - from the command line: ``` shell curl -i https://archive.softwareheritage.org/api/1/content/04740277a81c5be6c16f6c9da488ca073b770d7f/ ``` +Unavailability of the underlying storage backend will result in a `503 Service +Unavailable` HTTP response. + ### Pagination Requests that might potentially return many items will be paginated. Page size is set to a default (usually: 10 items), but might be overridden with the `per_page` query parameter up to a maximum (usually: 50 items). 
Example: ``` shell curl https://archive.softwareheritage.org/api/1/origin/1/visits/?per_page=2 ``` To navigate through paginated results, a `Link` HTTP response header is available to link the current result page to the next one. Example: curl -i https://archive.softwareheritage.org/api/1/origin/1/visits/?per_page=2 | grep ^Link: Link: ; rel="next", ### Rate limiting Due to limited resource availability on the back end side, API usage is currently rate limited. Furthermore, as API usage is currently entirely anonymous (i.e., without any authentication), API "users" are currently identified by their origin IP address. Three HTTP response fields will inform you about the current state of limits that apply to your current rate limiting bucket: - `X-RateLimit-Limit`: maximum number of permitted requests per hour - `X-RateLimit-Remaining`: number of permitted requests remaining before the next reset - `X-RateLimit-Reset`: the time (expressed in [Unix time](https://en.wikipedia.org/wiki/Unix_time) seconds) at which the current rate limiting will expire, resetting to a fresh `X-RateLimit-Limit` Example: curl -i https://archive.softwareheritage.org/api/1/stat/counters/ | grep ^X-RateLimit X-RateLimit-Limit: 60 X-RateLimit-Remaining: 54 X-RateLimit-Reset: 1485794532 diff --git a/swh/web/ui/tests/views/test_api.py b/swh/web/ui/tests/views/test_api.py index c8ec722c..497d5cd8 100644 --- a/swh/web/ui/tests/views/test_api.py +++ b/swh/web/ui/tests/views/test_api.py @@ -1,2373 +1,2403 @@ # Copyright (C) 2015-2017 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information import json import unittest import yaml from nose.tools import istest from unittest.mock import patch, MagicMock from swh.web.ui.tests import test_app from swh.web.ui import exc from swh.web.ui.views import api from swh.web.ui.exc import NotFoundExc, BadInputExc from swh.storage.exc import StorageDBError, StorageAPIError class ApiTestCase(test_app.SWHApiTestCase): def setUp(self): self.origin_visit1 = { 'date': 1104616800.0, 'origin': 10, 'visit': 100, 'metadata': None, 'status': 'full', } self.origin1 = { 'id': 1234, 'lister': 'uuid-lister-0', 'project': 'uuid-project-0', 'url': 'ftp://some/url/to/origin/0', 'type': 'ftp' } @istest def generic_api_lookup_nothing_is_found(self): # given def test_generic_lookup_fn(sha1, another_unused_arg): assert another_unused_arg == 'unused arg' assert sha1 == 'sha1' return None # when with self.assertRaises(NotFoundExc) as cm: api._api_lookup('sha1', test_generic_lookup_fn, 'This will be raised because None is returned.', lambda x: x, 'unused arg') self.assertIn('This will be raised because None is returned.', cm.exception.args[0]) @istest def generic_api_map_are_enriched_and_transformed_to_list(self): # given def test_generic_lookup_fn_1(criteria0, param0, param1): assert criteria0 == 'something' return map(lambda x: x + 1, [1, 2, 3]) # when actual_result = api._api_lookup( 'something', test_generic_lookup_fn_1, 'This is not the error message you are looking for. 
Move along.', lambda x: x * 2, 'some param 0', 'some param 1') self.assertEqual(actual_result, [4, 6, 8]) @istest def generic_api_list_are_enriched_too(self): # given def test_generic_lookup_fn_2(crit): assert crit == 'something' return ['a', 'b', 'c'] # when actual_result = api._api_lookup( 'something', test_generic_lookup_fn_2, 'Not the error message you are looking for, it is. ' 'Along, you move!', lambda x: ''. join(['=', x, '='])) self.assertEqual(actual_result, ['=a=', '=b=', '=c=']) @istest def generic_api_generator_are_enriched_and_returned_as_list(self): # given def test_generic_lookup_fn_3(crit): assert crit == 'crit' return (i for i in [4, 5, 6]) # when actual_result = api._api_lookup( 'crit', test_generic_lookup_fn_3, 'Move!', lambda x: x - 1) self.assertEqual(actual_result, [3, 4, 5]) @istest def generic_api_simple_data_are_enriched_and_returned_too(self): # given def test_generic_lookup_fn_4(crit): assert crit == '123' return {'a': 10} def test_enrich_data(x): x['a'] = x['a'] * 10 return x # when actual_result = api._api_lookup( '123', test_generic_lookup_fn_4, 'Nothing to do', test_enrich_data) self.assertEqual(actual_result, {'a': 100}) @patch('swh.web.ui.views.api.service') @istest def api_content_filetype(self, mock_service): stub_filetype = { 'mimetype': 'application/xml', 'encoding': 'ascii', 'id': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', } mock_service.lookup_content_filetype.return_value = stub_filetype # when rv = self.app.get( '/api/1/content/' 'sha1_git:b04caf10e9535160d90e874b45aa426de762f19f/filetype/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { 'mimetype': 'application/xml', 'encoding': 'ascii', 'id': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'content_url': '/api/1/content/' 'sha1:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/', }) mock_service.lookup_content_filetype.assert_called_once_with( 'sha1_git:b04caf10e9535160d90e874b45aa426de762f19f') @patch('swh.web.ui.views.api.service') @istest def api_content_filetype_sha_not_found(self, mock_service): # given mock_service.lookup_content_filetype.return_value = None # when rv = self.app.get( '/api/1/content/sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03/' 'filetype/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'No filetype information found for content ' + 'exception': 'NotFoundExc', + 'reason': 'No filetype information found for content ' 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03.' 
}) mock_service.lookup_content_filetype.assert_called_once_with( 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03') @patch('swh.web.ui.views.api.service') @istest def api_content_language(self, mock_service): stub_language = { 'lang': 'lisp', 'id': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', } mock_service.lookup_content_language.return_value = stub_language # when rv = self.app.get( '/api/1/content/' 'sha1_git:b04caf10e9535160d90e874b45aa426de762f19f/language/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { 'lang': 'lisp', 'id': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'content_url': '/api/1/content/' 'sha1:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/', }) mock_service.lookup_content_language.assert_called_once_with( 'sha1_git:b04caf10e9535160d90e874b45aa426de762f19f') @patch('swh.web.ui.views.api.service') @istest def api_content_language_sha_not_found(self, mock_service): # given mock_service.lookup_content_language.return_value = None # when rv = self.app.get( '/api/1/content/sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03' '/language/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'No language information found for content ' + 'exception': 'NotFoundExc', + 'reason': 'No language information found for content ' 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03.' }) mock_service.lookup_content_language.assert_called_once_with( 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03') @patch('swh.web.ui.views.api.service') @istest def api_content_symbol(self, mock_service): stub_ctag = [{ 'sha1': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'name': 'foobar', 'kind': 'Haskell', 'line': 10, }] mock_service.lookup_expression.return_value = stub_ctag # when rv = self.app.get('/api/1/content/symbol/foo/?last_sha1=sha1') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, [{ 'sha1': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'name': 'foobar', 'kind': 'Haskell', 'line': 10, 'content_url': '/api/1/content/' 'sha1:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/', 'data_url': '/api/1/content/' 'sha1:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/raw/', 'license_url': '/api/1/content/' 'sha1:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/license/', 'language_url': '/api/1/content/' 'sha1:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/language/', 'filetype_url': '/api/1/content/' 'sha1:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/filetype/', }]) actual_headers = dict(rv.headers) self.assertFalse('Link' in actual_headers) mock_service.lookup_expression.assert_called_once_with( 'foo', 'sha1', 10) @patch('swh.web.ui.views.api.service') @istest def api_content_symbol_2(self, mock_service): stub_ctag = [{ 'sha1': '12371b8614fcd89ccd17ca2b1d9e66c5b00a6456', 'name': 'foobar', 'kind': 'Haskell', 'line': 10, }, { 'sha1': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6678', 'name': 'foo', 'kind': 'Lisp', 'line': 10, }] mock_service.lookup_expression.return_value = stub_ctag # when rv = self.app.get( '/api/1/content/symbol/foo/?last_sha1=prev-sha1&per_page=2') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, stub_ctag) 
actual_headers = dict(rv.headers) self.assertTrue( actual_headers['Link'] == '; rel="next"' or # noqa actual_headers['Link'] == '; rel="next"' # noqa ) mock_service.lookup_expression.assert_called_once_with( 'foo', 'prev-sha1', 2) @patch('swh.web.ui.views.api.service') # @istest def api_content_symbol_3(self, mock_service): stub_ctag = [{ 'sha1': '67891b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'name': 'foo', 'kind': 'variable', 'line': 100, }] mock_service.lookup_expression.return_value = stub_ctag # when rv = self.app.get('/api/1/content/symbol/foo/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, [{ 'sha1': '67891b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'name': 'foo', 'kind': 'variable', 'line': 100, 'content_url': '/api/1/content/' 'sha1:67891b8614fcd89ccd17ca2b1d9e66c5b00a6d03/', 'data_url': '/api/1/content/' 'sha1:67891b8614fcd89ccd17ca2b1d9e66c5b00a6d03/raw/', 'license_url': '/api/1/content/' 'sha1:67891b8614fcd89ccd17ca2b1d9e66c5b00a6d03/license/', 'language_url': '/api/1/content/' 'sha1:67891b8614fcd89ccd17ca2b1d9e66c5b00a6d03/language/', 'filetype_url': '/api/1/content/' 'sha1:67891b8614fcd89ccd17ca2b1d9e66c5b00a6d03/filetype/', }]) actual_headers = dict(rv.headers) self.assertEquals( actual_headers['Link'], '') mock_service.lookup_expression.assert_called_once_with('foo', None, 10) @patch('swh.web.ui.views.api.service') @istest def api_content_symbol_not_found(self, mock_service): # given mock_service.lookup_expression.return_value = [] # when rv = self.app.get('/api/1/content/symbol/bar/?last_sha1=hash') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'No indexed raw content match expression \'bar\'.' + 'exception': 'NotFoundExc', + 'reason': 'No indexed raw content match expression \'bar\'.' 
}) actual_headers = dict(rv.headers) self.assertFalse('Link' in actual_headers) mock_service.lookup_expression.assert_called_once_with( 'bar', 'hash', 10) @patch('swh.web.ui.views.api.service') @istest def api_content_ctags(self, mock_service): stub_ctags = { 'id': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'ctags': [] } mock_service.lookup_content_ctags.return_value = stub_ctags # when rv = self.app.get( '/api/1/content/' 'sha1_git:b04caf10e9535160d90e874b45aa426de762f19f/ctags/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { 'id': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'ctags': [], 'content_url': '/api/1/content/' 'sha1:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/', }) mock_service.lookup_content_ctags.assert_called_once_with( 'sha1_git:b04caf10e9535160d90e874b45aa426de762f19f') @patch('swh.web.ui.views.api.service') @istest def api_content_license(self, mock_service): stub_license = { 'licenses': ['No_license_found', 'Apache-2.0'], 'id': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'tool_name': 'nomos', } mock_service.lookup_content_license.return_value = stub_license # when rv = self.app.get( '/api/1/content/' 'sha1_git:b04caf10e9535160d90e874b45aa426de762f19f/license/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { 'licenses': ['No_license_found', 'Apache-2.0'], 'id': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'tool_name': 'nomos', 'content_url': '/api/1/content/' 'sha1:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/', }) mock_service.lookup_content_license.assert_called_once_with( 'sha1_git:b04caf10e9535160d90e874b45aa426de762f19f') @patch('swh.web.ui.views.api.service') @istest def api_content_license_sha_not_found(self, mock_service): # given mock_service.lookup_content_license.return_value = None # when rv = self.app.get( '/api/1/content/sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03/' 'license/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'No license information found for content ' + 'exception': 'NotFoundExc', + 'reason': 'No license information found for content ' 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03.' 
}) mock_service.lookup_content_license.assert_called_once_with( 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03') @patch('swh.web.ui.views.api.service') @istest def api_content_provenance(self, mock_service): stub_provenances = [{ 'origin': 1, 'visit': 2, 'revision': 'b04caf10e9535160d90e874b45aa426de762f19f', 'content': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'path': 'octavio-3.4.0/octave.html/doc_002dS_005fISREG.html' }] mock_service.lookup_content_provenance.return_value = stub_provenances # when rv = self.app.get( '/api/1/content/' 'sha1_git:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/provenance/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, [{ 'origin': 1, 'visit': 2, 'origin_url': '/api/1/origin/1/', 'origin_visits_url': '/api/1/origin/1/visits/', 'origin_visit_url': '/api/1/origin/1/visit/2/', 'revision': 'b04caf10e9535160d90e874b45aa426de762f19f', 'revision_url': '/api/1/revision/' 'b04caf10e9535160d90e874b45aa426de762f19f/', 'content': '34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'content_url': '/api/1/content/' 'sha1_git:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03/', 'path': 'octavio-3.4.0/octave.html/doc_002dS_005fISREG.html' }]) mock_service.lookup_content_provenance.assert_called_once_with( 'sha1_git:34571b8614fcd89ccd17ca2b1d9e66c5b00a6d03') @patch('swh.web.ui.views.api.service') @istest def api_content_provenance_sha_not_found(self, mock_service): # given mock_service.lookup_content_provenance.return_value = None # when rv = self.app.get( '/api/1/content/sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03/' 'provenance/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Content with sha1:40e71b8614fcd89ccd17ca2b1d9e6' + 'exception': 'NotFoundExc', + 'reason': 'Content with sha1:40e71b8614fcd89ccd17ca2b1d9e6' '6c5b00a6d03 not found.' 
}) mock_service.lookup_content_provenance.assert_called_once_with( 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03') @patch('swh.web.ui.views.api.service') @istest def api_content_metadata(self, mock_service): # given mock_service.lookup_content.return_value = { 'sha1': '40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'sha1_git': 'b4e8f472ffcb01a03875b26e462eb568739f6882', 'sha256': '83c0e67cc80f60caf1fcbec2d84b0ccd7968b3be4735637006560' 'cde9b067a4f', 'length': 17, 'status': 'visible' } # when rv = self.app.get( '/api/1/content/sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03/') self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { 'data_url': '/api/1/content/' 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03/raw/', 'filetype_url': '/api/1/content/' 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03/filetype/', 'language_url': '/api/1/content/' 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03/language/', 'license_url': '/api/1/content/' 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03/license/', 'sha1': '40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03', 'sha1_git': 'b4e8f472ffcb01a03875b26e462eb568739f6882', 'sha256': '83c0e67cc80f60caf1fcbec2d84b0ccd7968b3be4735637006560c' 'de9b067a4f', 'length': 17, 'status': 'visible' }) mock_service.lookup_content.assert_called_once_with( 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03') @patch('swh.web.ui.views.api.service') @istest def api_content_not_found_as_json(self, mock_service): # given mock_service.lookup_content.return_value = None mock_service.lookup_content_provenance = MagicMock() # when rv = self.app.get( '/api/1/content/sha256:83c0e67cc80f60caf1fcbec2d84b0ccd7968b3' 'be4735637006560c/') self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Content with sha256:83c0e67cc80f60caf1fcbec2d84b0ccd79' + 'exception': 'NotFoundExc', + 'reason': 'Content with sha256:83c0e67cc80f60caf1fcbec2d84b0ccd79' '68b3be4735637006560c not found.' }) mock_service.lookup_content.assert_called_once_with( 'sha256:83c0e67cc80f60caf1fcbec2d84b0ccd7968b3' 'be4735637006560c') mock_service.lookup_content_provenance.called = False @patch('swh.web.ui.views.api.service') @istest def api_content_not_found_as_yaml(self, mock_service): # given mock_service.lookup_content.return_value = None mock_service.lookup_content_provenance = MagicMock() # when rv = self.app.get( '/api/1/content/sha256:83c0e67cc80f60caf1fcbec2d84b0ccd7968b3' 'be4735637006560c/', headers={'accept': 'application/yaml'}) self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/yaml') response_data = yaml.load(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Content with sha256:83c0e67cc80f60caf1fcbec2d84b0ccd79' + 'exception': 'NotFoundExc', + 'reason': 'Content with sha256:83c0e67cc80f60caf1fcbec2d84b0ccd79' '68b3be4735637006560c not found.' 
}) mock_service.lookup_content.assert_called_once_with( 'sha256:83c0e67cc80f60caf1fcbec2d84b0ccd7968b3' 'be4735637006560c') mock_service.lookup_content_provenance.called = False @patch('swh.web.ui.views.api.service') @istest def api_content_raw_ko_not_found(self, mock_service): # given mock_service.lookup_content_raw.return_value = None # when rv = self.app.get( '/api/1/content/sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03' '/raw/') self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Content with sha1:40e71b8614fcd89ccd17ca2b1d9e6' + 'exception': 'NotFoundExc', + 'reason': 'Content with sha1:40e71b8614fcd89ccd17ca2b1d9e6' '6c5b00a6d03 not found.' }) mock_service.lookup_content_raw.assert_called_once_with( 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03') @patch('swh.web.ui.views.api.service') @istest def api_content_raw(self, mock_service): # given stub_content = {'data': b'some content data'} mock_service.lookup_content_raw.return_value = stub_content # when rv = self.app.get( '/api/1/content/sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03' '/raw/', headers={'Content-type': 'application/octet-stream', 'Content-disposition': 'attachment'}) self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/octet-stream') self.assertEquals(rv.data, stub_content['data']) mock_service.lookup_content_raw.assert_called_once_with( 'sha1:40e71b8614fcd89ccd17ca2b1d9e66c5b00a6d03') @patch('swh.web.ui.views.api.service') @istest def api_check_content_known(self, mock_service): # given mock_service.lookup_multiple_hashes.return_value = [ {'found': True, 'filename': None, 'sha1': 'sha1:blah'} ] expected_result = { 'search_stats': {'nbfiles': 1, 'pct': 100}, 'search_res': [{'sha1': 'sha1:blah', 'found': True}] } # when rv = self.app.get('/api/1/content/known/sha1:blah/') self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_result) mock_service.lookup_multiple_hashes.assert_called_once_with( [{'filename': None, 'sha1': 'sha1:blah'}]) @patch('swh.web.ui.views.api.service') @istest def api_check_content_known_as_yaml(self, mock_service): # given mock_service.lookup_multiple_hashes.return_value = [ {'found': True, 'filename': None, 'sha1': 'sha1:halb'}, {'found': False, 'filename': None, 'sha1': 'sha1_git:hello'} ] expected_result = { 'search_stats': {'nbfiles': 2, 'pct': 50}, 'search_res': [{'sha1': 'sha1:halb', 'found': True}, {'sha1': 'sha1_git:hello', 'found': False}] } # when rv = self.app.get('/api/1/content/known/sha1:halb,sha1_git:hello/', headers={'Accept': 'application/yaml'}) self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/yaml') response_data = yaml.load(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_result) mock_service.lookup_multiple_hashes.assert_called_once_with( [{'filename': None, 'sha1': 'sha1:halb'}, {'filename': None, 'sha1': 'sha1_git:hello'}]) @patch('swh.web.ui.views.api.service') @istest def api_check_content_known_post_as_yaml(self, mock_service): # given stub_result = [{'sha1': '7e62b1fe10c88a3eddbba930b156bee2956b2435', 'found': True}, {'filename': 'filepath', 'sha1': '8e62b1fe10c88a3eddbba930b156bee2956b2435', 'found': True}, {'filename': 'filename', 'sha1': '64025b5d1520c615061842a6ce6a456cad962a3f', 'found': False}] 
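The /api/1/content/known/ tests in this file exercise the two request styles the endpoint accepts: a GET with one or several hashes separated by ',' and a POST form whose values are hashes (a key other than 'q' is kept as a filename). A small client-side sketch of both styles, reusing hashes from the tests; the base URL is a placeholder, not a real deployment:

    import requests

    BASE = 'https://swh.example.org'  # placeholder deployment URL, for illustration only

    # GET: one or several "[algo:]hash" values separated by ','
    r = requests.get(BASE + '/api/1/content/known/'
                     'sha1:7e62b1fe10c88a3eddbba930b156bee2956b2435,'
                     'sha1_git:64025b5d1520c615061842a6ce6a456cad962a3f/')
    print(r.json()['search_stats'])

    # POST: hashes passed as form values; keys other than 'q' are reported back as filenames.
    r = requests.post(BASE + '/api/1/content/known/',
                      data={'q': '7e62b1fe10c88a3eddbba930b156bee2956b2435',
                            'filename': '64025b5d1520c615061842a6ce6a456cad962a3f'})
    print(r.json()['search_res'])
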
mock_service.lookup_multiple_hashes.return_value = stub_result expected_result = { 'search_stats': {'nbfiles': 3, 'pct': 2/3 * 100}, 'search_res': stub_result } # when rv = self.app.post( '/api/1/content/known/', headers={'Accept': 'application/yaml'}, data=dict( q='7e62b1fe10c88a3eddbba930b156bee2956b2435', filepath='8e62b1fe10c88a3eddbba930b156bee2956b2435', filename='64025b5d1520c615061842a6ce6a456cad962a3f') ) self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/yaml') response_data = yaml.load(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_result) @patch('swh.web.ui.views.api.service') @istest def api_check_content_known_not_found(self, mock_service): # given stub_result = [{'sha1': 'sha1:halb', 'found': False}] mock_service.lookup_multiple_hashes.return_value = stub_result expected_result = { 'search_stats': {'nbfiles': 1, 'pct': 0.0}, 'search_res': stub_result } # when rv = self.app.get('/api/1/content/known/sha1:halb/') self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_result) mock_service.lookup_multiple_hashes.assert_called_once_with( [{'filename': None, 'sha1': 'sha1:halb'}]) @patch('swh.web.ui.views.api.service') @istest def api_1_stat_counters_raise_error(self, mock_service): # given mock_service.stat_counters.side_effect = ValueError( 'voluntary error to check the bad request middleware.') # when rv = self.app.get('/api/1/stat/counters/') # then self.assertEquals(rv.status_code, 400) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'voluntary error to check the bad request middleware.'}) + 'exception': 'ValueError', + 'reason': 'voluntary error to check the bad request middleware.'}) @patch('swh.web.ui.views.api.service') @istest def api_1_stat_counters_raise_swh_storage_error_db(self, mock_service): # given mock_service.stat_counters.side_effect = StorageDBError( 'SWH Storage exploded! Will be back online shortly!') # when rv = self.app.get('/api/1/stat/counters/') # then self.assertEquals(rv.status_code, 503) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': + 'exception': 'StorageDBError', + 'reason': 'An unexpected error occurred in the backend: ' 'SWH Storage exploded! Will be back online shortly!'}) @patch('swh.web.ui.views.api.service') @istest def api_1_stat_counters_raise_swh_storage_error_api(self, mock_service): # given mock_service.stat_counters.side_effect = StorageAPIError( 'SWH Storage API dropped dead! Will resurrect from its ashes asap!' ) # when rv = self.app.get('/api/1/stat/counters/') # then self.assertEquals(rv.status_code, 503) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': + 'exception': 'StorageAPIError', + 'reason': 'An unexpected error occurred in the api backend: ' 'SWH Storage API dropped dead! Will resurrect from its ashes asap!' 
}) @patch('swh.web.ui.views.api.service') @istest def api_1_stat_counters(self, mock_service): # given stub_stats = { "content": 1770830, "directory": 211683, "directory_entry_dir": 209167, "directory_entry_file": 1807094, "directory_entry_rev": 0, "entity": 0, "entity_history": 0, "occurrence": 0, "occurrence_history": 19600, "origin": 1096, "person": 0, "release": 8584, "revision": 7792, "revision_history": 0, "skipped_content": 0 } mock_service.stat_counters.return_value = stub_stats # when rv = self.app.get('/api/1/stat/counters/') self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, stub_stats) mock_service.stat_counters.assert_called_once_with() @patch('swh.web.ui.views.api.service') @istest def api_1_lookup_origin_visits_raise_error(self, mock_service): # given mock_service.lookup_origin_visits.side_effect = ValueError( 'voluntary error to check the bad request middleware.') # when rv = self.app.get('/api/1/origin/2/visits/') # then self.assertEquals(rv.status_code, 400) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'voluntary error to check the bad request middleware.'}) + 'exception': 'ValueError', + 'reason': 'voluntary error to check the bad request middleware.'}) @patch('swh.web.ui.views.api.service') @istest def api_1_lookup_origin_visits_raise_swh_storage_error_db( self, mock_service): # given mock_service.lookup_origin_visits.side_effect = StorageDBError( 'SWH Storage exploded! Will be back online shortly!') # when rv = self.app.get('/api/1/origin/2/visits/') # then self.assertEquals(rv.status_code, 503) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': + 'exception': 'StorageDBError', + 'reason': 'An unexpected error occurred in the backend: ' 'SWH Storage exploded! Will be back online shortly!'}) @patch('swh.web.ui.views.api.service') @istest def api_1_lookup_origin_visits_raise_swh_storage_error_api( self, mock_service): # given mock_service.lookup_origin_visits.side_effect = StorageAPIError( 'SWH Storage API dropped dead! Will resurrect from its ashes asap!' ) # when rv = self.app.get('/api/1/origin/2/visits/') # then self.assertEquals(rv.status_code, 503) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': + 'exception': 'StorageAPIError', + 'reason': 'An unexpected error occurred in the api backend: ' 'SWH Storage API dropped dead! Will resurrect from its ashes asap!' 
}) @patch('swh.web.ui.views.api.service') @istest def api_1_lookup_origin_visits(self, mock_service): # given stub_visits = [ { 'date': 1293919200.0, 'origin': 1, 'visit': 2 }, { 'date': 1420149600.0, 'origin': 1, 'visit': 3 } ] mock_service.lookup_origin_visits.return_value = stub_visits # when rv = self.app.get('/api/1/origin/2/visits/?per_page=2&last_visit=1') self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, [ { 'date': 1293919200.0, 'origin': 1, 'visit': 2, 'origin_visit_url': '/api/1/origin/1/visit/2/', }, { 'date': 1420149600.0, 'origin': 1, 'visit': 3, 'origin_visit_url': '/api/1/origin/1/visit/3/', } ]) mock_service.lookup_origin_visits.assert_called_once_with( 2, last_visit=1, per_page=2) @patch('swh.web.ui.views.api.service') @istest def api_1_lookup_origin_visit(self, mock_service): # given origin_visit = self.origin_visit1.copy() origin_visit.update({ 'occurrences': { 'master': { 'target_type': 'revision', 'target': 'revision-id', } } }) mock_service.lookup_origin_visit.return_value = origin_visit expected_origin_visit = self.origin_visit1.copy() expected_origin_visit.update({ 'origin_url': '/api/1/origin/10/', 'occurrences': { 'master': { 'target_type': 'revision', 'target': 'revision-id', 'target_url': '/api/1/revision/revision-id/' } } }) # when rv = self.app.get('/api/1/origin/10/visit/100/') self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_origin_visit) mock_service.lookup_origin_visit.assert_called_once_with(10, 100) @patch('swh.web.ui.views.api.service') @istest def api_1_lookup_origin_visit_not_found(self, mock_service): # given mock_service.lookup_origin_visit.return_value = None # when rv = self.app.get('/api/1/origin/1/visit/1000/') self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'No visit 1000 for origin 1 found' + 'exception': 'NotFoundExc', + 'reason': 'No visit 1000 for origin 1 found' }) mock_service.lookup_origin_visit.assert_called_once_with(1, 1000) @patch('swh.web.ui.views.api.service') @istest def api_origin_by_id(self, mock_service): # given mock_service.lookup_origin.return_value = self.origin1 expected_origin = self.origin1.copy() expected_origin.update({ 'origin_visits_url': '/api/1/origin/1234/visits/' }) # when rv = self.app.get('/api/1/origin/1234/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_origin) mock_service.lookup_origin.assert_called_with({'id': 1234}) @patch('swh.web.ui.views.api.service') @istest def api_origin_by_type_url(self, mock_service): # given stub_origin = self.origin1.copy() stub_origin.update({ 'id': 987 }) mock_service.lookup_origin.return_value = stub_origin expected_origin = stub_origin.copy() expected_origin.update({ 'origin_visits_url': '/api/1/origin/987/visits/' }) # when rv = self.app.get('/api/1/origin/ftp/url/ftp://some/url/to/origin/0/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_origin) mock_service.lookup_origin.assert_called_with( 
{'url': 'ftp://some/url/to/origin/0/', 'type': 'ftp'}) @patch('swh.web.ui.views.api.service') @istest def api_origin_not_found(self, mock_service): # given mock_service.lookup_origin.return_value = None # when rv = self.app.get('/api/1/origin/4321/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Origin with id 4321 not found.' + 'exception': 'NotFoundExc', + 'reason': 'Origin with id 4321 not found.' }) mock_service.lookup_origin.assert_called_with({'id': 4321}) @patch('swh.web.ui.views.api.service') @istest def api_release(self, mock_service): # given stub_release = { 'id': 'release-0', 'target_type': 'revision', 'target': 'revision-sha1', "date": "Mon, 10 Mar 1997 08:00:00 GMT", "synthetic": True, 'author': { 'name': 'author release name', 'email': 'author@email', }, } expected_release = { 'id': 'release-0', 'target_type': 'revision', 'target': 'revision-sha1', 'target_url': '/api/1/revision/revision-sha1/', "date": "Mon, 10 Mar 1997 08:00:00 GMT", "synthetic": True, 'author': { 'name': 'author release name', 'email': 'author@email', }, } mock_service.lookup_release.return_value = stub_release # when rv = self.app.get('/api/1/release/release-0/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_release) mock_service.lookup_release.assert_called_once_with('release-0') @patch('swh.web.ui.views.api.service') @istest def api_release_target_type_not_a_revision(self, mock_service): # given stub_release = { 'id': 'release-0', 'target_type': 'other-stuff', 'target': 'other-stuff-checksum', "date": "Mon, 10 Mar 1997 08:00:00 GMT", "synthetic": True, 'author': { 'name': 'author release name', 'email': 'author@email', }, } expected_release = { 'id': 'release-0', 'target_type': 'other-stuff', 'target': 'other-stuff-checksum', "date": "Mon, 10 Mar 1997 08:00:00 GMT", "synthetic": True, 'author': { 'name': 'author release name', 'email': 'author@email', }, } mock_service.lookup_release.return_value = stub_release # when rv = self.app.get('/api/1/release/release-0/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_release) mock_service.lookup_release.assert_called_once_with('release-0') @patch('swh.web.ui.views.api.service') @istest def api_release_not_found(self, mock_service): # given mock_service.lookup_release.return_value = None # when rv = self.app.get('/api/1/release/release-0/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Release with sha1_git release-0 not found.' + 'exception': 'NotFoundExc', + 'reason': 'Release with sha1_git release-0 not found.' 
}) @patch('swh.web.ui.views.api.service') @istest def api_revision(self, mock_service): # given stub_revision = { 'id': '18d8be353ed3480476f032475e7c233eff7371d5', 'directory': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'author_name': 'Software Heritage', 'author_email': 'robot@softwareheritage.org', 'committer_name': 'Software Heritage', 'committer_email': 'robot@softwareheritage.org', 'message': 'synthetic revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': ['8734ef7e7c357ce2af928115c6c6a42b7e2a44e7'], 'type': 'tar', 'synthetic': True, 'metadata': { 'original_artifact': [{ 'archive_type': 'tar', 'name': 'webbase-5.7.0.tar.gz', 'sha1': '147f73f369733d088b7a6fa9c4e0273dcd3c7ccd', 'sha1_git': '6a15ea8b881069adedf11feceec35588f2cfe8f1', 'sha256': '401d0df797110bea805d358b85bcc1ced29549d3d73f' '309d36484e7edf7bb912' }] }, } mock_service.lookup_revision.return_value = stub_revision expected_revision = { 'id': '18d8be353ed3480476f032475e7c233eff7371d5', 'url': '/api/1/revision/18d8be353ed3480476f032475e7c233eff7371d5/', 'history_url': '/api/1/revision/18d8be353ed3480476f032475e7c233e' 'ff7371d5/log/', 'directory': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'directory_url': '/api/1/directory/7834ef7e7c357ce2af928115c6c6' 'a42b7e2a44e6/', 'author_name': 'Software Heritage', 'author_email': 'robot@softwareheritage.org', 'committer_name': 'Software Heritage', 'committer_email': 'robot@softwareheritage.org', 'message': 'synthetic revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': [ '8734ef7e7c357ce2af928115c6c6a42b7e2a44e7' ], 'parent_urls': [ '/api/1/revision/8734ef7e7c357ce2af928115c6c6a42b7e2a44e7/' ], 'type': 'tar', 'synthetic': True, 'metadata': { 'original_artifact': [{ 'archive_type': 'tar', 'name': 'webbase-5.7.0.tar.gz', 'sha1': '147f73f369733d088b7a6fa9c4e0273dcd3c7ccd', 'sha1_git': '6a15ea8b881069adedf11feceec35588f2cfe8f1', 'sha256': '401d0df797110bea805d358b85bcc1ced29549d3d73f' '309d36484e7edf7bb912' }] }, } # when rv = self.app.get('/api/1/revision/' '18d8be353ed3480476f032475e7c233eff7371d5/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(expected_revision, response_data) mock_service.lookup_revision.assert_called_once_with( '18d8be353ed3480476f032475e7c233eff7371d5') @patch('swh.web.ui.views.api.service') @istest def api_revision_not_found(self, mock_service): # given mock_service.lookup_revision.return_value = None # when rv = self.app.get('/api/1/revision/revision-0/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Revision with sha1_git revision-0 not found.'}) + 'exception': 'NotFoundExc', + 'reason': 'Revision with sha1_git revision-0 not found.'}) @patch('swh.web.ui.views.api.service') @istest def api_revision_raw_ok(self, mock_service): # given stub_revision = {'message': 'synthetic revision message'} mock_service.lookup_revision_message.return_value = stub_revision # when rv = self.app.get('/api/1/revision/18d8be353ed3480476f032475e7c2' '33eff7371d5/raw/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/octet-stream') self.assertEquals(rv.data, b'synthetic revision message') mock_service.lookup_revision_message.assert_called_once_with( '18d8be353ed3480476f032475e7c233eff7371d5') @patch('swh.web.ui.views.api.service') 
@istest def api_revision_raw_ok_no_msg(self, mock_service): # given mock_service.lookup_revision_message.side_effect = NotFoundExc( 'No message for revision') # when rv = self.app.get('/api/1/revision/' '18d8be353ed3480476f032475e7c233eff7371d5/raw/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'No message for revision'}) + 'exception': 'NotFoundExc', + 'reason': 'No message for revision'}) self.assertEquals mock_service.lookup_revision_message.assert_called_once_with( '18d8be353ed3480476f032475e7c233eff7371d5') @patch('swh.web.ui.views.api.service') @istest def api_revision_raw_ko_no_rev(self, mock_service): # given mock_service.lookup_revision_message.side_effect = NotFoundExc( 'No revision found') # when rv = self.app.get('/api/1/revision/' '18d8be353ed3480476f032475e7c233eff7371d5/raw/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'No revision found'}) + 'exception': 'NotFoundExc', + 'reason': 'No revision found'}) mock_service.lookup_revision_message.assert_called_once_with( '18d8be353ed3480476f032475e7c233eff7371d5') @patch('swh.web.ui.views.api.service') @istest def api_revision_with_origin_not_found(self, mock_service): mock_service.lookup_revision_by.return_value = None rv = self.app.get('/api/1/revision/origin/123/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) - self.assertIn('Revision with (origin_id: 123', response_data['error']) - self.assertIn('not found', response_data['error']) + self.assertIn('Revision with (origin_id: 123', response_data['reason']) + self.assertIn('not found', response_data['reason']) + self.assertEqual('NotFoundExc', response_data['exception']) mock_service.lookup_revision_by.assert_called_once_with( 123, 'refs/heads/master', None) @patch('swh.web.ui.views.api.service') @istest def api_revision_with_origin(self, mock_service): mock_revision = { 'id': '32', 'directory': '21', 'message': 'message 1', 'type': 'deb', } expected_revision = { 'id': '32', 'url': '/api/1/revision/32/', 'history_url': '/api/1/revision/32/log/', 'directory': '21', 'directory_url': '/api/1/directory/21/', 'message': 'message 1', 'type': 'deb', } mock_service.lookup_revision_by.return_value = mock_revision rv = self.app.get('/api/1/revision/origin/1/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEqual(response_data, expected_revision) mock_service.lookup_revision_by.assert_called_once_with( 1, 'refs/heads/master', None) @patch('swh.web.ui.views.api.service') @istest def api_revision_with_origin_and_branch_name(self, mock_service): mock_revision = { 'id': '12', 'directory': '23', 'message': 'message 2', 'type': 'tar', } mock_service.lookup_revision_by.return_value = mock_revision expected_revision = { 'id': '12', 'url': '/api/1/revision/12/', 'history_url': '/api/1/revision/12/log/', 'directory': '23', 'directory_url': '/api/1/directory/23/', 'message': 'message 2', 'type': 'tar', } rv = self.app.get('/api/1/revision/origin/1/branch/refs/origin/dev/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = 
json.loads(rv.data.decode('utf-8')) self.assertEqual(response_data, expected_revision) mock_service.lookup_revision_by.assert_called_once_with( 1, 'refs/origin/dev', None) @patch('swh.web.ui.views.api.service') @patch('swh.web.ui.views.api.utils') @istest def api_revision_with_origin_and_branch_name_and_timestamp(self, mock_utils, mock_service): mock_revision = { 'id': '123', 'directory': '456', 'message': 'message 3', 'type': 'tar', } mock_service.lookup_revision_by.return_value = mock_revision expected_revision = { 'id': '123', 'url': '/api/1/revision/123/', 'history_url': '/api/1/revision/123/log/', 'directory': '456', 'directory_url': '/api/1/directory/456/', 'message': 'message 3', 'type': 'tar', } mock_utils.parse_timestamp.return_value = 'parsed-date' mock_utils.enrich_revision.return_value = expected_revision rv = self.app.get('/api/1/revision' '/origin/1' '/branch/refs/origin/dev' '/ts/1452591542/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEqual(response_data, expected_revision) mock_service.lookup_revision_by.assert_called_once_with( 1, 'refs/origin/dev', 'parsed-date') mock_utils.parse_timestamp.assert_called_once_with('1452591542') mock_utils.enrich_revision.assert_called_once_with( mock_revision) @patch('swh.web.ui.views.api.service') @patch('swh.web.ui.views.api.utils') @istest def api_revision_with_origin_and_branch_name_and_timestamp_with_escapes( self, mock_utils, mock_service): mock_revision = { 'id': '999', } mock_service.lookup_revision_by.return_value = mock_revision expected_revision = { 'id': '999', 'url': '/api/1/revision/999/', 'history_url': '/api/1/revision/999/log/', } mock_utils.parse_timestamp.return_value = 'parsed-date' mock_utils.enrich_revision.return_value = expected_revision rv = self.app.get('/api/1/revision' '/origin/1' '/branch/refs%2Forigin%2Fdev' '/ts/Today%20is%20' 'January%201,%202047%20at%208:21:00AM/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEqual(response_data, expected_revision) mock_service.lookup_revision_by.assert_called_once_with( 1, 'refs/origin/dev', 'parsed-date') mock_utils.parse_timestamp.assert_called_once_with( 'Today is January 1, 2047 at 8:21:00AM') mock_utils.enrich_revision.assert_called_once_with( mock_revision) @patch('swh.web.ui.views.api.service') @istest def revision_directory_by_ko_raise(self, mock_service): # given mock_service.lookup_directory_through_revision.side_effect = NotFoundExc('not') # noqa # when with self.assertRaises(NotFoundExc): api._revision_directory_by( {'sha1_git': 'id'}, None, '/api/1/revision/sha1/directory/') # then mock_service.lookup_directory_through_revision.assert_called_once_with( {'sha1_git': 'id'}, None, limit=100, with_data=False) @patch('swh.web.ui.views.api.service') @istest def revision_directory_by_type_dir(self, mock_service): # given mock_service.lookup_directory_through_revision.return_value = ( 'rev-id', { 'type': 'dir', 'revision': 'rev-id', 'path': 'some/path', 'content': [] }) # when actual_dir_content = api._revision_directory_by( {'sha1_git': 'blah-id'}, 'some/path', '/api/1/revision/sha1/directory/') # then self.assertEquals(actual_dir_content, { 'type': 'dir', 'revision': 'rev-id', 'path': 'some/path', 'content': [] }) mock_service.lookup_directory_through_revision.assert_called_once_with( {'sha1_git': 'blah-id'}, 'some/path', limit=100, 
with_data=False) @patch('swh.web.ui.views.api.service') @istest def revision_directory_by_type_file(self, mock_service): # given mock_service.lookup_directory_through_revision.return_value = ( 'rev-id', { 'type': 'file', 'revision': 'rev-id', 'path': 'some/path', 'content': {'blah': 'blah'} }) # when actual_dir_content = api._revision_directory_by( {'sha1_git': 'sha1'}, 'some/path', '/api/1/revision/origin/2/directory/', limit=1000, with_data=True) # then self.assertEquals(actual_dir_content, { 'type': 'file', 'revision': 'rev-id', 'path': 'some/path', 'content': {'blah': 'blah'} }) mock_service.lookup_directory_through_revision.assert_called_once_with( {'sha1_git': 'sha1'}, 'some/path', limit=1000, with_data=True) @patch('swh.web.ui.views.api.utils') @patch('swh.web.ui.views.api._revision_directory_by') @istest def api_directory_through_revision_origin_ko_not_found(self, mock_rev_dir, mock_utils): mock_rev_dir.side_effect = NotFoundExc('not found') mock_utils.parse_timestamp.return_value = '2012-10-20 00:00:00' rv = self.app.get('/api/1/revision' '/origin/10' '/branch/refs/remote/origin/dev' '/ts/2012-10-20' '/directory/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEqual(response_data, { - 'error': 'not found'}) + 'exception': 'NotFoundExc', + 'reason': 'not found'}) mock_rev_dir.assert_called_once_with( {'origin_id': 10, 'branch_name': 'refs/remote/origin/dev', 'ts': '2012-10-20 00:00:00'}, None, '/api/1/revision' '/origin/10' '/branch/refs/remote/origin/dev' '/ts/2012-10-20' '/directory/', with_data=False) @patch('swh.web.ui.views.api._revision_directory_by') @istest def api_directory_through_revision_origin(self, mock_revision_dir): expected_res = [{ 'id': '123' }] mock_revision_dir.return_value = expected_res rv = self.app.get('/api/1/revision/origin/3/directory/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEqual(response_data, expected_res) mock_revision_dir.assert_called_once_with({ 'origin_id': 3, 'branch_name': 'refs/heads/master', 'ts': None}, None, '/api/1/revision/origin/3/directory/', with_data=False) @patch('swh.web.ui.views.api.service') @istest def api_revision_log(self, mock_service): # given stub_revisions = [{ 'id': '18d8be353ed3480476f032475e7c233eff7371d5', 'directory': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'author_name': 'Software Heritage', 'author_email': 'robot@softwareheritage.org', 'committer_name': 'Software Heritage', 'committer_email': 'robot@softwareheritage.org', 'message': 'synthetic revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': ['7834ef7e7c357ce2af928115c6c6a42b7e2a4345'], 'type': 'tar', 'synthetic': True, }] mock_service.lookup_revision_log.return_value = stub_revisions expected_revisions = [{ 'id': '18d8be353ed3480476f032475e7c233eff7371d5', 'url': '/api/1/revision/18d8be353ed3480476f032475e7c233eff7371d5/', 'history_url': '/api/1/revision/18d8be353ed3480476f032475e7c233ef' 'f7371d5/log/', 'directory': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'directory_url': '/api/1/directory/7834ef7e7c357ce2af928115c6c6a' '42b7e2a44e6/', 'author_name': 'Software Heritage', 'author_email': 'robot@softwareheritage.org', 'committer_name': 'Software Heritage', 'committer_email': 'robot@softwareheritage.org', 'message': 'synthetic revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': 
[ '7834ef7e7c357ce2af928115c6c6a42b7e2a4345' ], 'parent_urls': [ '/api/1/revision/7834ef7e7c357ce2af928115c6c6a42b7e2a4345/' ], 'type': 'tar', 'synthetic': True, }] # when rv = self.app.get('/api/1/revision/8834ef7e7c357ce2af928115c6c6a42' 'b7e2a44e6/log/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_revisions) self.assertIsNone(rv.headers.get('Link')) mock_service.lookup_revision_log.assert_called_once_with( '8834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 11) @patch('swh.web.ui.views.api.service') @istest def api_revision_log_with_next(self, mock_service): # given stub_revisions = [] for i in range(27): stub_revisions.append({'id': i}) mock_service.lookup_revision_log.return_value = stub_revisions[:26] expected_revisions = [x for x in stub_revisions if x['id'] < 25] for e in expected_revisions: e['url'] = '/api/1/revision/%s/' % e['id'] e['history_url'] = '/api/1/revision/%s/log/' % e['id'] # when rv = self.app.get('/api/1/revision/8834ef7e7c357ce2af928115c6c6a42' 'b7e2a44e6/log/?per_page=25') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_revisions) self.assertEquals(rv.headers['Link'], '; rel="next"') mock_service.lookup_revision_log.assert_called_once_with( '8834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 26) @patch('swh.web.ui.views.api.service') @istest def api_revision_log_not_found(self, mock_service): # given mock_service.lookup_revision_log.return_value = None # when rv = self.app.get('/api/1/revision/8834ef7e7c357ce2af928115c6c6a42b7' 'e2a44e6/log/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Revision with sha1_git' + 'exception': 'NotFoundExc', + 'reason': 'Revision with sha1_git' ' 8834ef7e7c357ce2af928115c6c6a42b7e2a44e6 not found.'}) self.assertIsNone(rv.headers.get('Link')) mock_service.lookup_revision_log.assert_called_once_with( '8834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 11) @patch('swh.web.ui.views.api.service') @istest def api_revision_log_context(self, mock_service): # given stub_revisions = [{ 'id': '18d8be353ed3480476f032475e7c233eff7371d5', 'directory': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'author_name': 'Software Heritage', 'author_email': 'robot@softwareheritage.org', 'committer_name': 'Software Heritage', 'committer_email': 'robot@softwareheritage.org', 'message': 'synthetic revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': ['7834ef7e7c357ce2af928115c6c6a42b7e2a4345'], 'type': 'tar', 'synthetic': True, }] mock_service.lookup_revision_log.return_value = stub_revisions mock_service.lookup_revision_multiple.return_value = [{ 'id': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'directory': '18d8be353ed3480476f032475e7c233eff7371d5', 'author_name': 'Name Surname', 'author_email': 'name@surname.com', 'committer_name': 'Name Surname', 'committer_email': 'name@surname.com', 'message': 'amazing revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': ['adc83b19e793491b1c6ea0fd8b46cd9f32e592fc'], 'type': 'tar', 'synthetic': True, }] expected_revisions = [ { 'url': '/api/1/revision/' '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6/', 'history_url': '/api/1/revision/' 
'7834ef7e7c357ce2af928115c6c6a42b7e2a44e6/log/', 'id': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'directory': '18d8be353ed3480476f032475e7c233eff7371d5', 'directory_url': '/api/1/directory/' '18d8be353ed3480476f032475e7c233eff7371d5/', 'author_name': 'Name Surname', 'author_email': 'name@surname.com', 'committer_name': 'Name Surname', 'committer_email': 'name@surname.com', 'message': 'amazing revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': ['adc83b19e793491b1c6ea0fd8b46cd9f32e592fc'], 'parent_urls': [ '/api/1/revision/adc83b19e793491b1c6ea0fd8b46cd9f32e592fc/' ], 'type': 'tar', 'synthetic': True, }, { 'url': '/api/1/revision/' '18d8be353ed3480476f032475e7c233eff7371d5/', 'history_url': '/api/1/revision/' '18d8be353ed3480476f032475e7c233eff7371d5/log/', 'id': '18d8be353ed3480476f032475e7c233eff7371d5', 'directory': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'directory_url': '/api/1/directory/' '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6/', 'author_name': 'Software Heritage', 'author_email': 'robot@softwareheritage.org', 'committer_name': 'Software Heritage', 'committer_email': 'robot@softwareheritage.org', 'message': 'synthetic revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': ['7834ef7e7c357ce2af928115c6c6a42b7e2a4345'], 'parent_urls': [ '/api/1/revision/7834ef7e7c357ce2af928115c6c6a42b7e2a4345/' ], 'type': 'tar', 'synthetic': True, }] # when rv = self.app.get('/api/1/revision/18d8be353ed3480476f0' '32475e7c233eff7371d5/prev/prev-rev/log/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(expected_revisions, response_data) self.assertIsNone(rv.headers.get('Link')) mock_service.lookup_revision_log.assert_called_once_with( '18d8be353ed3480476f032475e7c233eff7371d5', 11) mock_service.lookup_revision_multiple.assert_called_once_with( ['prev-rev']) @patch('swh.web.ui.views.api.service') @istest def api_revision_log_by(self, mock_service): # given stub_revisions = [{ 'id': '18d8be353ed3480476f032475e7c233eff7371d5', 'directory': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'author_name': 'Software Heritage', 'author_email': 'robot@softwareheritage.org', 'committer_name': 'Software Heritage', 'committer_email': 'robot@softwareheritage.org', 'message': 'synthetic revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': ['7834ef7e7c357ce2af928115c6c6a42b7e2a4345'], 'type': 'tar', 'synthetic': True, }] mock_service.lookup_revision_log_by.return_value = stub_revisions expected_revisions = [{ 'id': '18d8be353ed3480476f032475e7c233eff7371d5', 'url': '/api/1/revision/18d8be353ed3480476f032475e7c233eff7371d5/', 'history_url': '/api/1/revision/18d8be353ed3480476f032475e7c233ef' 'f7371d5/log/', 'directory': '7834ef7e7c357ce2af928115c6c6a42b7e2a44e6', 'directory_url': '/api/1/directory/7834ef7e7c357ce2af928115c6c6a' '42b7e2a44e6/', 'author_name': 'Software Heritage', 'author_email': 'robot@softwareheritage.org', 'committer_name': 'Software Heritage', 'committer_email': 'robot@softwareheritage.org', 'message': 'synthetic revision message', 'date_offset': 0, 'committer_date_offset': 0, 'parents': [ '7834ef7e7c357ce2af928115c6c6a42b7e2a4345' ], 'parent_urls': [ '/api/1/revision/7834ef7e7c357ce2af928115c6c6a42b7e2a4345/' ], 'type': 'tar', 'synthetic': True, }] # when rv = self.app.get('/api/1/revision/origin/1/log/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') 
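The log tests with a per_page parameter encode the pagination convention of these endpoints: the view asks the backend for per_page + 1 elements, returns at most per_page of them, and only advertises a next page (via the Link header) when the extra element came back. The log views themselves are not part of this hunk, so the sketch below only illustrates that convention; paginate and fetch_fn are made-up names.

    def paginate(fetch_fn, per_page=10):
        """Fetch one element more than asked for to detect a next page.

        fetch_fn stands for a backend call such as
        service.lookup_revision_log(sha1_git, limit); it takes a limit and
        returns a list.
        """
        results = fetch_fn(per_page + 1)
        has_next = len(results) > per_page
        return results[:per_page], has_next

    # Tiny usage example against a fake log of 26 revisions, mirroring the
    # per_page=25 test: 26 elements are fetched, 25 are returned, and the
    # leftover one signals that a Link header for the next page is needed.
    fake_log = [{'id': i} for i in range(26)]
    page, has_next = paginate(lambda limit: fake_log[:limit], per_page=25)
    assert len(page) == 25 and has_next
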
response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_revisions) self.assertEquals(rv.headers.get('Link'), None) mock_service.lookup_revision_log_by.assert_called_once_with( 1, 'refs/heads/master', None, 11) @patch('swh.web.ui.views.api.service') @istest def api_revision_log_by_with_next(self, mock_service): # given stub_revisions = [] for i in range(27): stub_revisions.append({'id': i}) mock_service.lookup_revision_log_by.return_value = stub_revisions[:26] expected_revisions = [x for x in stub_revisions if x['id'] < 25] for e in expected_revisions: e['url'] = '/api/1/revision/%s/' % e['id'] e['history_url'] = '/api/1/revision/%s/log/' % e['id'] # when rv = self.app.get('/api/1/revision/origin/1/log/?per_page=25') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') self.assertIsNotNone(rv.headers['Link']) response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_revisions) mock_service.lookup_revision_log_by.assert_called_once_with( 1, 'refs/heads/master', None, 26) @patch('swh.web.ui.views.api.service') @istest def api_revision_log_by_norev(self, mock_service): # given mock_service.lookup_revision_log_by.side_effect = NotFoundExc( 'No revision') # when rv = self.app.get('/api/1/revision/origin/1/log/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') self.assertIsNone(rv.headers.get('Link')) response_data = json.loads(rv.data.decode('utf-8')) - self.assertEquals(response_data, {'error': 'No revision'}) + self.assertEquals(response_data, {'exception': 'NotFoundExc', + 'reason': 'No revision'}) mock_service.lookup_revision_log_by.assert_called_once_with( 1, 'refs/heads/master', None, 11) @patch('swh.web.ui.views.api.service') @istest def api_revision_history(self, mock_service): # for readability purposes, we use: # - sha1 as 3 letters (url are way too long otherwise to respect pep8) # - only keys with modification steps (all other keys are kept as is) # given stub_revision = { 'id': '883', 'children': ['777', '999'], 'parents': [], 'directory': '272' } mock_service.lookup_revision.return_value = stub_revision # then rv = self.app.get('/api/1/revision/883/prev/999/') self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { 'id': '883', 'url': '/api/1/revision/883/', 'history_url': '/api/1/revision/883/log/', 'history_context_url': '/api/1/revision/883/prev/999/log/', 'children': ['777', '999'], 'children_urls': ['/api/1/revision/777/', '/api/1/revision/999/'], 'parents': [], 'parent_urls': [], 'directory': '272', 'directory_url': '/api/1/directory/272/' }) mock_service.lookup_revision.assert_called_once_with('883') @patch('swh.web.ui.views.api._revision_directory_by') @istest def api_revision_directory_ko_not_found(self, mock_rev_dir): # given mock_rev_dir.side_effect = NotFoundExc('Not found') # then rv = self.app.get('/api/1/revision/999/directory/some/path/to/dir/') self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Not found'}) + 'exception': 'NotFoundExc', + 'reason': 'Not found'}) mock_rev_dir.assert_called_once_with( {'sha1_git': '999'}, 'some/path/to/dir', '/api/1/revision/999/directory/some/path/to/dir/', with_data=False) 
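The assertions throughout this file switch the API error payload from a single 'error' key to an 'exception'/'reason' pair, with 400 for ValueError and BadInputExc, 404 for NotFoundExc, and 503 for the storage errors. The middleware that builds this envelope is not shown in this diff; the following is only a minimal Flask sketch consistent with what the tests expect (handler names and the exception stand-ins are illustrative, and the application/yaml negotiation is left out).

    from flask import Flask, jsonify

    app = Flask(__name__)

    class NotFoundExc(Exception):
        """Stand-in for swh.web.ui.exc.NotFoundExc."""

    class StorageDBError(Exception):
        """Stand-in for the storage backend error."""

    def error_response(error, status_code):
        # Serialize the exception type and its message the way the tests expect.
        body = {'exception': error.__class__.__name__, 'reason': str(error)}
        response = jsonify(body)
        response.status_code = status_code
        return response

    @app.errorhandler(ValueError)
    def bad_request(error):
        return error_response(error, 400)

    @app.errorhandler(NotFoundExc)
    def not_found(error):
        return error_response(error, 404)

    @app.errorhandler(StorageDBError)
    def backend_error(error):
        # The tests expect a generic prefix in front of the backend message.
        wrapped = StorageDBError(
            'An unexpected error occurred in the backend: %s' % error)
        return error_response(wrapped, 503)
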
@patch('swh.web.ui.views.api._revision_directory_by') @istest def api_revision_directory_ok_returns_dir_entries(self, mock_rev_dir): stub_dir = { 'type': 'dir', 'revision': '999', 'content': [ { 'sha1_git': '789', 'type': 'file', 'target': '101', 'target_url': '/api/1/content/sha1_git:101/', 'name': 'somefile', 'file_url': '/api/1/revision/999/directory/some/path/' 'somefile/' }, { 'sha1_git': '123', 'type': 'dir', 'target': '456', 'target_url': '/api/1/directory/456/', 'name': 'to-subdir', 'dir_url': '/api/1/revision/999/directory/some/path/' 'to-subdir/', }] } # given mock_rev_dir.return_value = stub_dir # then rv = self.app.get('/api/1/revision/999/directory/some/path/') self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, stub_dir) mock_rev_dir.assert_called_once_with( {'sha1_git': '999'}, 'some/path', '/api/1/revision/999/directory/some/path/', with_data=False) @patch('swh.web.ui.views.api._revision_directory_by') @istest def api_revision_directory_ok_returns_content(self, mock_rev_dir): stub_content = { 'type': 'file', 'revision': '999', 'content': { 'sha1_git': '789', 'sha1': '101', 'data_url': '/api/1/content/101/raw/', } } # given mock_rev_dir.return_value = stub_content # then url = '/api/1/revision/666/directory/some/other/path/' rv = self.app.get(url) self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, stub_content) mock_rev_dir.assert_called_once_with( {'sha1_git': '666'}, 'some/other/path', url, with_data=False) @patch('swh.web.ui.views.api.service') @istest def api_person(self, mock_service): # given stub_person = { 'id': '198003', 'name': 'Software Heritage', 'email': 'robot@softwareheritage.org', } mock_service.lookup_person.return_value = stub_person # when rv = self.app.get('/api/1/person/198003/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, stub_person) @patch('swh.web.ui.views.api.service') @istest def api_person_not_found(self, mock_service): # given mock_service.lookup_person.return_value = None # when rv = self.app.get('/api/1/person/666/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Person with id 666 not found.'}) + 'exception': 'NotFoundExc', + 'reason': 'Person with id 666 not found.'}) @patch('swh.web.ui.views.api.service') @istest def api_directory(self, mock_service): # given stub_directories = [ { 'sha1_git': '18d8be353ed3480476f032475e7c233eff7371d5', 'type': 'file', 'target': '4568be353ed3480476f032475e7c233eff737123', }, { 'sha1_git': '1d518d8be353ed3480476f032475e7c233eff737', 'type': 'dir', 'target': '8be353ed3480476f032475e7c233eff737123456', }] expected_directories = [ { 'sha1_git': '18d8be353ed3480476f032475e7c233eff7371d5', 'type': 'file', 'target': '4568be353ed3480476f032475e7c233eff737123', 'target_url': '/api/1/content/' 'sha1_git:4568be353ed3480476f032475e7c233eff737123/', }, { 'sha1_git': '1d518d8be353ed3480476f032475e7c233eff737', 'type': 'dir', 'target': '8be353ed3480476f032475e7c233eff737123456', 'target_url': '/api/1/directory/8be353ed3480476f032475e7c233eff737123456/', }] mock_service.lookup_directory.return_value 
= stub_directories # when rv = self.app.get('/api/1/directory/' '18d8be353ed3480476f032475e7c233eff7371d5/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_directories) mock_service.lookup_directory.assert_called_once_with( '18d8be353ed3480476f032475e7c233eff7371d5') @patch('swh.web.ui.views.api.service') @istest def api_directory_not_found(self, mock_service): # given mock_service.lookup_directory.return_value = [] # when rv = self.app.get('/api/1/directory/' '66618d8be353ed3480476f032475e7c233eff737/') # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'Directory with sha1_git ' + 'exception': 'NotFoundExc', + 'reason': 'Directory with sha1_git ' '66618d8be353ed3480476f032475e7c233eff737 not found.'}) @patch('swh.web.ui.views.api.service') @istest def api_directory_with_path_found(self, mock_service): # given expected_dir = { 'sha1_git': '18d8be353ed3480476f032475e7c233eff7371d5', 'type': 'file', 'name': 'bla', 'target': '4568be353ed3480476f032475e7c233eff737123', 'target_url': '/api/1/content/' 'sha1_git:4568be353ed3480476f032475e7c233eff737123/', } mock_service.lookup_directory_with_path.return_value = expected_dir # when rv = self.app.get('/api/1/directory/' '18d8be353ed3480476f032475e7c233eff7371d5/bla/') # then self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_dir) mock_service.lookup_directory_with_path.assert_called_once_with( '18d8be353ed3480476f032475e7c233eff7371d5', 'bla') @patch('swh.web.ui.views.api.service') @istest def api_directory_with_path_not_found(self, mock_service): # given mock_service.lookup_directory_with_path.return_value = None path = 'some/path/to/dir/' # when rv = self.app.get(('/api/1/directory/' '66618d8be353ed3480476f032475e7c233eff737/%s') % path) path = path.strip('/') # Path stripped of lead/trail separators # then self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': (('Entry with path %s relative to ' - 'directory with sha1_git ' - '66618d8be353ed3480476f032475e7c233eff737 not found.') - % path)}) + 'exception': 'NotFoundExc', + 'reason': (('Entry with path %s relative to ' + 'directory with sha1_git ' + '66618d8be353ed3480476f032475e7c233eff737 not found.') + % path)}) @patch('swh.web.ui.views.api.service') @istest def api_lookup_entity_by_uuid_not_found(self, mock_service): # when mock_service.lookup_entity_by_uuid.return_value = [] # when rv = self.app.get('/api/1/entity/' '5f4d4c51-498a-4e28-88b3-b3e4e8396cba/') self.assertEquals(rv.status_code, 404) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': + 'exception': 'NotFoundExc', + 'reason': "Entity with uuid '5f4d4c51-498a-4e28-88b3-b3e4e8396cba' not " + "found."}) mock_service.lookup_entity_by_uuid.assert_called_once_with( '5f4d4c51-498a-4e28-88b3-b3e4e8396cba') @patch('swh.web.ui.views.api.service') @istest def api_lookup_entity_by_uuid_bad_request(self, mock_service): # when mock_service.lookup_entity_by_uuid.side_effect = BadInputExc( 'bad input: uuid 
malformed!') # when rv = self.app.get('/api/1/entity/uuid malformed/') self.assertEquals(rv.status_code, 400) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, { - 'error': 'bad input: uuid malformed!'}) + 'exception': 'BadInputExc', + 'reason': 'bad input: uuid malformed!'}) mock_service.lookup_entity_by_uuid.assert_called_once_with( 'uuid malformed') @patch('swh.web.ui.views.api.service') @istest def api_lookup_entity_by_uuid(self, mock_service): # when stub_entities = [ { 'uuid': '34bd6b1b-463f-43e5-a697-785107f598e4', 'parent': 'aee991a0-f8d7-4295-a201-d1ce2efc9fb2' }, { 'uuid': 'aee991a0-f8d7-4295-a201-d1ce2efc9fb2' } ] mock_service.lookup_entity_by_uuid.return_value = stub_entities expected_entities = [ { 'uuid': '34bd6b1b-463f-43e5-a697-785107f598e4', 'uuid_url': '/api/1/entity/34bd6b1b-463f-43e5-a697-' '785107f598e4/', 'parent': 'aee991a0-f8d7-4295-a201-d1ce2efc9fb2', 'parent_url': '/api/1/entity/aee991a0-f8d7-4295-a201-' 'd1ce2efc9fb2/' }, { 'uuid': 'aee991a0-f8d7-4295-a201-d1ce2efc9fb2', 'uuid_url': '/api/1/entity/aee991a0-f8d7-4295-a201-' 'd1ce2efc9fb2/' } ] # when rv = self.app.get('/api/1/entity' '/34bd6b1b-463f-43e5-a697-785107f598e4/') self.assertEquals(rv.status_code, 200) self.assertEquals(rv.mimetype, 'application/json') response_data = json.loads(rv.data.decode('utf-8')) self.assertEquals(response_data, expected_entities) mock_service.lookup_entity_by_uuid.assert_called_once_with( '34bd6b1b-463f-43e5-a697-785107f598e4') class ApiUtils(unittest.TestCase): @istest def api_lookup_not_found(self): # when with self.assertRaises(exc.NotFoundExc) as e: api._api_lookup('something', lambda x: None, 'this is the error message raised as it is None') self.assertEqual(e.exception.args[0], 'this is the error message raised as it is None') @istest def api_lookup_with_result(self): # when actual_result = api._api_lookup('something', lambda x: x + '!', 'this is the error which won\'t be ' 'used here') self.assertEqual(actual_result, 'something!') @istest def api_lookup_with_result_as_map(self): # when actual_result = api._api_lookup([1, 2, 3], lambda x: map(lambda y: y+1, x), 'this is the error which won\'t be ' 'used here') self.assertEqual(actual_result, [2, 3, 4]) diff --git a/swh/web/ui/views/api.py b/swh/web/ui/views/api.py index 50665435..433d3d74 100644 --- a/swh/web/ui/views/api.py +++ b/swh/web/ui/views/api.py @@ -1,1096 +1,1096 @@ # Copyright (C) 2015-2017 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information from types import GeneratorType from flask import request, url_for from swh.web.ui import service, utils, apidoc as doc from swh.web.ui.exc import NotFoundExc from swh.web.ui.main import app # canned doc string snippets that are used in several doc strings _doc_arg_content_id = """A "[hash_type:]hash" content identifier, where hash_type is one of "sha1" (the default), "sha1_git", "sha256", and hash is a checksum obtained with the hash_type hashing algorithm.""" _doc_arg_last_elt = 'element to start listing from, for pagination purposes' _doc_arg_per_page = 'number of elements to list, for pagination purposes' _doc_exc_bad_id = 'syntax error in the given identifier(s)' _doc_exc_id_not_found = 'no object matching the given criteria could be found' _doc_ret_revision_meta = 'metadata of the revision identified by 
sha1_git' _doc_ret_revision_log = """list of dictionaries representing the metadata of each revision found in the commit log heading to revision sha1_git. For each commit at least the following information are returned: author/committer, authoring/commit timestamps, revision id, commit message, parent (i.e., immediately preceding) commits, "root" directory id.""" _doc_header_link = """indicates that a subsequent result page is available, pointing to it""" @app.route('/api/1/stat/counters/') @doc.route('/api/1/stat/counters/', noargs=True) @doc.returns(rettype=doc.rettypes.dict, retdoc="""dictionary mapping object types to the amount of corresponding objects currently available in the archive""") def api_stats(): """Get statistics about the content of the archive. """ return service.stat_counters() @app.route('/api/1/origin/<int:origin_id>/visits/') @doc.route('/api/1/origin/visits/') @doc.arg('origin_id', default=1, argtype=doc.argtypes.int, argdoc='software origin identifier') @doc.header('Link', doc=_doc_header_link) @doc.param('last_visit', default=None, argtype=doc.argtypes.int, doc=_doc_arg_last_elt) @doc.param('per_page', default=10, argtype=doc.argtypes.int, doc=_doc_arg_per_page) @doc.returns(rettype=doc.rettypes.list, retdoc="""a list of dictionaries describing individual visits. For each visit, its identifier, timestamp (as UNIX time), outcome, and visit-specific URL for more information are given.""") def api_origin_visits(origin_id): """Get information about all visits of a given software origin. """ result = {} per_page = int(request.args.get('per_page', '10')) last_visit = request.args.get('last_visit') if last_visit: last_visit = int(last_visit) def _lookup_origin_visits( origin_id, last_visit=last_visit, per_page=per_page): return service.lookup_origin_visits( origin_id, last_visit=last_visit, per_page=per_page) def _enrich_origin_visit(origin_visit): ov = origin_visit.copy() ov['origin_visit_url'] = url_for('api_origin_visit', origin_id=ov['origin'], visit_id=ov['visit']) return ov r = _api_lookup( origin_id, _lookup_origin_visits, error_msg_if_not_found='No origin %s found' % origin_id, enrich_fn=_enrich_origin_visit) if r: l = len(r) if l == per_page: new_last_visit = r[-1]['visit'] params = { 'origin_id': origin_id, 'last_visit': new_last_visit } if request.args.get('per_page'): params['per_page'] = per_page result['headers'] = { 'link-next': url_for('api_origin_visits', **params) } result.update({ 'results': r }) return result @app.route('/api/1/origin/<int:origin_id>/visit/<int:visit_id>/') @doc.route('/api/1/origin/visit/') @doc.arg('origin_id', default=1, argtype=doc.argtypes.int, argdoc='software origin identifier') @doc.arg('visit_id', default=1, argtype=doc.argtypes.int, argdoc="""visit identifier, relative to the origin identified by origin_id""") @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc="""dictionary containing both metadata for the entire visit (e.g., timestamp as UNIX time, visit outcome, etc.) and what was at the software origin during the visit (i.e., a mapping from - branches to other archive objects""") + branches to other archive objects)""") def api_origin_visit(origin_id, visit_id): """Get information about a specific visit of a software origin.
""" def _enrich_origin_visit(origin_visit): ov = origin_visit.copy() ov['origin_url'] = url_for('api_origin', origin_id=ov['origin']) if 'occurrences' in ov: ov['occurrences'] = { k: utils.enrich_object(v) for k, v in ov['occurrences'].items() } return ov return _api_lookup( origin_id, service.lookup_origin_visit, 'No visit %s for origin %s found' % (visit_id, origin_id), _enrich_origin_visit, visit_id) @app.route('/api/1/content/symbol/', methods=['POST']) @app.route('/api/1/content/symbol//') @doc.route('/api/1/content/symbol/', tags=['upcoming']) @doc.arg('q', default='hello', argtype=doc.argtypes.str, argdoc="""An expression string to lookup in swh's raw content""") @doc.header('Link', doc=_doc_header_link) @doc.param('last_sha1', default=None, argtype=doc.argtypes.str, doc=_doc_arg_last_elt) @doc.param('per_page', default=10, argtype=doc.argtypes.int, doc=_doc_arg_per_page) @doc.returns(rettype=doc.rettypes.list, retdoc="""A list of dict whose content matches the expression. Each dict has the following keys: - id (bytes): identifier of the content - name (text): symbol whose content match the expression - kind (text): kind of the symbol that matched - lang (text): Language for that entry - line (int): Number line for the symbol """) def api_content_symbol(q=None): """Search content objects by `Ctags `_-style symbol (e.g., function name, data type, method, ...). """ result = {} last_sha1 = request.args.get('last_sha1', None) per_page = int(request.args.get('per_page', '10')) def lookup_exp(exp, last_sha1=last_sha1, per_page=per_page): return service.lookup_expression(exp, last_sha1, per_page) symbols = _api_lookup( q, lookup_fn=lookup_exp, error_msg_if_not_found='No indexed raw content match expression \'' '%s\'.' % q, enrich_fn=lambda x: utils.enrich_content(x, top_url=True)) if symbols: l = len(symbols) if l == per_page: new_last_sha1 = symbols[-1]['sha1'] params = { 'q': q, 'last_sha1': new_last_sha1, } if request.args.get('per_page'): params['per_page'] = per_page result['headers'] = { 'link-next': url_for('api_content_symbol', **params), } result.update({ 'results': symbols }) return result @app.route('/api/1/content/known/', methods=['POST']) @app.route('/api/1/content/known//') @doc.route('/api/1/content/known/', tags=['hidden']) @doc.arg('q', default='adc83b19e793491b1c6ea0fd8b46cd9f32e592fc', argtype=doc.argtypes.sha1, argdoc='content identifier as a sha1 checksum') # @doc.param('q', default=None, # argtype=doc.argtypes.str, # doc="""(POST request) An algo_hash:hash string, where algo_hash # is one of sha1, sha1_git or sha256 and hash is the hash to # search for in SWH""") @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.returns(rettype=doc.rettypes.dict, retdoc="""a dictionary with results (found/not found for each given identifier) and statistics about how many identifiers were found""") def api_check_content_known(q=None): """Check whether some content (AKA "blob") is present in the archive. 

    Lookup can be performed by various means:

    - a GET request with one or several hashes, separated by ','
    - a POST request with one or several hashes, passed as (multiple) values
      for parameter 'q'

    """
    response = {'search_res': None,
                'search_stats': None}
    search_stats = {'nbfiles': 0, 'pct': 0}
    search_res = None

    queries = []
    # GET: Many hash separated values request
    if q:
        hashes = q.split(',')
        for v in hashes:
            queries.append({'filename': None, 'sha1': v})

    # POST: Many hash requests in post form submission
    elif request.method == 'POST':
        data = request.form
        # Remove potential inputs with no associated value
        for k, v in data.items():
            if v is not None:
                if k == 'q' and len(v) > 0:
                    queries.append({'filename': None, 'sha1': v})
                elif v != '':
                    queries.append({'filename': k, 'sha1': v})

    if queries:
        lookup = service.lookup_multiple_hashes(queries)
        result = []
        l = len(queries)
        for el in lookup:
            res_d = {'sha1': el['sha1'],
                     'found': el['found']}
            if 'filename' in el and el['filename']:
                res_d['filename'] = el['filename']
            result.append(res_d)
        search_res = result
        nbfound = len([x for x in lookup if x['found']])
        search_stats['nbfiles'] = l
        search_stats['pct'] = (nbfound / l) * 100

    response['search_res'] = search_res
    response['search_stats'] = search_stats
    return response


def _api_lookup(criteria, lookup_fn, error_msg_if_not_found,
                enrich_fn=lambda x: x, *args):
    """Capture a redundant behavior of:
    - looking up the backend with a criteria (be it an identifier or
      checksum) passed to the function lookup_fn
    - if nothing is found, raise a NotFoundExc exception with error message
      error_msg_if_not_found.
    - Otherwise if something is returned:
        - either as list, map or generator, map the enrich_fn function to it
          and return the resulting data structure as list.
        - either as dict and pass to enrich_fn and return the dict enriched.

    Args:
        - criteria: discriminating criteria to lookup
        - lookup_fn: function expecting one criteria and optional
          supplementary *args.
        - error_msg_if_not_found: if nothing matching the criteria is found,
          raise NotFoundExc with this error message.
        - enrich_fn: Function to use to enrich the result returned by
          lookup_fn. Default to the identity function if not provided.
        - *args: supplementary arguments to pass to lookup_fn.

    Raises:
        NotFoundExc or whatever `lookup_fn` raises.

    """
    res = lookup_fn(criteria, *args)
    if not res:
        raise NotFoundExc(error_msg_if_not_found)
    if isinstance(res, (map, list, GeneratorType)):
        return [enrich_fn(x) for x in res]
    return enrich_fn(res)
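_api_lookup factors out the lookup / not-found / enrich pattern shared by most endpoints below. A self-contained sketch of the calling convention, using a made-up "widget" lookup rather than any real service function:

# Illustrative only: a made-up "widget" endpoint showing the _api_lookup
# calling convention used by the real views (lookup_fn, error message,
# enrich_fn, then extra positional arguments forwarded to lookup_fn).

def _fake_lookup_widget(widget_id, flavour=None):
    # Stand-in for a service.lookup_* call; returns None when nothing matches.
    widgets = {1: {'id': 1, 'name': 'frobnicator'}}
    return widgets.get(widget_id)


def _enrich_widget(widget):
    w = widget.copy()
    w['widget_url'] = '/api/1/widget/%d/' % w['id']
    return w


def api_widget(widget_id):
    # Returns the enriched dict, or raises NotFoundExc with the message below.
    return _api_lookup(
        widget_id,
        lookup_fn=_fake_lookup_widget,
        error_msg_if_not_found='Widget %s not found.' % widget_id,
        enrich_fn=_enrich_widget)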
""" ori_dict = { 'id': origin_id, 'type': origin_type, 'url': origin_url } ori_dict = {k: v for k, v in ori_dict.items() if ori_dict[k]} if 'id' in ori_dict: error_msg = 'Origin with id %s not found.' % ori_dict['id'] else: error_msg = 'Origin with type %s and URL %s not found' % ( ori_dict['type'], ori_dict['url']) def _enrich_origin(origin): if 'id' in origin: o = origin.copy() o['origin_visits_url'] = url_for('api_origin_visits', origin_id=o['id']) return o return origin return _api_lookup( ori_dict, lookup_fn=service.lookup_origin, error_msg_if_not_found=error_msg, enrich_fn=_enrich_origin) @app.route('/api/1/person//') @doc.route('/api/1/person/') @doc.arg('person_id', default=42, argtype=doc.argtypes.int, argdoc='person identifier') @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc='The metadata of the person identified by person_id') def api_person(person_id): """Get information about a person. """ return _api_lookup( person_id, lookup_fn=service.lookup_person, error_msg_if_not_found='Person with id %s not found.' % person_id) @app.route('/api/1/release//') @doc.route('/api/1/release/') @doc.arg('sha1_git', default='7045404f3d1c54e6473c71bbb716529fbad4be24', argtype=doc.argtypes.sha1_git, argdoc='release identifier') @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc='The metadata of the release identified by sha1_git') def api_release(sha1_git): """Get information about a release. Releases are identified by SHA1 checksums, compatible with Git tag identifiers. See ``release_identifier`` in our `data model module `_ for details about how they are computed. """ error_msg = 'Release with sha1_git %s not found.' % sha1_git return _api_lookup( sha1_git, lookup_fn=service.lookup_release, error_msg_if_not_found=error_msg, enrich_fn=utils.enrich_release) def _revision_directory_by(revision, path, request_path, limit=100, with_data=False): """Compute the revision matching criterion's directory or content data. Args: revision: dictionary of criterions representing a revision to lookup path: directory's path to lookup request_path: request path which holds the original context to limit: optional query parameter to limit the revisions log (default to 100). For now, note that this limit could impede the transitivity conclusion about sha1_git not being an ancestor of with_data: indicate to retrieve the content's raw data if path resolves to a content. 
""" def enrich_directory_local(dir, context_url=request_path): return utils.enrich_directory(dir, context_url) rev_id, result = service.lookup_directory_through_revision( revision, path, limit=limit, with_data=with_data) content = result['content'] if result['type'] == 'dir': # dir_entries result['content'] = list(map(enrich_directory_local, content)) else: # content result['content'] = utils.enrich_content(content) return result @app.route('/api/1/revision' '/origin/' '/directory/') @app.route('/api/1/revision' '/origin/' '/directory//') @app.route('/api/1/revision' '/origin/' '/branch/' '/directory/') @app.route('/api/1/revision' '/origin/' '/branch/' '/directory//') @app.route('/api/1/revision' '/origin/' '/branch/' '/ts/' '/directory/') @app.route('/api/1/revision' '/origin/' '/branch/' '/ts/' '/directory//') @doc.route('/api/1/revision/origin/directory/', tags=['hidden']) @doc.arg('origin_id', default=1, argtype=doc.argtypes.int, argdoc="The revision's origin's SWH identifier") @doc.arg('branch_name', default='refs/heads/master', argtype=doc.argtypes.path, argdoc="""The optional branch for the given origin (default to master""") @doc.arg('ts', default='2000-01-17T11:23:54+00:00', argtype=doc.argtypes.ts, argdoc="""Optional timestamp (default to the nearest time crawl of timestamp)""") @doc.arg('path', default='Dockerfile', argtype=doc.argtypes.path, argdoc='The path to the directory or file to display') @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc="""The metadata of the revision corresponding to the given criteria""") def api_directory_through_revision_origin(origin_id, branch_name="refs/heads/master", ts=None, path=None, with_data=False): """Display directory or content information through a revision identified by origin/branch/timestamp. """ if ts: ts = utils.parse_timestamp(ts) return _revision_directory_by( { 'origin_id': origin_id, 'branch_name': branch_name, 'ts': ts }, path, request.path, with_data=with_data) @app.route('/api/1/revision' '/origin//') @app.route('/api/1/revision' '/origin/' '/branch//') @app.route('/api/1/revision' '/origin/' '/branch/' '/ts//') @app.route('/api/1/revision' '/origin/' '/ts//') @doc.route('/api/1/revision/origin/') @doc.arg('origin_id', default=1, argtype=doc.argtypes.int, argdoc='software origin identifier') @doc.arg('branch_name', default='refs/heads/master', argtype=doc.argtypes.path, argdoc="""(optional) fully-qualified branch name, e.g., "refs/heads/master". Defaults to the master branch.""") @doc.arg('ts', default=None, argtype=doc.argtypes.ts, argdoc="""(optional) timestamp close to which the revision pointed by the given branch should be looked up. Defaults to now.""") @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc=_doc_ret_revision_meta) def api_revision_with_origin(origin_id, branch_name="refs/heads/master", ts=None): """Get information about a revision, searching for it based on software origin, branch name, and/or visit timestamp. This endpoint behaves like ``/revision``, but operates on the revision that has been found at a given software origin, close to a given point in time, pointed by a given branch. """ ts = utils.parse_timestamp(ts) return _api_lookup( origin_id, service.lookup_revision_by, 'Revision with (origin_id: %s, branch_name: %s' ', ts: %s) not found.' 
@app.route('/api/1/revision/<string:sha1_git>/prev/<path:context>/')
@doc.route('/api/1/revision/prev/', tags=['hidden'])
@doc.arg('sha1_git',
         default='ec72c666fb345ea5f21359b7bc063710ce558e39',
         argtype=doc.argtypes.sha1_git,
         argdoc="The revision's sha1_git identifier")
@doc.arg('context',
         default='6adc4a22f20bbf3bbc754f1ec8c82be5dfb5c71a',
         argtype=doc.argtypes.path,
         argdoc='The navigation breadcrumbs -- use at your own risk')
@doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id)
@doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found)
@doc.returns(rettype=doc.rettypes.dict,
             retdoc='The metadata of the revision identified by sha1_git')
def api_revision_with_context(sha1_git, context):
    """Return information about revision with id sha1_git.

    """
    def _enrich_revision(revision, context=context):
        return utils.enrich_revision(revision, context)

    return _api_lookup(
        sha1_git,
        service.lookup_revision,
        'Revision with sha1_git %s not found.' % sha1_git,
        _enrich_revision)


@app.route('/api/1/revision/<string:sha1_git>/')
@doc.route('/api/1/revision/')
@doc.arg('sha1_git',
         default='aafb16d69fd30ff58afdd69036a26047f3aebdc6',
         argtype=doc.argtypes.sha1_git,
         argdoc="revision identifier")
@doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id)
@doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found)
@doc.returns(rettype=doc.rettypes.dict, retdoc=_doc_ret_revision_meta)
def api_revision(sha1_git):
    """Get information about a revision.

    Revisions are identified by SHA1 checksums, compatible with Git commit
    identifiers. See ``revision_identifier`` in our `data model module `_
    for details about how they are computed.

    """
    return _api_lookup(
        sha1_git,
        service.lookup_revision,
        'Revision with sha1_git %s not found.' % sha1_git,
        utils.enrich_revision)


@app.route('/api/1/revision/<string:sha1_git>/raw/')
@doc.route('/api/1/revision/raw/', tags=['hidden'])
@doc.arg('sha1_git',
         default='ec72c666fb345ea5f21359b7bc063710ce558e39',
         argtype=doc.argtypes.sha1_git,
         argdoc="The queried revision's sha1_git identifier")
@doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id)
@doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found)
@doc.returns(rettype=doc.rettypes.octet_stream,
             retdoc="""The message of the revision identified by sha1_git
             as a downloadable octet stream""")
def api_revision_raw_message(sha1_git):
    """Return the raw data of the message of revision identified by sha1_git

    """
    raw = service.lookup_revision_message(sha1_git)
    return app.response_class(raw['message'],
                              headers={'Content-disposition': 'attachment;'
                                       'filename=rev_%s_raw' % sha1_git},
                              mimetype='application/octet-stream')


@app.route('/api/1/revision/<string:sha1_git>/directory/')
@app.route('/api/1/revision/<string:sha1_git>/directory/<path:dir_path>/')
@doc.route('/api/1/revision/directory/')
@doc.arg('sha1_git',
         default='ec72c666fb345ea5f21359b7bc063710ce558e39',
         argtype=doc.argtypes.sha1_git,
         argdoc='revision identifier')
@doc.arg('dir_path',
         default='Documentation/BUG-HUNTING',
         argtype=doc.argtypes.path,
         argdoc="""path relative to the root directory of revision identified
         by sha1_git""")
@doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id)
@doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found)
@doc.returns(rettype=doc.rettypes.dict,
             retdoc="""either a list of directory entries with their metadata,
             or the metadata of a single directory entry""")
def api_revision_directory(sha1_git,
                           dir_path=None,
                           with_data=False):
    """Get information about directory (entry) objects associated to
    revisions.

    Each revision is associated to a single "root" directory.

    This endpoint behaves like ``/directory/``, but operates on the root
    directory associated to a given revision.

    """
    return _revision_directory_by(
        {
            'sha1_git': sha1_git
        },
        dir_path,
        request.path,
        with_data=with_data)


@app.route('/api/1/revision/<string:sha1_git>/log/')
@app.route('/api/1/revision/<string:sha1_git>/prev/<path:prev_sha1s>/log/')
@doc.route('/api/1/revision/log/')
@doc.arg('sha1_git',
         default='37fc9e08d0c4b71807a4f1ecb06112e78d91c283',
         argtype=doc.argtypes.sha1_git,
         argdoc='revision identifier')
# @doc.arg('prev_sha1s',
#          default='6adc4a22f20bbf3bbc754f1ec8c82be5dfb5c71a',
#          argtype=doc.argtypes.path,
#          argdoc="""(Optional) Navigation breadcrumbs (descendant revisions
#          previously visited). If multiple values, use / as delimiter. """)
@doc.header('Link', doc=_doc_header_link)
@doc.param('per_page', default=10,
           argtype=doc.argtypes.int,
           doc=_doc_arg_per_page)
@doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id)
@doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found)
@doc.returns(rettype=doc.rettypes.dict, retdoc=_doc_ret_revision_log)
def api_revision_log(sha1_git, prev_sha1s=None):
    """Get a list of all revisions heading to a given one, i.e., show the
    commit log.

    """
    result = {}
    per_page = int(request.args.get('per_page', '10'))

    def lookup_revision_log_with_limit(s, limit=per_page+1):
        return service.lookup_revision_log(s, limit)

    error_msg = 'Revision with sha1_git %s not found.' % sha1_git
    rev_get = _api_lookup(sha1_git,
                          lookup_fn=lookup_revision_log_with_limit,
                          error_msg_if_not_found=error_msg,
                          enrich_fn=utils.enrich_revision)

    l = len(rev_get)
    if l == per_page+1:
        rev_backward = rev_get[:-1]
        new_last_sha1 = rev_get[-1]['id']
        params = {
            'sha1_git': new_last_sha1,
        }

        if request.args.get('per_page'):
            params['per_page'] = per_page

        result['headers'] = {
            'link-next': url_for('api_revision_log', **params)
        }

    else:
        rev_backward = rev_get

    if not prev_sha1s:  # no nav breadcrumbs, so we're done
        revisions = rev_backward

    else:
        rev_forward_ids = prev_sha1s.split('/')
        rev_forward = _api_lookup(rev_forward_ids,
                                  lookup_fn=service.lookup_revision_multiple,
                                  error_msg_if_not_found=error_msg,
                                  enrich_fn=utils.enrich_revision)
        revisions = rev_forward + rev_backward

    result.update({
        'results': revisions
    })

    return result
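The log endpoints fetch per_page + 1 revisions and, when the extra element is present, reuse its identifier as the cursor for the next page. A standalone sketch of that windowing idea on a made-up revision list (no swh code involved):

# Illustrative sketch of the "fetch one extra element as next-page cursor"
# pattern used by the log endpoints above; the data and helper are made up.

def fetch_log_window(log, start_id, per_page):
    """Return (page, next_cursor) over a list of {'id': ...} entries."""
    start = next(i for i, rev in enumerate(log) if rev['id'] == start_id)
    window = log[start:start + per_page + 1]      # per_page + 1 elements
    if len(window) == per_page + 1:
        return window[:-1], window[-1]['id']      # the extra one is the cursor
    return window, None                           # no further page


log = [{'id': 'rev%d' % i} for i in range(25)]
page, cursor = fetch_log_window(log, 'rev0', per_page=10)
while cursor:
    print([r['id'] for r in page])
    page, cursor = fetch_log_window(log, cursor, per_page=10)
print([r['id'] for r in page])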
""" result = {} per_page = int(request.args.get('per_page', '10')) if ts: ts = utils.parse_timestamp(ts) def lookup_revision_log_by_with_limit(o_id, br, ts, limit=per_page+1): return service.lookup_revision_log_by(o_id, br, ts, limit) error_msg = 'No revision matching origin %s ' % origin_id error_msg += ', branch name %s' % branch_name error_msg += (' and time stamp %s.' % ts) if ts else '.' rev_get = _api_lookup(origin_id, lookup_revision_log_by_with_limit, error_msg, utils.enrich_revision, branch_name, ts) l = len(rev_get) if l == per_page+1: revisions = rev_get[:-1] last_sha1_git = rev_get[-1]['id'] params = { 'origin_id': origin_id, 'branch_name': branch_name, 'ts': ts, 'sha1_git': last_sha1_git, } if request.args.get('per_page'): params['per_page'] = per_page result['headers'] = { 'link-next': url_for('api_revision_log_by', **params), } else: revisions = rev_get result.update({'results': revisions}) return result @app.route('/api/1/directory//') @app.route('/api/1/directory///') @doc.route('/api/1/directory/') @doc.arg('sha1_git', default='1bd0e65f7d2ff14ae994de17a1e7fe65111dcad8', argtype=doc.argtypes.sha1_git, argdoc='directory identifier') @doc.arg('path', default='codec/demux', argtype=doc.argtypes.path, argdoc='path relative to directory identified by sha1_git') @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc="""either a list of directory entries with their metadata, or the metadata of a single directory entry""") def api_directory(sha1_git, path=None): """Get information about directory or directory entry objects. Directories are identified by SHA1 checksums, compatible with Git directory identifiers. See ``directory_identifier`` in our `data model module `_ for details about how they are computed. When given only a directory identifier, this endpoint returns information about the directory itself, returning its content (usually a list of directory entries). When given a directory identifier and a path, this endpoint returns information about the directory entry pointed by the relative path, starting path resolution from the given directory. """ if path: error_msg_path = ('Entry with path %s relative to directory ' 'with sha1_git %s not found.') % (path, sha1_git) return _api_lookup( sha1_git, service.lookup_directory_with_path, error_msg_path, utils.enrich_directory, path) else: error_msg_nopath = 'Directory with sha1_git %s not found.' % sha1_git return _api_lookup( sha1_git, service.lookup_directory, error_msg_nopath, utils.enrich_directory) @app.route('/api/1/content//provenance/') @doc.route('/api/1/content/provenance/', tags=['hidden']) @doc.arg('q', default='sha1_git:88b9b366facda0b5ff8d8640ee9279bed346f242', argtype=doc.argtypes.algo_and_hash, argdoc=_doc_arg_content_id) @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc="""List of provenance information (dict) for the matched content.""") def api_content_provenance(q): """Return content's provenance information if any. 
""" def _enrich_revision(provenance): p = provenance.copy() p['revision_url'] = url_for('api_revision', sha1_git=provenance['revision']) p['content_url'] = url_for('api_content_metadata', q='sha1_git:%s' % provenance['content']) p['origin_url'] = url_for('api_origin', origin_id=provenance['origin']) p['origin_visits_url'] = url_for('api_origin_visits', origin_id=provenance['origin']) p['origin_visit_url'] = url_for('api_origin_visit', origin_id=provenance['origin'], visit_id=provenance['visit']) return p return _api_lookup( q, lookup_fn=service.lookup_content_provenance, error_msg_if_not_found='Content with %s not found.' % q, enrich_fn=_enrich_revision) @app.route('/api/1/content//filetype/') @doc.route('/api/1/content/filetype/', tags=['upcoming']) @doc.arg('q', default='sha1:1fc6129a692e7a87b5450e2ba56e7669d0c5775d', argtype=doc.argtypes.algo_and_hash, argdoc=_doc_arg_content_id) @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc="""Filetype information (dict) for the matched content.""") def api_content_filetype(q): """Get information about the detected MIME type of a content object. """ return _api_lookup( q, lookup_fn=service.lookup_content_filetype, error_msg_if_not_found='No filetype information found ' 'for content %s.' % q, enrich_fn=utils.enrich_metadata_endpoint) @app.route('/api/1/content//language/') @doc.route('/api/1/content/language/', tags=['upcoming']) @doc.arg('q', default='sha1:1fc6129a692e7a87b5450e2ba56e7669d0c5775d', argtype=doc.argtypes.algo_and_hash, argdoc=_doc_arg_content_id) @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc="""Language information (dict) for the matched content.""") def api_content_language(q): """Get information about the detected (programming) language of a content object. """ return _api_lookup( q, lookup_fn=service.lookup_content_language, error_msg_if_not_found='No language information found ' 'for content %s.' % q, enrich_fn=utils.enrich_metadata_endpoint) @app.route('/api/1/content//license/') @doc.route('/api/1/content/license/', tags=['upcoming']) @doc.arg('q', default='sha1:1fc6129a692e7a87b5450e2ba56e7669d0c5775d', argtype=doc.argtypes.algo_and_hash, argdoc=_doc_arg_content_id) @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc="""License information (dict) for the matched content.""") def api_content_license(q): """Get information about the detected license of a content object. """ return _api_lookup( q, lookup_fn=service.lookup_content_license, error_msg_if_not_found='No license information found ' 'for content %s.' % q, enrich_fn=utils.enrich_metadata_endpoint) @app.route('/api/1/content//ctags/') @doc.route('/api/1/content/ctags/', tags=['upcoming']) @doc.arg('q', default='sha1:1fc6129a692e7a87b5450e2ba56e7669d0c5775d', argtype=doc.argtypes.algo_and_hash, argdoc=_doc_arg_content_id) @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc="""Ctags symbol (dict) for the matched content.""") def api_content_ctags(q): """Get information about all `Ctags `_-style symbols defined in a content object. 
""" return _api_lookup( q, lookup_fn=service.lookup_content_ctags, error_msg_if_not_found='No ctags symbol found ' 'for content %s.' % q, enrich_fn=utils.enrich_metadata_endpoint) @app.route('/api/1/content//raw/') @doc.route('/api/1/content/raw/', tags=['upcoming']) @doc.arg('q', default='adc83b19e793491b1c6ea0fd8b46cd9f32e592fc', argtype=doc.argtypes.algo_and_hash, argdoc=_doc_arg_content_id) @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.octet_stream, retdoc='The raw content data as an octet stream') def api_content_raw(q): """Get the raw content of a content object (AKA "blob"), as a byte sequence. """ def generate(content): yield content['data'] content = service.lookup_content_raw(q) if not content: raise NotFoundExc('Content with %s not found.' % q) return app.response_class(generate(content), headers={'Content-disposition': 'attachment;' 'filename=content_%s_raw' % q}, mimetype='application/octet-stream') @app.route('/api/1/content//') @doc.route('/api/1/content/') @doc.arg('q', default='adc83b19e793491b1c6ea0fd8b46cd9f32e592fc', argtype=doc.argtypes.algo_and_hash, argdoc=_doc_arg_content_id) @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc="""known metadata for content identified by q""") def api_content_metadata(q): """Get information about a content (AKA "blob") object. """ return _api_lookup( q, lookup_fn=service.lookup_content, error_msg_if_not_found='Content with %s not found.' % q, enrich_fn=utils.enrich_content) @app.route('/api/1/entity//') @doc.route('/api/1/entity/', tags=['hidden']) @doc.arg('uuid', default='5f4d4c51-498a-4e28-88b3-b3e4e8396cba', argtype=doc.argtypes.uuid, argdoc="The entity's uuid identifier") @doc.raises(exc=doc.excs.badinput, doc=_doc_exc_bad_id) @doc.raises(exc=doc.excs.notfound, doc=_doc_exc_id_not_found) @doc.returns(rettype=doc.rettypes.dict, retdoc='The metadata of the entity identified by uuid') def api_entity_by_uuid(uuid): """Return content information if content is found. """ return _api_lookup( uuid, lookup_fn=service.lookup_entity_by_uuid, error_msg_if_not_found="Entity with uuid '%s' not found." % uuid, enrich_fn=utils.enrich_entity) diff --git a/version.txt b/version.txt index a06922cd..c2e3d963 100644 --- a/version.txt +++ b/version.txt @@ -1 +1 @@ -v0.0.73-0-g791a368 \ No newline at end of file +v0.0.74-0-g332df6f \ No newline at end of file