diff --git a/PKG-INFO b/PKG-INFO index e596ca18..d18c3d32 100644 --- a/PKG-INFO +++ b/PKG-INFO @@ -1,127 +1,127 @@ Metadata-Version: 2.1 Name: swh.web -Version: 0.0.177 +Version: 0.0.178 Summary: Software Heritage Web UI Home-page: https://forge.softwareheritage.org/diffusion/DWUI/ Author: Software Heritage developers Author-email: swh-devel@inria.fr License: UNKNOWN Project-URL: Funding, https://www.softwareheritage.org/donate -Project-URL: Source, https://forge.softwareheritage.org/source/swh-web Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest +Project-URL: Source, https://forge.softwareheritage.org/source/swh-web Description: # swh-web This repository holds the development of Software Heritage web applications: * swh-web API (https://archive.softwareheritage.org/api): enables to query the content of the archive through HTTP requests and get responses in JSON or YAML. * swh-web browse (https://archive.softwareheritage.org/browse): graphical interface that eases the navigation in the archive. Documentation about how to use these components but also the details of their URI schemes can be found in the docs folder. The produced HTML documentation can be read and browsed at https://docs.softwareheritage.org/devel/swh-web/index.html. ## Technical details Those applications are powered by: * [Django Web Framework](https://www.djangoproject.com/) on the backend side with the following extensions enabled: * [django-rest-framework](http://www.django-rest-framework.org/) * [django-webpack-loader](https://github.com/owais/django-webpack-loader) * [django-js-reverse](http://django-js-reverse.readthedocs.io/en/latest/) * [webpack](https://webpack.js.org/) on the frontend side for better static assets management, including: * assets dependencies management and retrieval through [npm](https://www.npmjs.com/) * linting of custom javascript code (through [eslint](https://eslint.org/)) and stylesheets (through [stylelint](https://stylelint.io/)) * use of [es6](http://es6-features.org) syntax and advanced javascript feature like [async/await](https://javascript.info/async-await) or [fetch](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) thanks to [babel](https://babeljs.io/) (es6 to es5 transpiler and polyfills provider) * assets minification (using [UglifyJS](https://github.com/mishoo/UglifyJS2) and [cssnano](http://cssnano.co/)) but also dead code elimination for production use ## How to build and run ### Requirements First you will need [Python 3](https://www.python.org) and a complete [swh development environment](https://forge.softwareheritage.org/source/swh-environment/) installed. To run the backend, you need to have the following Python 3 modules installed: * beautifulsoup4 * django >= 1.10.7 * djangorestframework >= 3.4.0 * django_webpack_loader * django_js_reverse * docutils * file_magic >= 0.3.0 * htmlmin * lxml * pygments * pypandoc * python-dateutil * pyyaml * requests To compile the frontend assets, you need to have [nodejs](https://nodejs.org/en/) >= 8.x (preferably version 8.x LTS) and [npm](https://www.npmjs.com/) (or [yarn](https://yarnpkg.com/en/)) installed. If you are on Debian stretch, you can easily install an up to date nodejs/npm from the stretch-backports repository or by following the instructions located at https://github.com/nodesource/distributions. 
Once you have installed nodejs, issue the following command in the root directory of swh-web in order to retrieve all the frontend dependencies: ``` $ npm install ``` or if you prefer to use yarn (faster than npm): ``` $ yarn install ``` Please note that the static assets bundles generated by webpack are not stored in the git repository. Follow the instructions below in order to generate them in order to be able to run the frontend part of the web applications. ### Make targets Below is the list of available make targets that can be executed from the root directory of swh-web in order to build and/or execute the web applications under various configurations: * **run-django-webpack-devserver**: Compile and serve not optimized (without mignification and dead code elimination) frontend static assets using [webpack-dev-server](https://github.com/webpack/webpack-dev-server) and run django server with development settings. This is the recommended target to use when developing swh-web as it enables automatic reloading of backend and frontend part of the applications when modifying source files (*.py, *.js, *.css, *.html). * **run-django-webpack-dev**: Compile not optimized (no minification, no dead code elimination) frontend static assets using webpack and run django server with development settings. This is the recommended target when one only wants to develop the backend side of the application. * **run-django-webpack-prod**: Compile optimized (with minification and dead code elimination) frontend static assets using webpack and run django server with production settings. This is useful to test the applications in production mode (with the difference that static assets are served by django). Production settings notably enable advanced django caching and you will need to have [memcached](https://memcached.org/) installed for that feature to work. * **run-django-server-dev**: Run the django server with development settings but without compiling frontend static assets through webpack. * **run-django-server-prod**: Run the django server with production settings but without compiling frontend static assets through webpack. * **run-gunicorn-server**: Run the web applications with production settings in a [gunicorn](http://gunicorn.org/) worker as they will be in real production environment. Once one of these targets executed, the web applications can be executed by pointing your browser to http://localhost:5004. ### Npm/Yarn targets Below is a list of available npm/yarn targets in order to only execute the frontend static assets compilation (no web server will be executed): * **build-dev**: compile not optimized (without mignification and dead code elimination) frontend static assets and store the results in the `swh/web/static` folder. * **build**: compile optimized (with mignification and dead code elimination) frontend static assets and store the results in the `swh/web/static` folder. **The build target must be executed prior performing the Debian packaging of swh-web** in order for the package to contain the optimized assets dedicated to production environment. 
To execute these targets, issue the following commmand: ``` $ npm run ``` or if you prefer using yarn: ``` $ yarn ``` Platform: UNKNOWN Classifier: Programming Language :: Python :: 3 Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+) Classifier: Operating System :: OS Independent Classifier: Development Status :: 5 - Production/Stable Classifier: Framework :: Django Description-Content-Type: text/markdown Provides-Extra: testing diff --git a/swh.web.egg-info/PKG-INFO b/swh.web.egg-info/PKG-INFO index e596ca18..d18c3d32 100644 --- a/swh.web.egg-info/PKG-INFO +++ b/swh.web.egg-info/PKG-INFO @@ -1,127 +1,127 @@ Metadata-Version: 2.1 Name: swh.web -Version: 0.0.177 +Version: 0.0.178 Summary: Software Heritage Web UI Home-page: https://forge.softwareheritage.org/diffusion/DWUI/ Author: Software Heritage developers Author-email: swh-devel@inria.fr License: UNKNOWN Project-URL: Funding, https://www.softwareheritage.org/donate -Project-URL: Source, https://forge.softwareheritage.org/source/swh-web Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest +Project-URL: Source, https://forge.softwareheritage.org/source/swh-web Description: # swh-web This repository holds the development of Software Heritage web applications: * swh-web API (https://archive.softwareheritage.org/api): enables to query the content of the archive through HTTP requests and get responses in JSON or YAML. * swh-web browse (https://archive.softwareheritage.org/browse): graphical interface that eases the navigation in the archive. Documentation about how to use these components but also the details of their URI schemes can be found in the docs folder. The produced HTML documentation can be read and browsed at https://docs.softwareheritage.org/devel/swh-web/index.html. ## Technical details Those applications are powered by: * [Django Web Framework](https://www.djangoproject.com/) on the backend side with the following extensions enabled: * [django-rest-framework](http://www.django-rest-framework.org/) * [django-webpack-loader](https://github.com/owais/django-webpack-loader) * [django-js-reverse](http://django-js-reverse.readthedocs.io/en/latest/) * [webpack](https://webpack.js.org/) on the frontend side for better static assets management, including: * assets dependencies management and retrieval through [npm](https://www.npmjs.com/) * linting of custom javascript code (through [eslint](https://eslint.org/)) and stylesheets (through [stylelint](https://stylelint.io/)) * use of [es6](http://es6-features.org) syntax and advanced javascript feature like [async/await](https://javascript.info/async-await) or [fetch](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) thanks to [babel](https://babeljs.io/) (es6 to es5 transpiler and polyfills provider) * assets minification (using [UglifyJS](https://github.com/mishoo/UglifyJS2) and [cssnano](http://cssnano.co/)) but also dead code elimination for production use ## How to build and run ### Requirements First you will need [Python 3](https://www.python.org) and a complete [swh development environment](https://forge.softwareheritage.org/source/swh-environment/) installed. 
To run the backend, you need to have the following Python 3 modules installed: * beautifulsoup4 * django >= 1.10.7 * djangorestframework >= 3.4.0 * django_webpack_loader * django_js_reverse * docutils * file_magic >= 0.3.0 * htmlmin * lxml * pygments * pypandoc * python-dateutil * pyyaml * requests To compile the frontend assets, you need to have [nodejs](https://nodejs.org/en/) >= 8.x (preferably version 8.x LTS) and [npm](https://www.npmjs.com/) (or [yarn](https://yarnpkg.com/en/)) installed. If you are on Debian stretch, you can easily install an up to date nodejs/npm from the stretch-backports repository or by following the instructions located at https://github.com/nodesource/distributions. Once you have installed nodejs, issue the following command in the root directory of swh-web in order to retrieve all the frontend dependencies: ``` $ npm install ``` or if you prefer to use yarn (faster than npm): ``` $ yarn install ``` Please note that the static assets bundles generated by webpack are not stored in the git repository. Follow the instructions below in order to generate them in order to be able to run the frontend part of the web applications. ### Make targets Below is the list of available make targets that can be executed from the root directory of swh-web in order to build and/or execute the web applications under various configurations: * **run-django-webpack-devserver**: Compile and serve not optimized (without mignification and dead code elimination) frontend static assets using [webpack-dev-server](https://github.com/webpack/webpack-dev-server) and run django server with development settings. This is the recommended target to use when developing swh-web as it enables automatic reloading of backend and frontend part of the applications when modifying source files (*.py, *.js, *.css, *.html). * **run-django-webpack-dev**: Compile not optimized (no minification, no dead code elimination) frontend static assets using webpack and run django server with development settings. This is the recommended target when one only wants to develop the backend side of the application. * **run-django-webpack-prod**: Compile optimized (with minification and dead code elimination) frontend static assets using webpack and run django server with production settings. This is useful to test the applications in production mode (with the difference that static assets are served by django). Production settings notably enable advanced django caching and you will need to have [memcached](https://memcached.org/) installed for that feature to work. * **run-django-server-dev**: Run the django server with development settings but without compiling frontend static assets through webpack. * **run-django-server-prod**: Run the django server with production settings but without compiling frontend static assets through webpack. * **run-gunicorn-server**: Run the web applications with production settings in a [gunicorn](http://gunicorn.org/) worker as they will be in real production environment. Once one of these targets executed, the web applications can be executed by pointing your browser to http://localhost:5004. ### Npm/Yarn targets Below is a list of available npm/yarn targets in order to only execute the frontend static assets compilation (no web server will be executed): * **build-dev**: compile not optimized (without mignification and dead code elimination) frontend static assets and store the results in the `swh/web/static` folder. 
* **build**: compile optimized (with mignification and dead code elimination) frontend static assets and store the results in the `swh/web/static` folder. **The build target must be executed prior performing the Debian packaging of swh-web** in order for the package to contain the optimized assets dedicated to production environment. To execute these targets, issue the following commmand: ``` $ npm run ``` or if you prefer using yarn: ``` $ yarn ``` Platform: UNKNOWN Classifier: Programming Language :: Python :: 3 Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+) Classifier: Operating System :: OS Independent Classifier: Development Status :: 5 - Production/Stable Classifier: Framework :: Django Description-Content-Type: text/markdown Provides-Extra: testing diff --git a/swh.web.egg-info/SOURCES.txt b/swh.web.egg-info/SOURCES.txt index 42a489ab..c9425738 100644 --- a/swh.web.egg-info/SOURCES.txt +++ b/swh.web.egg-info/SOURCES.txt @@ -1,367 +1,369 @@ MANIFEST.in Makefile README.md package-lock.json package.json pytest.ini requirements-swh.txt requirements-test.txt requirements.txt setup.py tox.ini version.txt yarn.lock swh/__init__.py swh.web.egg-info/PKG-INFO swh.web.egg-info/SOURCES.txt swh.web.egg-info/dependency_links.txt swh.web.egg-info/requires.txt swh.web.egg-info/top_level.txt swh/web/__init__.py swh/web/config.py swh/web/doc_config.py swh/web/manage.py swh/web/urls.py swh/web/wsgi.py swh/web/admin/__init__.py swh/web/admin/adminurls.py swh/web/admin/deposit.py swh/web/admin/origin_save.py swh/web/admin/urls.py swh/web/api/__init__.py swh/web/api/apidoc.py swh/web/api/apiresponse.py swh/web/api/apiurls.py swh/web/api/renderers.py swh/web/api/urls.py swh/web/api/utils.py swh/web/api/views/__init__.py swh/web/api/views/content.py swh/web/api/views/directory.py swh/web/api/views/identifiers.py swh/web/api/views/origin.py swh/web/api/views/origin_save.py swh/web/api/views/person.py swh/web/api/views/release.py swh/web/api/views/revision.py swh/web/api/views/snapshot.py swh/web/api/views/stat.py swh/web/api/views/utils.py swh/web/api/views/vault.py swh/web/assets/config/.bootstraprc swh/web/assets/config/.eslintrc swh/web/assets/config/bootstrap-pre-customize.scss swh/web/assets/config/webpack.config.development.js swh/web/assets/config/webpack.config.production.js swh/web/assets/config/webpack-plugins/remove-source-map-url-webpack-plugin.js swh/web/assets/src/bundles/admin/deposit.js swh/web/assets/src/bundles/admin/index.js swh/web/assets/src/bundles/admin/origin-save.js swh/web/assets/src/bundles/browse/breadcrumbs.css swh/web/assets/src/bundles/browse/browse-utils.js swh/web/assets/src/bundles/browse/browse.css swh/web/assets/src/bundles/browse/content.css swh/web/assets/src/bundles/browse/index.js swh/web/assets/src/bundles/browse/origin-save.js swh/web/assets/src/bundles/browse/origin-search.js swh/web/assets/src/bundles/browse/snapshot-navigation.css swh/web/assets/src/bundles/browse/snapshot-navigation.js swh/web/assets/src/bundles/browse/swh-ids-utils.js swh/web/assets/src/bundles/origin/index.js swh/web/assets/src/bundles/origin/visits-calendar.js swh/web/assets/src/bundles/origin/visits-histogram.js swh/web/assets/src/bundles/origin/visits-reporting.css swh/web/assets/src/bundles/origin/visits-reporting.js swh/web/assets/src/bundles/revision/diff-utils.js swh/web/assets/src/bundles/revision/index.js swh/web/assets/src/bundles/revision/log-utils.js 
swh/web/assets/src/bundles/revision/revision.css swh/web/assets/src/bundles/vault/index.js swh/web/assets/src/bundles/vault/vault-create-tasks.js swh/web/assets/src/bundles/vault/vault-ui.js swh/web/assets/src/bundles/vault/vault.css swh/web/assets/src/bundles/vendors/datatables.css swh/web/assets/src/bundles/vendors/index.js swh/web/assets/src/bundles/webapp/breadcrumbs.css swh/web/assets/src/bundles/webapp/code-highlighting.js swh/web/assets/src/bundles/webapp/index.js swh/web/assets/src/bundles/webapp/pdf-rendering.js swh/web/assets/src/bundles/webapp/readme-rendering.js swh/web/assets/src/bundles/webapp/webapp-utils.js swh/web/assets/src/bundles/webapp/webapp.css swh/web/assets/src/utils/functions.js swh/web/assets/src/utils/heaps-permute.js swh/web/assets/src/utils/highlightjs.css swh/web/assets/src/utils/highlightjs.js swh/web/assets/src/utils/jquery.tabSlideOut.css swh/web/assets/src/utils/jquery.tabSlideOut.js swh/web/assets/src/utils/org.css swh/web/assets/src/utils/org.js swh/web/assets/src/utils/showdown.css swh/web/assets/src/utils/showdown.js swh/web/browse/__init__.py swh/web/browse/browseurls.py swh/web/browse/identifiers.py swh/web/browse/urls.py swh/web/browse/utils.py swh/web/browse/views/__init__.py swh/web/browse/views/content.py swh/web/browse/views/directory.py swh/web/browse/views/origin.py swh/web/browse/views/origin_save.py swh/web/browse/views/person.py swh/web/browse/views/release.py swh/web/browse/views/revision.py swh/web/browse/views/snapshot.py swh/web/browse/views/utils/__init__.py swh/web/browse/views/utils/snapshot_context.py swh/web/common/__init__.py swh/web/common/apps.py swh/web/common/converters.py swh/web/common/exc.py swh/web/common/highlightjs.py swh/web/common/middlewares.py swh/web/common/models.py swh/web/common/origin_save.py swh/web/common/origin_visits.py swh/web/common/query.py swh/web/common/service.py swh/web/common/swh_templatetags.py swh/web/common/throttling.py swh/web/common/urlsindex.py swh/web/common/utils.py swh/web/common/migrations/0001_initial.py swh/web/common/migrations/0002_saveoriginrequest_visit_date.py swh/web/common/migrations/0003_saveoriginrequest_loading_task_status.py +swh/web/common/migrations/0004_auto_20190204_1324.py swh/web/common/migrations/__init__.py swh/web/misc/__init__.py swh/web/misc/coverage.py swh/web/settings/__init__.py swh/web/settings/common.py swh/web/settings/development.py swh/web/settings/production.py swh/web/settings/tests.py swh/web/static/robots.txt swh/web/static/webpack-stats.json swh/web/static/css/browse.9ba7883c3d18be681809.css swh/web/static/css/browse.9ba7883c3d18be681809.css.map swh/web/static/css/highlightjs.a2e4fed4ce9d03b2da89.css swh/web/static/css/highlightjs.a2e4fed4ce9d03b2da89.css.map swh/web/static/css/org.c3e4dd76523902db7b0c.css swh/web/static/css/org.c3e4dd76523902db7b0c.css.map swh/web/static/css/origin.e7684418b774e1097763.css swh/web/static/css/origin.e7684418b774e1097763.css.map swh/web/static/css/revision.03921ca25d2089f04fc8.css swh/web/static/css/revision.03921ca25d2089f04fc8.css.map swh/web/static/css/showdown.f808751e1dc358e76139.css swh/web/static/css/showdown.f808751e1dc358e76139.css.map swh/web/static/css/vault.5532c6dc35752f9278c4.css swh/web/static/css/vault.5532c6dc35752f9278c4.css.map swh/web/static/css/vendors.ff029d7ebf0bd7c01ea3.css swh/web/static/css/vendors.ff029d7ebf0bd7c01ea3.css.map swh/web/static/css/webapp.79871acb3092080bb5ff.css swh/web/static/css/webapp.79871acb3092080bb5ff.css.map swh/web/static/fonts/alegreya-latin-400.woff 
swh/web/static/fonts/alegreya-latin-400.woff2 swh/web/static/fonts/alegreya-latin-400italic.woff swh/web/static/fonts/alegreya-latin-400italic.woff2 swh/web/static/fonts/alegreya-latin-500.woff swh/web/static/fonts/alegreya-latin-500.woff2 swh/web/static/fonts/alegreya-latin-500italic.woff swh/web/static/fonts/alegreya-latin-500italic.woff2 swh/web/static/fonts/alegreya-latin-700.woff swh/web/static/fonts/alegreya-latin-700.woff2 swh/web/static/fonts/alegreya-latin-700italic.woff swh/web/static/fonts/alegreya-latin-700italic.woff2 swh/web/static/fonts/alegreya-latin-800.woff swh/web/static/fonts/alegreya-latin-800.woff2 swh/web/static/fonts/alegreya-latin-800italic.woff swh/web/static/fonts/alegreya-latin-800italic.woff2 swh/web/static/fonts/alegreya-latin-900.woff swh/web/static/fonts/alegreya-latin-900.woff2 swh/web/static/fonts/alegreya-latin-900italic.woff swh/web/static/fonts/alegreya-latin-900italic.woff2 swh/web/static/fonts/alegreya-sans-latin-100.woff swh/web/static/fonts/alegreya-sans-latin-100.woff2 swh/web/static/fonts/alegreya-sans-latin-100italic.woff swh/web/static/fonts/alegreya-sans-latin-100italic.woff2 swh/web/static/fonts/alegreya-sans-latin-300.woff swh/web/static/fonts/alegreya-sans-latin-300.woff2 swh/web/static/fonts/alegreya-sans-latin-300italic.woff swh/web/static/fonts/alegreya-sans-latin-300italic.woff2 swh/web/static/fonts/alegreya-sans-latin-400.woff swh/web/static/fonts/alegreya-sans-latin-400.woff2 swh/web/static/fonts/alegreya-sans-latin-400italic.woff swh/web/static/fonts/alegreya-sans-latin-400italic.woff2 swh/web/static/fonts/alegreya-sans-latin-500.woff swh/web/static/fonts/alegreya-sans-latin-500.woff2 swh/web/static/fonts/alegreya-sans-latin-500italic.woff swh/web/static/fonts/alegreya-sans-latin-500italic.woff2 swh/web/static/fonts/alegreya-sans-latin-700.woff swh/web/static/fonts/alegreya-sans-latin-700.woff2 swh/web/static/fonts/alegreya-sans-latin-700italic.woff swh/web/static/fonts/alegreya-sans-latin-700italic.woff2 swh/web/static/fonts/alegreya-sans-latin-800.woff swh/web/static/fonts/alegreya-sans-latin-800.woff2 swh/web/static/fonts/alegreya-sans-latin-800italic.woff swh/web/static/fonts/alegreya-sans-latin-800italic.woff2 swh/web/static/fonts/alegreya-sans-latin-900.woff swh/web/static/fonts/alegreya-sans-latin-900.woff2 swh/web/static/fonts/alegreya-sans-latin-900italic.woff swh/web/static/fonts/alegreya-sans-latin-900italic.woff2 swh/web/static/fonts/fontawesome-webfont.eot swh/web/static/fonts/fontawesome-webfont.svg swh/web/static/fonts/fontawesome-webfont.ttf swh/web/static/fonts/fontawesome-webfont.woff swh/web/static/fonts/fontawesome-webfont.woff2 swh/web/static/fonts/octicons.eot swh/web/static/fonts/octicons.svg swh/web/static/fonts/octicons.ttf swh/web/static/fonts/octicons.woff swh/web/static/fonts/octicons.woff2 swh/web/static/fonts/open-iconic.eot swh/web/static/fonts/open-iconic.otf swh/web/static/fonts/open-iconic.svg swh/web/static/fonts/open-iconic.ttf swh/web/static/fonts/open-iconic.woff swh/web/static/img/arrow-up-small.png swh/web/static/img/swh-api.png swh/web/static/img/swh-browse.png swh/web/static/img/swh-logo.png swh/web/static/img/swh-logo.svg swh/web/static/img/swh-spinner-small.gif swh/web/static/img/swh-spinner.gif swh/web/static/img/swh-vault.png swh/web/static/img/icons/swh-logo-32x32.png swh/web/static/img/icons/swh-logo-archive-180x180.png swh/web/static/img/icons/swh-logo-archive-192x192.png swh/web/static/img/icons/swh-logo-archive-270x270.png swh/web/static/img/logos/debian.png 
+swh/web/static/img/logos/framagit.png swh/web/static/img/logos/github.png swh/web/static/img/logos/gitlab.svg swh/web/static/img/logos/gitorious.png swh/web/static/img/logos/gnu.png swh/web/static/img/logos/googlecode.png swh/web/static/img/logos/hal.png swh/web/static/img/logos/inria.jpg swh/web/static/img/logos/pypi.svg swh/web/static/js/admin.ac5a04913bf38982529f.js swh/web/static/js/admin.ac5a04913bf38982529f.js.map swh/web/static/js/browse.9ba7883c3d18be681809.js swh/web/static/js/browse.9ba7883c3d18be681809.js.map swh/web/static/js/highlightjs.a2e4fed4ce9d03b2da89.js swh/web/static/js/highlightjs.a2e4fed4ce9d03b2da89.js.map swh/web/static/js/org.c3e4dd76523902db7b0c.js swh/web/static/js/org.c3e4dd76523902db7b0c.js.map swh/web/static/js/origin.e7684418b774e1097763.js swh/web/static/js/origin.e7684418b774e1097763.js.map swh/web/static/js/pdf.worker.min.js swh/web/static/js/pdfjs.bc91edd29009e6c3f86c.js swh/web/static/js/pdfjs.bc91edd29009e6c3f86c.js.map swh/web/static/js/revision.03921ca25d2089f04fc8.js swh/web/static/js/revision.03921ca25d2089f04fc8.js.map swh/web/static/js/showdown.f808751e1dc358e76139.js swh/web/static/js/showdown.f808751e1dc358e76139.js.map swh/web/static/js/vault.5532c6dc35752f9278c4.js swh/web/static/js/vault.5532c6dc35752f9278c4.js.map swh/web/static/js/vendors.ff029d7ebf0bd7c01ea3.js swh/web/static/js/vendors.ff029d7ebf0bd7c01ea3.js.map swh/web/static/js/webapp.79871acb3092080bb5ff.js swh/web/static/js/webapp.79871acb3092080bb5ff.js.map swh/web/templates/coverage.html swh/web/templates/error.html swh/web/templates/homepage.html swh/web/templates/layout.html swh/web/templates/login.html swh/web/templates/logout.html swh/web/templates/admin/deposit.html swh/web/templates/admin/origin-save.html swh/web/templates/api/api.html swh/web/templates/api/apidoc.html swh/web/templates/api/endpoints.html swh/web/templates/browse/branches.html swh/web/templates/browse/browse.html swh/web/templates/browse/content.html swh/web/templates/browse/directory.html swh/web/templates/browse/help.html swh/web/templates/browse/layout.html swh/web/templates/browse/origin-save.html swh/web/templates/browse/origin-visits.html swh/web/templates/browse/person.html swh/web/templates/browse/release.html swh/web/templates/browse/releases.html swh/web/templates/browse/revision-log.html swh/web/templates/browse/revision.html swh/web/templates/browse/search.html swh/web/templates/browse/vault-ui.html swh/web/templates/includes/apidoc-header.html swh/web/templates/includes/apidoc-header.md swh/web/templates/includes/breadcrumbs.html swh/web/templates/includes/content-display.html swh/web/templates/includes/directory-display.html swh/web/templates/includes/empty-snapshot.html swh/web/templates/includes/global-modals.html swh/web/templates/includes/http-error.html swh/web/templates/includes/readme-display.html swh/web/templates/includes/show-metadata.html swh/web/templates/includes/show-swh-ids.html swh/web/templates/includes/snapshot-context.html swh/web/templates/includes/take-new-snapshot.html swh/web/templates/includes/top-navigation.html swh/web/templates/includes/vault-create-tasks.html swh/web/tests/__init__.py swh/web/tests/conftest.py swh/web/tests/data.py swh/web/tests/strategies.py swh/web/tests/testcase.py swh/web/tests/admin/__init__.py swh/web/tests/admin/test_origin_save.py swh/web/tests/api/__init__.py swh/web/tests/api/test_api_lookup.py swh/web/tests/api/test_apidoc.py swh/web/tests/api/test_apiresponse.py swh/web/tests/api/test_utils.py swh/web/tests/api/views/__init__.py 
swh/web/tests/api/views/test_content.py swh/web/tests/api/views/test_directory.py swh/web/tests/api/views/test_identifiers.py swh/web/tests/api/views/test_origin.py swh/web/tests/api/views/test_origin_save.py swh/web/tests/api/views/test_person.py swh/web/tests/api/views/test_release.py swh/web/tests/api/views/test_revision.py swh/web/tests/api/views/test_snapshot.py swh/web/tests/api/views/test_stat.py swh/web/tests/api/views/test_vault.py swh/web/tests/browse/__init__.py swh/web/tests/browse/test_utils.py swh/web/tests/browse/views/__init__.py swh/web/tests/browse/views/test_content.py swh/web/tests/browse/views/test_directory.py swh/web/tests/browse/views/test_identifiers.py swh/web/tests/browse/views/test_origin.py swh/web/tests/browse/views/test_origin_save.py swh/web/tests/browse/views/test_person.py swh/web/tests/browse/views/test_release.py swh/web/tests/browse/views/test_revision.py swh/web/tests/browse/views/data/__init__.py swh/web/tests/browse/views/data/content_test_data.py swh/web/tests/browse/views/data/directory_test_data.py swh/web/tests/browse/views/data/iso-8859-1_encoded_content swh/web/tests/browse/views/data/origin_test_data.py swh/web/tests/browse/views/data/release_test_data.py swh/web/tests/browse/views/data/revision_test_data.py swh/web/tests/browse/views/data/swh-logo.png swh/web/tests/common/__init__.py swh/web/tests/common/test_converters.py swh/web/tests/common/test_highlightjs.py swh/web/tests/common/test_origin_visits.py swh/web/tests/common/test_query.py swh/web/tests/common/test_service.py swh/web/tests/common/test_templatetags.py swh/web/tests/common/test_throttling.py swh/web/tests/common/test_utils.py swh/web/tests/resources/repos/highlightjs-line-numbers.js.zip swh/web/tests/resources/repos/highlightjs-line-numbers.js_visit2.zip swh/web/tests/resources/repos/libtess2.zip swh/web/tests/resources/repos/repo_with_submodules.tgz \ No newline at end of file diff --git a/swh/web/api/views/origin.py b/swh/web/api/views/origin.py index 7885a74d..a689fc2a 100644 --- a/swh/web/api/views/origin.py +++ b/swh/web/api/views/origin.py @@ -1,436 +1,436 @@ # Copyright (C) 2015-2018 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information from distutils.util import strtobool from swh.web.common import service from swh.web.common.exc import BadInputExc from swh.web.common.origin_visits import get_origin_visits from swh.web.common.utils import reverse from swh.web.api.apidoc import api_doc from swh.web.api.apiurls import api_route from swh.web.api.views.utils import api_lookup def _enrich_origin(origin): if 'id' in origin: o = origin.copy() o['origin_visits_url'] = \ reverse('api-origin-visits', url_args={'origin_id': origin['id']}) return o return origin @api_route(r'/origins/', 'api-origins') @api_doc('/origins/', noargs=True) def api_origins(request): """ .. http:get:: /api/1/origins/ Get list of archived software origins. Origins are sorted by ids before returning them. 
- :query int origin_from: The minimum id of the origins to return - (default to 1) + :query int origin_from: The first origin id that will be included + in returned results (default to 1) :query int origin_count: The maximum number of origins to return (default to 100, can not exceed 10000) :>jsonarr number id: the origin unique identifier :>jsonarr string origin_visits_url: link to in order to get information about the visits for that origin :>jsonarr string type: the type of software origin (possible values are ``git``, ``svn``, ``hg``, ``deb``, ``pypi``, ``ftp`` or ``deposit``) :>jsonarr string url: the origin canonical url :reqheader Accept: the requested response content type, either ``application/json`` (default) or ``application/yaml`` :resheader Content-Type: this depends on :http:header:`Accept` header of request :resheader Link: indicates that a subsequent or previous result page are available and contains the urls pointing to them **Allowed HTTP Methods:** :http:method:`get`, :http:method:`head`, :http:method:`options` :statuscode 200: no error **Example:** .. parsed-literal:: :swh_web_api:`origins?origin_from=50000&origin_count=500` """ # noqa origin_from = int(request.query_params.get('origin_from', '1')) origin_count = int(request.query_params.get('origin_count', '100')) origin_count = min(origin_count, 10000) results = api_lookup( service.lookup_origins, origin_from, origin_count+1, enrich_fn=_enrich_origin) response = {'results': results, 'headers': {}} if len(results) > origin_count: origin_from = results.pop()['id'] response['headers']['link-next'] = reverse( 'api-origins', query_params={'origin_from': origin_from, 'origin_count': origin_count}) return response @api_route(r'/origin/(?P[0-9]+)/', 'api-origin') @api_route(r'/origin/(?P[a-z]+)/url/(?P.+)/', 'api-origin') @api_doc('/origin/') def api_origin(request, origin_id=None, origin_type=None, origin_url=None): """ .. http:get:: /api/1/origin/(origin_id)/ Get information about a software origin. :param int origin_id: a software origin identifier :>json number id: the origin unique identifier :>json string origin_visits_url: link to in order to get information about the visits for that origin :>json string type: the type of software origin (possible values are ``git``, ``svn``, ``hg``, ``deb``, ``pypi``, ``ftp`` or ``deposit``) :>json string url: the origin canonical url :reqheader Accept: the requested response content type, either ``application/json`` (default) or ``application/yaml`` :resheader Content-Type: this depends on :http:header:`Accept` header of request **Allowed HTTP Methods:** :http:method:`get`, :http:method:`head`, :http:method:`options` :statuscode 200: no error :statuscode 404: requested origin can not be found in the archive **Example:** .. parsed-literal:: :swh_web_api:`origin/1/` .. http:get:: /api/1/origin/(origin_type)/url/(origin_url)/ Get information about a software origin. 
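A brief aside on the listing and lookup endpoints whose implementations appear just above: the snippet below is a hypothetical client-side sketch, not part of this diff, showing one way to consume the documented pagination. It assumes the public archive URL, the `requests` library (already listed in the requirements above), and that the `Link` response header advertises the next page with `rel="next"`, which `requests` exposes through `response.links`.
```
# Hypothetical sketch (not in this diff): paging through /api/1/origins/
# using origin_from / origin_count and the Link response header.
import requests

API = 'https://archive.softwareheritage.org/api/1'

url = API + '/origins/'
params = {'origin_from': 1, 'origin_count': 100}
for _ in range(3):  # fetch only the first three pages
    response = requests.get(url, params=params)
    response.raise_for_status()
    for origin in response.json():
        print(origin['id'], origin['url'])
    next_page = response.links.get('next', {}).get('url')
    if not next_page:
        break
    url, params = next_page, None  # the next url already embeds the query

# Single-origin lookup, as documented for /api/1/origin/(origin_id)/
print(requests.get(API + '/origin/1/').json())
```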
:param string origin_type: the origin type (possible values are ``git``, ``svn``, ``hg``, ``deb``, ``pypi``, ``ftp`` or ``deposit``) :param string origin_url: the origin url :>json number id: the origin unique identifier :>json string origin_visits_url: link to in order to get information about the visits for that origin :>json string type: the type of software origin :>json string url: the origin canonical url :reqheader Accept: the requested response content type, either ``application/json`` (default) or ``application/yaml`` :resheader Content-Type: this depends on :http:header:`Accept` header of request **Allowed HTTP Methods:** :http:method:`get`, :http:method:`head`, :http:method:`options` :statuscode 200: no error :statuscode 404: requested origin can not be found in the archive **Example:** .. parsed-literal:: :swh_web_api:`origin/git/url/https://github.com/python/cpython/` """ # noqa ori_dict = { 'id': int(origin_id) if origin_id else None, 'type': origin_type, 'url': origin_url } ori_dict = {k: v for k, v in ori_dict.items() if ori_dict[k]} if 'id' in ori_dict: error_msg = 'Origin with id %s not found.' % ori_dict['id'] else: error_msg = 'Origin with type %s and URL %s not found' % ( ori_dict['type'], ori_dict['url']) return api_lookup( service.lookup_origin, ori_dict, notfound_msg=error_msg, enrich_fn=_enrich_origin) @api_route(r'/origin/search/(?P.+)/', 'api-origin-search') @api_doc('/origin/search/') def api_origin_search(request, url_pattern): """ .. http:get:: /api/1/origin/search/(url_pattern)/ Search for software origins whose urls contain a provided string pattern or match a provided regular expression. The search is performed in a case insensitive way. :param string url_pattern: a string pattern or a regular expression :query int offset: the number of found origins to skip before returning results :query int limit: the maximum number of found origins to return :query boolean regexp: if true, consider provided pattern as a regular expression and search origins whose urls match it :query boolean with_visit: if true, only return origins with at least one visit by Software heritage :>jsonarr number id: the origin unique identifier :>jsonarr string origin_visits_url: link to in order to get information about the visits for that origin :>jsonarr string type: the type of software origin :>jsonarr string url: the origin canonical url :reqheader Accept: the requested response content type, either ``application/json`` (default) or ``application/yaml`` :resheader Content-Type: this depends on :http:header:`Accept` header of request **Allowed HTTP Methods:** :http:method:`get`, :http:method:`head`, :http:method:`options` :statuscode 200: no error **Example:** .. 
parsed-literal:: :swh_web_api:`origin/search/python/?limit=2` """ # noqa result = {} offset = int(request.query_params.get('offset', '0')) limit = int(request.query_params.get('limit', '70')) regexp = request.query_params.get('regexp', 'false') with_visit = request.query_params.get('with_visit', 'false') results = api_lookup(service.search_origin, url_pattern, offset, limit, bool(strtobool(regexp)), bool(strtobool(with_visit)), enrich_fn=_enrich_origin) nb_results = len(results) if nb_results == limit: query_params = {} query_params['offset'] = offset + limit query_params['limit'] = limit query_params['regexp'] = regexp result['headers'] = { 'link-next': reverse('api-origin-search', url_args={'url_pattern': url_pattern}, query_params=query_params) } result.update({ 'results': results }) return result @api_route(r'/origin/metadata-search/', 'api-origin-metadata-search') @api_doc('/origin/metadata-search/', noargs=True) def api_origin_metadata_search(request): """ .. http:get:: /api/1/origin/metadata-search/ Search for software origins whose metadata (expressed as a JSON-LD/CodeMeta dictionary) match the provided criteria. For now, only full-text search on this dictionary is supported. :query str fulltext: a string that will be matched against origin metadata; results are ranked and ordered starting with the best ones. :query int limit: the maximum number of found origins to return (bounded to 100) :>jsonarr number origin_id: the origin unique identifier :>jsonarr dict metadata: metadata of the origin (as a JSON-LD/CodeMeta dictionary) :>jsonarr string from_revision: the revision used to extract these metadata (the current HEAD or one of the former HEADs) :>jsonarr dict tool: the tool used to extract these metadata :reqheader Accept: the requested response content type, either ``application/json`` (default) or ``application/yaml`` :resheader Content-Type: this depends on :http:header:`Accept` header of request **Allowed HTTP Methods:** :http:method:`get`, :http:method:`head`, :http:method:`options` :statuscode 200: no error **Example:** .. parsed-literal:: :swh_web_api:`origin/metadata-search/?limit=2&fulltext=Jane%20Doe` """ # noqa fulltext = request.query_params.get('fulltext', None) limit = min(int(request.query_params.get('limit', '70')), 100) if not fulltext: content = '"fulltext" must be provided and non-empty.' raise BadInputExc(content) results = api_lookup(service.search_origin_metadata, fulltext, limit) return { 'results': results, } @api_route(r'/origin/(?P[0-9]+)/visits/', 'api-origin-visits') @api_doc('/origin/visits/') def api_origin_visits(request, origin_id): """ .. http:get:: /api/1/origin/(origin_id)/visits/ Get information about all visits of a software origin. Visits are returned sorted in descending order according to their date. 
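Stepping back to the search endpoint implemented above, before the visits documentation continues: a hypothetical usage sketch, again assuming the public instance and the `requests` library. The boolean query parameters are parsed server-side with `strtobool`, so plain "true"/"false" strings are expected; the regular-expression pattern is only an illustration.
```
# Hypothetical sketch (not in this diff) of calling /api/1/origin/search/
import requests

API = 'https://archive.softwareheritage.org/api/1'

resp = requests.get(API + '/origin/search/python/',
                    params={'limit': 2, 'with_visit': 'true'})
for origin in resp.json():
    print(origin['type'], origin['url'])

# Regular-expression search, matching the `regexp` flag handled above.
resp = requests.get(API + '/origin/search/kernel.*git/',
                    params={'regexp': 'true', 'limit': 5})
print(len(resp.json()))
```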
:param int origin_id: a software origin identifier :query int per_page: specify the number of visits to list, for pagination purposes :query int last_visit: visit to start listing from, for pagination purposes :reqheader Accept: the requested response content type, either ``application/json`` (default) or ``application/yaml`` :resheader Content-Type: this depends on :http:header:`Accept` header of request :resheader Link: indicates that a subsequent result page is available and contains the url pointing to it :>jsonarr string date: ISO representation of the visit date (in UTC) :>jsonarr number id: the unique identifier of the origin :>jsonarr string origin_visit_url: link to :http:get:`/api/1/origin/(origin_id)/visit/(visit_id)/` in order to get information about the visit :>jsonarr string snapshot: the snapshot identifier of the visit :>jsonarr string snapshot_url: link to :http:get:`/api/1/snapshot/(snapshot_id)/` in order to get information about the snapshot of the visit :>jsonarr string status: status of the visit (either **full**, **partial** or **ongoing**) :>jsonarr number visit: the unique identifier of the visit **Allowed HTTP Methods:** :http:method:`get`, :http:method:`head`, :http:method:`options` :statuscode 200: no error :statuscode 404: requested origin can not be found in the archive **Example:** .. parsed-literal:: :swh_web_api:`origin/1/visits/` """ # noqa result = {} origin_id = int(origin_id) per_page = int(request.query_params.get('per_page', '10')) last_visit = request.query_params.get('last_visit') if last_visit: last_visit = int(last_visit) def _lookup_origin_visits( origin_id, last_visit=last_visit, per_page=per_page): all_visits = get_origin_visits({'id': origin_id}) all_visits.reverse() visits = [] if not last_visit: visits = all_visits[:per_page] else: for i, v in enumerate(all_visits): if v['visit'] == last_visit: visits = all_visits[i+1:i+1+per_page] break for v in visits: yield v def _enrich_origin_visit(origin_visit): ov = origin_visit.copy() ov['origin_visit_url'] = reverse('api-origin-visit', url_args={'origin_id': origin_id, 'visit_id': ov['visit']}) snapshot = ov['snapshot'] if snapshot: ov['snapshot_url'] = reverse('api-snapshot', url_args={'snapshot_id': snapshot}) else: ov['snapshot_url'] = None return ov results = api_lookup(_lookup_origin_visits, origin_id, notfound_msg='No origin {} found'.format(origin_id), enrich_fn=_enrich_origin_visit) if results: nb_results = len(results) if nb_results == per_page: new_last_visit = results[-1]['visit'] query_params = {} query_params['last_visit'] = new_last_visit if request.query_params.get('per_page'): query_params['per_page'] = per_page result['headers'] = { 'link-next': reverse('api-origin-visits', url_args={'origin_id': origin_id}, query_params=query_params) } result.update({ 'results': results }) return result @api_route(r'/origin/(?P[0-9]+)/visit/(?P[0-9]+)/', 'api-origin-visit') @api_doc('/origin/visit/') def api_origin_visit(request, origin_id, visit_id): """ .. http:get:: /api/1/origin/(origin_id)/visit/(visit_id)/ Get information about a specific visit of a software origin. 
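An aside on the pagination helper implemented above: the self-contained toy run below (not part of the diff) replays the same slicing logic as `_lookup_origin_visits`, which lists visits most recent first and starts the page right after the visit id passed as `last_visit`.
```
# Toy illustration of the windowing done by _lookup_origin_visits.
all_visits = [{'visit': v} for v in range(1, 11)]   # visits 1..10
all_visits.reverse()                                # most recent first: 10..1
per_page, last_visit = 3, 7

visits = []
if not last_visit:
    visits = all_visits[:per_page]
else:
    for i, v in enumerate(all_visits):
        if v['visit'] == last_visit:
            visits = all_visits[i + 1:i + 1 + per_page]
            break

print([v['visit'] for v in visits])  # [6, 5, 4]
```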
:param int origin_id: a software origin identifier :param int visit_id: a visit identifier :reqheader Accept: the requested response content type, either ``application/json`` (default) or ``application/yaml`` :resheader Content-Type: this depends on :http:header:`Accept` header of request :>json string date: ISO representation of the visit date (in UTC) :>json number origin: the origin unique identifier :>json string origin_url: link to get information about the origin :>jsonarr string snapshot: the snapshot identifier of the visit :>jsonarr string snapshot_url: link to :http:get:`/api/1/snapshot/(snapshot_id)/` in order to get information about the snapshot of the visit :>json string status: status of the visit (either **full**, **partial** or **ongoing**) :>json number visit: the unique identifier of the visit **Allowed HTTP Methods:** :http:method:`get`, :http:method:`head`, :http:method:`options` :statuscode 200: no error :statuscode 404: requested origin or visit can not be found in the archive **Example:** .. parsed-literal:: :swh_web_api:`origin/1500/visit/1/` """ # noqa def _enrich_origin_visit(origin_visit): ov = origin_visit.copy() ov['origin_url'] = reverse('api-origin', url_args={'origin_id': ov['origin']}) snapshot = ov['snapshot'] if snapshot: ov['snapshot_url'] = reverse('api-snapshot', url_args={'snapshot_id': snapshot}) else: ov['snapshot_url'] = None return ov return api_lookup( service.lookup_origin_visit, int(origin_id), int(visit_id), notfound_msg=('No visit {} for origin {} found' .format(visit_id, origin_id)), enrich_fn=_enrich_origin_visit) diff --git a/swh/web/common/middlewares.py b/swh/web/common/middlewares.py index 19cc8f9d..b2dc3e82 100644 --- a/swh/web/common/middlewares.py +++ b/swh/web/common/middlewares.py @@ -1,66 +1,65 @@ # Copyright (C) 2018 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information from bs4 import BeautifulSoup from htmlmin import minify class HtmlPrettifyMiddleware(object): """ Django middleware for prettifying generated HTML in development mode. """ def __init__(self, get_response): self.get_response = get_response def __call__(self, request): response = self.get_response(request) if 'text/html' in response.get('Content-Type', ''): response.content = \ BeautifulSoup(response.content, 'lxml').prettify() return response class HtmlMinifyMiddleware(object): """ Django middleware for minifying generated HTML in production mode. """ def __init__(self, get_response=None): self.get_response = get_response def __call__(self, request): response = self.get_response(request) if 'text/html' in response.get('Content-Type', ''): try: - minified_html = minify(response.content.decode('utf-8'), - convert_charrefs=False) + minified_html = minify(response.content.decode('utf-8')) response.content = minified_html.encode('utf-8') except Exception: pass return response class ThrottlingHeadersMiddleware(object): """ Django middleware for inserting rate limiting related headers in HTTP response. 
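Before the throttling middleware body continues below, a note on the `HtmlMinifyMiddleware` hunk above: the `convert_charrefs` keyword is simply dropped from the `minify()` call, and the surrounding try/except already swallows any minification failure. A hypothetical way to exercise the class in isolation is sketched here; it assumes a configured Django settings module (for instance the project's test settings) so that requests and responses can be built.
```
# Hypothetical sketch (not in this diff): driving HtmlMinifyMiddleware
# with a stubbed view acting as get_response.
from django.http import HttpResponse
from django.test import RequestFactory
from swh.web.common.middlewares import HtmlMinifyMiddleware

def fake_view(request):
    return HttpResponse('<html> <body>\n  <p> hello </p>\n </body> </html>',
                        content_type='text/html')

middleware = HtmlMinifyMiddleware(fake_view)
response = middleware(RequestFactory().get('/'))
print(response.content)  # whitespace-collapsed markup, re-encoded as UTF-8
```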
""" def __init__(self, get_response=None): self.get_response = get_response def __call__(self, request): resp = self.get_response(request) if 'RateLimit-Limit' in request.META: resp['X-RateLimit-Limit'] = request.META['RateLimit-Limit'] if 'RateLimit-Remaining' in request.META: resp['X-RateLimit-Remaining'] = request.META['RateLimit-Remaining'] if 'RateLimit-Reset' in request.META: resp['X-RateLimit-Reset'] = request.META['RateLimit-Reset'] return resp diff --git a/swh/web/common/migrations/0004_auto_20190204_1324.py b/swh/web/common/migrations/0004_auto_20190204_1324.py new file mode 100644 index 00000000..6be22fef --- /dev/null +++ b/swh/web/common/migrations/0004_auto_20190204_1324.py @@ -0,0 +1,25 @@ +# Copyright (C) 2019 The Software Heritage developers +# See the AUTHORS file at the top-level directory of this distribution +# License: GNU Affero General Public License version 3, or any later version +# See top-level LICENSE file for more information + +# flake8: noqa + +from __future__ import unicode_literals + +from django.db import migrations, models + + +class Migration(migrations.Migration): + + dependencies = [ + ('swh.web.common', '0003_saveoriginrequest_loading_task_status'), + ] + + operations = [ + migrations.AlterField( + model_name='saveoriginrequest', + name='loading_task_status', + field=models.TextField(choices=[('not created', 'not created'), ('not yet scheduled', 'not yet scheduled'), ('scheduled', 'scheduled'), ('succeed', 'succeed'), ('failed', 'failed'), ('running', 'running')], default='not created'), + ), + ] diff --git a/swh/web/common/models.py b/swh/web/common/models.py index 297c322b..18de0631 100644 --- a/swh/web/common/models.py +++ b/swh/web/common/models.py @@ -1,90 +1,92 @@ # Copyright (C) 2018 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information from django.db import models class SaveAuthorizedOrigin(models.Model): """ Model table holding origin urls authorized to be loaded into the archive. """ url = models.CharField(max_length=200, null=False) class Meta: app_label = 'swh.web.common' db_table = 'save_authorized_origin' def __str__(self): return self.url class SaveUnauthorizedOrigin(models.Model): """ Model table holding origin urls not authorized to be loaded into the archive. """ url = models.CharField(max_length=200, null=False) class Meta: app_label = 'swh.web.common' db_table = 'save_unauthorized_origin' def __str__(self): return self.url SAVE_REQUEST_ACCEPTED = 'accepted' SAVE_REQUEST_REJECTED = 'rejected' SAVE_REQUEST_PENDING = 'pending' SAVE_REQUEST_STATUS = [ (SAVE_REQUEST_ACCEPTED, SAVE_REQUEST_ACCEPTED), (SAVE_REQUEST_REJECTED, SAVE_REQUEST_REJECTED), (SAVE_REQUEST_PENDING, SAVE_REQUEST_PENDING) ] SAVE_TASK_NOT_CREATED = 'not created' SAVE_TASK_NOT_YET_SCHEDULED = 'not yet scheduled' SAVE_TASK_SCHEDULED = 'scheduled' SAVE_TASK_SUCCEED = 'succeed' SAVE_TASK_FAILED = 'failed' +SAVE_TASK_RUNNING = 'running' SAVE_TASK_STATUS = [ (SAVE_TASK_NOT_CREATED, SAVE_TASK_NOT_CREATED), (SAVE_TASK_NOT_YET_SCHEDULED, SAVE_TASK_NOT_YET_SCHEDULED), (SAVE_TASK_SCHEDULED, SAVE_TASK_SCHEDULED), (SAVE_TASK_SUCCEED, SAVE_TASK_SUCCEED), - (SAVE_TASK_FAILED, SAVE_TASK_FAILED) + (SAVE_TASK_FAILED, SAVE_TASK_FAILED), + (SAVE_TASK_RUNNING, SAVE_TASK_RUNNING) ] class SaveOriginRequest(models.Model): """ Model table holding all the save origin requests issued by users. 
""" id = models.BigAutoField(primary_key=True) request_date = models.DateTimeField(auto_now_add=True) origin_type = models.CharField(max_length=200, null=False) origin_url = models.CharField(max_length=200, null=False) status = models.TextField(choices=SAVE_REQUEST_STATUS, default=SAVE_REQUEST_PENDING) loading_task_id = models.IntegerField(default=-1) visit_date = models.DateTimeField(null=True) loading_task_status = models.TextField(choices=SAVE_TASK_STATUS, default=SAVE_TASK_NOT_CREATED) class Meta: app_label = 'swh.web.common' db_table = 'save_origin_request' ordering = ['-id'] def __str__(self): return str({'id': self.id, 'request_date': self.request_date, 'origin_type': self.origin_type, 'origin_url': self.origin_url, 'status': self.status, 'loading_task_id': self.loading_task_id, 'visit_date': self.visit_date}) diff --git a/swh/web/common/origin_save.py b/swh/web/common/origin_save.py index facb7606..67e77e48 100644 --- a/swh/web/common/origin_save.py +++ b/swh/web/common/origin_save.py @@ -1,357 +1,397 @@ # Copyright (C) 2018 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information from bisect import bisect_right +from datetime import datetime, timezone from django.core.exceptions import ObjectDoesNotExist from django.core.exceptions import ValidationError from django.core.validators import URLValidator from swh.web import config from swh.web.common import service from swh.web.common.exc import BadInputExc, ForbiddenExc from swh.web.common.models import ( SaveUnauthorizedOrigin, SaveAuthorizedOrigin, SaveOriginRequest, SAVE_REQUEST_ACCEPTED, SAVE_REQUEST_REJECTED, SAVE_REQUEST_PENDING, SAVE_TASK_NOT_YET_SCHEDULED, SAVE_TASK_SCHEDULED, - SAVE_TASK_SUCCEED, SAVE_TASK_FAILED + SAVE_TASK_SUCCEED, SAVE_TASK_FAILED, SAVE_TASK_RUNNING ) from swh.web.common.origin_visits import get_origin_visits from swh.web.common.utils import parse_timestamp from swh.scheduler.utils import create_oneshot_task_dict scheduler = config.scheduler() def get_origin_save_authorized_urls(): """ Get the list of origin url prefixes authorized to be immediately loaded into the archive (whitelist). Returns: list: The list of authorized origin url prefix """ return [origin.url for origin in SaveAuthorizedOrigin.objects.all()] def get_origin_save_unauthorized_urls(): """ Get the list of origin url prefixes forbidden to be loaded into the archive (blacklist). Returns: list: the list of unauthorized origin url prefix """ return [origin.url for origin in SaveUnauthorizedOrigin.objects.all()] def can_save_origin(origin_url): """ Check if a software origin can be saved into the archive. 
Based on the origin url, the save request will be either: * immediately accepted if the url is whitelisted * rejected if the url is blacklisted * put in pending state for manual review otherwise Args: origin_url (str): the software origin url to check Returns: str: the origin save request status, either **accepted**, **rejected** or **pending** """ # origin url may be blacklisted for url_prefix in get_origin_save_unauthorized_urls(): if origin_url.startswith(url_prefix): return SAVE_REQUEST_REJECTED # if the origin url is in the white list, it can be immediately saved for url_prefix in get_origin_save_authorized_urls(): if origin_url.startswith(url_prefix): return SAVE_REQUEST_ACCEPTED # otherwise, the origin url needs to be manually verified return SAVE_REQUEST_PENDING # map origin type to scheduler task # TODO: do not hardcode the task name here # TODO: unlock hg and svn loading once the scheduler # loading tasks are available in production _origin_type_task = { 'git': 'origin-update-git', # 'hg': 'origin-load-hg', # 'svn': 'origin-load-svn' } # map scheduler task status to origin save status _save_task_status = { 'next_run_not_scheduled': SAVE_TASK_NOT_YET_SCHEDULED, 'next_run_scheduled': SAVE_TASK_SCHEDULED, 'completed': SAVE_TASK_SUCCEED, 'disabled': SAVE_TASK_FAILED } def get_savable_origin_types(): return sorted(list(_origin_type_task.keys())) def _check_origin_type_savable(origin_type): """ Get the list of software origin types that can be loaded through a save request. Returns: list: the list of saveable origin types """ allowed_origin_types = ', '.join(get_savable_origin_types()) if origin_type not in _origin_type_task: raise BadInputExc('Origin of type %s can not be saved! ' 'Allowed types are the following: %s' % (origin_type, allowed_origin_types)) _validate_url = URLValidator(schemes=['http', 'https', 'svn', 'git']) def _check_origin_url_valid(origin_url): try: _validate_url(origin_url) except ValidationError: raise BadInputExc('The provided origin url (%s) is not valid!' 
% origin_url) -def _get_visit_date_for_save_request(save_request): +def _get_visit_info_for_save_request(save_request): visit_date = None + visit_status = None try: origin = {'type': save_request.origin_type, 'url': save_request.origin_url} origin_info = service.lookup_origin(origin) origin_visits = get_origin_visits(origin_info) visit_dates = [parse_timestamp(v['date']) for v in origin_visits] i = bisect_right(visit_dates, save_request.request_date) if i != len(visit_dates): visit_date = visit_dates[i] + visit_status = origin_visits[i]['status'] + if origin_visits[i]['status'] == 'ongoing': + visit_date = None except Exception: pass - return visit_date + return visit_date, visit_status + + +def _check_visit_update_status(save_request, save_task_status): + visit_date, visit_status = _get_visit_info_for_save_request(save_request) + save_request.visit_date = visit_date + # visit has been performed, mark the saving task as succeed + if visit_date and visit_status is not None: + save_task_status = SAVE_TASK_SUCCEED + elif visit_status == 'ongoing': + save_task_status = SAVE_TASK_RUNNING + else: + time_now = datetime.now(tz=timezone.utc) + time_delta = time_now - save_request.request_date + # consider the task as failed if it is still in scheduled state + # one week after its submission + if time_delta.days > 7: + save_task_status = SAVE_TASK_FAILED + return visit_date, save_task_status def _save_request_dict(save_request, task=None): must_save = False visit_date = save_request.visit_date + # save task still in scheduler db if task: save_task_status = _save_task_status[task['status']] if save_task_status in (SAVE_TASK_FAILED, SAVE_TASK_SUCCEED) \ and not visit_date: - visit_date = _get_visit_date_for_save_request(save_request) + visit_date, _ = _get_visit_info_for_save_request(save_request) save_request.visit_date = visit_date must_save = True # Ensure last origin visit is available in database # before reporting the task execution as successful if save_task_status == SAVE_TASK_SUCCEED and not visit_date: save_task_status = SAVE_TASK_SCHEDULED - if save_request.loading_task_status != save_task_status: - save_request.loading_task_status = save_task_status - must_save = True - if must_save: - save_request.save() + # Check tasks still marked as scheduled / not yet scheduled + if save_task_status in (SAVE_TASK_SCHEDULED, + SAVE_TASK_NOT_YET_SCHEDULED): + visit_date, save_task_status = _check_visit_update_status( + save_request, save_task_status) + + # save task may have been archived else: save_task_status = save_request.loading_task_status + if save_task_status in (SAVE_TASK_SCHEDULED, + SAVE_TASK_NOT_YET_SCHEDULED): + visit_date, save_task_status = _check_visit_update_status( + save_request, save_task_status) + + else: + save_task_status = save_request.loading_task_status + + if save_request.loading_task_status != save_task_status: + save_request.loading_task_status = save_task_status + must_save = True + + if must_save: + save_request.save() return {'origin_type': save_request.origin_type, 'origin_url': save_request.origin_url, 'save_request_date': save_request.request_date.isoformat(), 'save_request_status': save_request.status, 'save_task_status': save_task_status, 'visit_date': visit_date.isoformat() if visit_date else None} def create_save_origin_request(origin_type, origin_url): """ Create a loading task to save a software origin into the archive. This function aims to create a software origin loading task trough the use of the swh-scheduler component. 
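An aside on the new `_check_visit_update_status` helper above, before the `create_save_origin_request` docstring continues: its time-delta guard can be reproduced in isolation as below (status strings are inlined for brevity; the real code uses the `SAVE_TASK_*` constants). A request still in scheduled state more than seven days after submission is reported as failed.
```
# Self-contained illustration (not in this diff) of the staleness rule.
from datetime import datetime, timedelta, timezone

request_date = datetime.now(tz=timezone.utc) - timedelta(days=8)
time_delta = datetime.now(tz=timezone.utc) - request_date
save_task_status = 'failed' if time_delta.days > 7 else 'scheduled'
print(save_task_status)  # -> failed
```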
First, some checks are performed to see if the origin type and url are valid but also if the save request can be accepted. If those checks pass, the loading task is then created. Otherwise, the save request is put in pending or rejected state. All the submitted save requests are logged into the swh-web database to keep track of them. Args: origin_type (str): the type of origin to save (currently only ``git``; ``svn`` and ``hg`` will soon be available) origin_url (str): the url of the origin to save Raises: BadInputExc: the origin type or url is invalid ForbiddenExc: the provided origin url is blacklisted Returns: dict: A dict describing the save request with the following keys: * **origin_type**: the type of the origin to save * **origin_url**: the url of the origin * **save_request_date**: the date the request was submitted * **save_request_status**: the request status, either **accepted**, **rejected** or **pending** * **save_task_status**: the origin loading task status, either **not created**, **not yet scheduled**, **scheduled**, **succeed** or **failed** """ _check_origin_type_savable(origin_type) _check_origin_url_valid(origin_url) save_request_status = can_save_origin(origin_url) task = None # if the origin save request is accepted, create a scheduler # task to load it into the archive if save_request_status == SAVE_REQUEST_ACCEPTED: # create a task with high priority kwargs = {'priority': 'high'} # set task parameters according to the origin type if origin_type == 'git': kwargs['repo_url'] = origin_url elif origin_type == 'hg': kwargs['origin_url'] = origin_url elif origin_type == 'svn': kwargs['origin_url'] = origin_url kwargs['svn_url'] = origin_url sor = None # get the list of previously submitted save requests current_sors = \ list(SaveOriginRequest.objects.filter(origin_type=origin_type, origin_url=origin_url)) can_create_task = False # if no save requests previously submitted, create the scheduler task if not current_sors: can_create_task = True else: # get the latest submitted save request sor = current_sors[0] # if it was in pending state, we need to create the scheduler task # and update the save request info in the database if sor.status == SAVE_REQUEST_PENDING: can_create_task = True # a task has already been created to load the origin elif sor.loading_task_id != -1: # get the scheduler task and its status tasks = scheduler.get_tasks([sor.loading_task_id]) task = tasks[0] if tasks else None task_status = _save_request_dict(sor, task)['save_task_status'] # create a new scheduler task only if the previous one has been # already executed if task_status == SAVE_TASK_FAILED or \ task_status == SAVE_TASK_SUCCEED: can_create_task = True sor = None else: can_create_task = False if can_create_task: # effectively create the scheduler task task_dict = create_oneshot_task_dict( _origin_type_task[origin_type], **kwargs) task = scheduler.create_tasks([task_dict])[0] # pending save request has been accepted if sor: sor.status = SAVE_REQUEST_ACCEPTED sor.loading_task_id = task['id'] sor.save() else: sor = SaveOriginRequest.objects.create(origin_type=origin_type, origin_url=origin_url, status=save_request_status, # noqa loading_task_id=task['id']) # noqa # save request must be manually reviewed for acceptance elif save_request_status == SAVE_REQUEST_PENDING: # check if such a save request has already been submitted, # no need to add it to the database in that case try: sor = SaveOriginRequest.objects.get(origin_type=origin_type, origin_url=origin_url,
status=save_request_status) # if not, add it to the database except ObjectDoesNotExist: sor = SaveOriginRequest.objects.create(origin_type=origin_type, origin_url=origin_url, status=save_request_status) # origin cannot be saved as its url is blacklisted, # log the request to the database anyway else: sor = SaveOriginRequest.objects.create(origin_type=origin_type, origin_url=origin_url, status=save_request_status) if save_request_status == SAVE_REQUEST_REJECTED: raise ForbiddenExc('The origin url is blacklisted and will not be ' 'loaded into the archive.') return _save_request_dict(sor, task) def get_save_origin_requests_from_queryset(requests_queryset): """ Get all save requests from a SaveOriginRequest queryset. Args: requests_queryset (django.db.models.QuerySet): input SaveOriginRequest queryset Returns: list: A list of save origin request dicts as described in :func:`swh.web.common.origin_save.create_save_origin_request` """ task_ids = [] for sor in requests_queryset: task_ids.append(sor.loading_task_id) requests = [] if task_ids: tasks = scheduler.get_tasks(task_ids) tasks = {task['id']: task for task in tasks} for sor in requests_queryset: sr_dict = _save_request_dict(sor, tasks.get(sor.loading_task_id)) requests.append(sr_dict) return requests def get_save_origin_requests(origin_type, origin_url): """ Get all save requests for a given software origin. Args: origin_type (str): the type of the origin origin_url (str): the url of the origin Raises: BadInputExc: the origin type or url is invalid Returns: list: A list of save origin request dicts as described in :func:`swh.web.common.origin_save.create_save_origin_request` """ _check_origin_type_savable(origin_type) _check_origin_url_valid(origin_url) sors = SaveOriginRequest.objects.filter(origin_type=origin_type, origin_url=origin_url) return get_save_origin_requests_from_queryset(sors) diff --git a/swh/web/common/origin_visits.py b/swh/web/common/origin_visits.py index 9bb1c7cb..34e86230 100644 --- a/swh/web/common/origin_visits.py +++ b/swh/web/common/origin_visits.py @@ -1,186 +1,188 @@ # Copyright (C) 2018 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information import math from django.core.cache import cache from swh.web.common.exc import NotFoundExc from swh.web.common.utils import parse_timestamp def get_origin_visits(origin_info): """Function that returns the list of visits for a swh origin. That list is put in cache in order to speed up the navigation in the swh web browse ui.
Args: origin_info (dict): dict describing the origin to fetch visits from Returns: list: A list of dicts describing the origin visits with the following keys: * **date**: UTC visit date in ISO format, * **origin**: the origin id * **status**: the visit status, either **full**, **partial** or **ongoing** * **visit**: the visit id Raises: NotFoundExc: if the origin is not found """ from swh.web.common import service cache_entry_id = 'origin_%s_visits' % origin_info['id'] cache_entry = cache.get(cache_entry_id) if cache_entry: last_visit = cache_entry[-1]['visit'] new_visits = list(service.lookup_origin_visits(origin_info['id'], last_visit=last_visit)) if not new_visits: - return cache_entry + last_snp = service.lookup_latest_origin_snapshot(origin_info['id']) + if not last_snp or last_snp['id'] == cache_entry[-1]['snapshot']: + return cache_entry origin_visits = [] per_page = service.MAX_LIMIT last_visit = None while 1: visits = list(service.lookup_origin_visits(origin_info['id'], last_visit=last_visit, per_page=per_page)) origin_visits += visits if len(visits) < per_page: break else: if not last_visit: last_visit = per_page else: last_visit += per_page def _visit_sort_key(visit): ts = parse_timestamp(visit['date']).timestamp() return ts + (float(visit['visit']) / 10e3) for v in origin_visits: if 'metadata' in v: del v['metadata'] origin_visits = [dict(t) for t in set([tuple(d.items()) for d in origin_visits])] origin_visits = sorted(origin_visits, key=lambda v: _visit_sort_key(v)) cache.set(cache_entry_id, origin_visits) return origin_visits def get_origin_visit(origin_info, visit_ts=None, visit_id=None, snapshot_id=None): """Function that returns information about a visit for a given origin. When a timestamp is provided, the visit closest to it is selected. Args: origin_info (dict): a dict filled with origin information (id, url, type) visit_ts (int or str): an ISO date string or Unix timestamp to parse Returns: A dict containing the visit info as described below:: {'origin': 2, 'date': '2017-10-08T11:54:25.582463+00:00', 'metadata': {}, 'visit': 25, 'status': 'full'} """ visits = get_origin_visits(origin_info) if not visits: if 'type' in origin_info and 'url' in origin_info: message = ('No visit associated to origin with' ' type %s and url %s!' % (origin_info['type'], origin_info['url'])) else: message = ('No visit associated to origin with' ' id %s!' % origin_info['id']) raise NotFoundExc(message) if snapshot_id: visit = [v for v in visits if v['snapshot'] == snapshot_id] if len(visit) == 0: if 'type' in origin_info and 'url' in origin_info: message = ('Visit for snapshot with id %s for origin with type' ' %s and url %s not found!' % (snapshot_id, origin_info['type'], origin_info['url'])) else: message = ('Visit for snapshot with id %s for origin with' ' id %s not found!' % (snapshot_id, origin_info['id'])) raise NotFoundExc(message) return visit[0] if visit_id: visit = [v for v in visits if v['visit'] == int(visit_id)] if len(visit) == 0: if 'type' in origin_info and 'url' in origin_info: message = ('Visit with id %s for origin with type %s' ' and url %s not found!' % (visit_id, origin_info['type'], origin_info['url'])) else: message = ('Visit with id %s for origin with id %s' ' not found!'
% (visit_id, origin_info['id'])) raise NotFoundExc(message) return visit[0] if not visit_ts: # returns the latest full visit when no timestamp is provided for v in reversed(visits): if v['status'] == 'full': return v return visits[-1] parsed_visit_ts = math.floor(parse_timestamp(visit_ts).timestamp()) visit_idx = None for i, visit in enumerate(visits): ts = math.floor(parse_timestamp(visit['date']).timestamp()) if i == 0 and parsed_visit_ts <= ts: return visit elif i == len(visits) - 1: if parsed_visit_ts >= ts: return visit else: next_ts = math.floor( parse_timestamp(visits[i+1]['date']).timestamp()) if parsed_visit_ts >= ts and parsed_visit_ts < next_ts: if (parsed_visit_ts - ts) < (next_ts - parsed_visit_ts): visit_idx = i break else: visit_idx = i+1 break if visit_idx is not None: visit = visits[visit_idx] while visit_idx < len(visits) - 1 and \ visit['date'] == visits[visit_idx+1]['date']: visit_idx = visit_idx + 1 visit = visits[visit_idx] return visit else: if 'type' in origin_info and 'url' in origin_info: message = ('Visit with timestamp %s for origin with type %s ' 'and url %s not found!' % (visit_ts, origin_info['type'], origin_info['url'])) else: message = ('Visit with timestamp %s for origin with id %s ' 'not found!' % (visit_ts, origin_info['id'])) raise NotFoundExc(message) diff --git a/swh/web/misc/coverage.py b/swh/web/misc/coverage.py index ff04bb91..0b88c071 100644 --- a/swh/web/misc/coverage.py +++ b/swh/web/misc/coverage.py @@ -1,70 +1,76 @@ # Copyright (C) 2018 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information from django.shortcuts import render from django.views.decorators.clickjacking import xframe_options_exempt # Current coverage list of the archive # TODO: Retrieve that list dynamically instead of hardcoding it _code_providers = [ { 'provider_url': 'https://www.debian.org/', 'provider_logo': 'img/logos/debian.png', 'provider_info': 'source packages from the Debian distribution ' '(continuously archived)', }, + { + 'provider_url': 'https://framagit.org/', + 'provider_logo': 'img/logos/framagit.png', + 'provider_info': 'public repositories from Framagit ' + '(continuously archived)', + }, { 'provider_url': 'https://github.com', 'provider_logo': 'img/logos/github.png', 'provider_info': 'public repositories from GitHub ' '(continuously archived)', }, { 'provider_url': 'https://gitlab.com', 'provider_logo': 'img/logos/gitlab.svg', 'provider_info': 'public repositories from GitLab ' '(continuously archived)', }, { 'provider_url': 'https://gitorious.org/', 'provider_logo': 'img/logos/gitorious.png', 'provider_info': 'public repositories from the former Gitorious code ' 'hosting service', }, { 'provider_url': 'https://code.google.com/archive/', 'provider_logo': 'img/logos/googlecode.png', 'provider_info': 'public repositories from the former Google Code ' 'project hosting service', }, { 'provider_url': 'https://www.gnu.org', 'provider_logo': 'img/logos/gnu.png', 'provider_info': 'releases from the GNU project (as of August 2015)', }, { 'provider_url': 'https://hal.archives-ouvertes.fr/', 'provider_logo': 'img/logos/hal.png', 'provider_info': 'scientific software source code deposited in the ' 'open archive HAL' }, { 'provider_url': 'https://gitlab.inria.fr', 'provider_logo': 'img/logos/inria.jpg', 'provider_info': 'public repositories from Inria GitLab ' '(continuously archived)', }, { 
'provider_url': 'https://pypi.org', 'provider_logo': 'img/logos/pypi.svg', 'provider_info': 'source packages from the Python Packaging Index ' '(continuously archived)', }, ] @xframe_options_exempt def swh_coverage(request): return render(request, 'coverage.html', {'providers': _code_providers}) diff --git a/swh/web/static/img/logos/framagit.png b/swh/web/static/img/logos/framagit.png new file mode 100644 index 00000000..0b78ed4f Binary files /dev/null and b/swh/web/static/img/logos/framagit.png differ diff --git a/swh/web/tests/api/views/test_origin_save.py b/swh/web/tests/api/views/test_origin_save.py index a724cc24..9648006c 100644 --- a/swh/web/tests/api/views/test_origin_save.py +++ b/swh/web/tests/api/views/test_origin_save.py @@ -1,248 +1,248 @@ # Copyright (C) 2018 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU Affero General Public License version 3, or any later version # See top-level LICENSE file for more information from datetime import datetime, timedelta from django.utils import timezone from rest_framework.test import APITestCase from unittest.mock import patch from swh.web.common.utils import reverse from swh.web.common.models import ( SaveUnauthorizedOrigin, SaveOriginRequest, SAVE_REQUEST_ACCEPTED, SAVE_REQUEST_REJECTED, SAVE_REQUEST_PENDING ) from swh.web.common.models import ( SAVE_TASK_NOT_CREATED, SAVE_TASK_NOT_YET_SCHEDULED, SAVE_TASK_SCHEDULED, SAVE_TASK_FAILED, SAVE_TASK_SUCCEED ) from swh.web.tests.testcase import WebTestCase class SaveApiTestCase(WebTestCase, APITestCase): @classmethod def setUpTestData(cls): # noqa: N802 SaveUnauthorizedOrigin.objects.create( url='https://github.com/user/illegal_repo') SaveUnauthorizedOrigin.objects.create( url='https://gitlab.com/user_to_exclude') def test_invalid_origin_type(self): url = reverse('api-save-origin', url_args={'origin_type': 'foo', 'origin_url': 'https://github.com/torvalds/linux'}) # noqa response = self.client.post(url) self.assertEqual(response.status_code, 400) def test_invalid_origin_url(self): url = reverse('api-save-origin', url_args={'origin_type': 'git', 'origin_url': 'bar'}) response = self.client.post(url) self.assertEqual(response.status_code, 400) def check_created_save_request_status(self, mock_scheduler, origin_url, scheduler_task_status, expected_request_status, expected_task_status=None, visit_date=None): if not scheduler_task_status: mock_scheduler.get_tasks.return_value = [] else: mock_scheduler.get_tasks.return_value = \ [{ 'priority': 'high', 'policy': 'oneshot', 'type': 'origin-update-git', 'arguments': { 'kwargs': { 'repo_url': origin_url }, 'args': [] }, 'status': scheduler_task_status, 'id': 1, }] mock_scheduler.create_tasks.return_value = \ [{ 'priority': 'high', 'policy': 'oneshot', 'type': 'origin-update-git', 'arguments': { 'kwargs': { 'repo_url': origin_url }, 'args': [] }, 'status': 'next_run_not_scheduled', 'id': 1, }] url = reverse('api-save-origin', url_args={'origin_type': 'git', 'origin_url': origin_url}) - with patch('swh.web.common.origin_save._get_visit_date_for_save_request') as mock_visit_date: # noqa - mock_visit_date.return_value = visit_date + with patch('swh.web.common.origin_save._get_visit_info_for_save_request') as mock_visit_date: # noqa + mock_visit_date.return_value = (visit_date, None) response = self.client.post(url) if expected_request_status != SAVE_REQUEST_REJECTED: self.assertEqual(response.status_code, 200) self.assertEqual(response.data['save_request_status'], 
expected_request_status) self.assertEqual(response.data['save_task_status'], expected_task_status) else: self.assertEqual(response.status_code, 403) def check_save_request_status(self, mock_scheduler, origin_url, expected_request_status, expected_task_status, scheduler_task_status='next_run_not_scheduled', # noqa visit_date=None): mock_scheduler.get_tasks.return_value = \ [{ 'priority': 'high', 'policy': 'oneshot', 'type': 'origin-update-git', 'arguments': { 'kwargs': { 'repo_url': origin_url }, 'args': [] }, 'status': scheduler_task_status, 'id': 1, }] url = reverse('api-save-origin', url_args={'origin_type': 'git', 'origin_url': origin_url}) - with patch('swh.web.common.origin_save._get_visit_date_for_save_request') as mock_visit_date: # noqa - mock_visit_date.return_value = visit_date + with patch('swh.web.common.origin_save._get_visit_info_for_save_request') as mock_visit_date: # noqa + mock_visit_date.return_value = (visit_date, None) response = self.client.get(url) self.assertEqual(response.status_code, 200) save_request_data = response.data[0] self.assertEqual(save_request_data['save_request_status'], expected_request_status) self.assertEqual(save_request_data['save_task_status'], expected_task_status) # Check that save task status is still available when # the scheduler task has been archived mock_scheduler.get_tasks.return_value = [] response = self.client.get(url) self.assertEqual(response.status_code, 200) save_request_data = response.data[0] self.assertEqual(save_request_data['save_task_status'], expected_task_status) @patch('swh.web.common.origin_save.scheduler') def test_save_request_rejected(self, mock_scheduler): origin_url = 'https://github.com/user/illegal_repo' self.check_created_save_request_status(mock_scheduler, origin_url, None, SAVE_REQUEST_REJECTED) self.check_save_request_status(mock_scheduler, origin_url, SAVE_REQUEST_REJECTED, SAVE_TASK_NOT_CREATED) @patch('swh.web.common.origin_save.scheduler') def test_save_request_pending(self, mock_scheduler): origin_url = 'https://unkwownforge.com/user/repo' self.check_created_save_request_status(mock_scheduler, origin_url, None, SAVE_REQUEST_PENDING, SAVE_TASK_NOT_CREATED) self.check_save_request_status(mock_scheduler, origin_url, SAVE_REQUEST_PENDING, SAVE_TASK_NOT_CREATED) @patch('swh.web.common.origin_save.scheduler') def test_save_request_succeed(self, mock_scheduler): origin_url = 'https://github.com/Kitware/CMake' self.check_created_save_request_status(mock_scheduler, origin_url, None, SAVE_REQUEST_ACCEPTED, SAVE_TASK_NOT_YET_SCHEDULED) self.check_save_request_status(mock_scheduler, origin_url, SAVE_REQUEST_ACCEPTED, SAVE_TASK_SCHEDULED, scheduler_task_status='next_run_scheduled') # noqa self.check_save_request_status(mock_scheduler, origin_url, SAVE_REQUEST_ACCEPTED, SAVE_TASK_SCHEDULED, scheduler_task_status='completed', visit_date=None) # noqa visit_date = datetime.now(tz=timezone.utc) + timedelta(hours=1) self.check_save_request_status(mock_scheduler, origin_url, SAVE_REQUEST_ACCEPTED, SAVE_TASK_SUCCEED, scheduler_task_status='completed', visit_date=visit_date) # noqa @patch('swh.web.common.origin_save.scheduler') def test_save_request_failed(self, mock_scheduler): origin_url = 'https://gitlab.com/inkscape/inkscape' self.check_created_save_request_status(mock_scheduler, origin_url, None, SAVE_REQUEST_ACCEPTED, SAVE_TASK_NOT_YET_SCHEDULED) self.check_save_request_status(mock_scheduler, origin_url, SAVE_REQUEST_ACCEPTED, SAVE_TASK_SCHEDULED, scheduler_task_status='next_run_scheduled') # noqa 
self.check_save_request_status(mock_scheduler, origin_url, SAVE_REQUEST_ACCEPTED, SAVE_TASK_FAILED, scheduler_task_status='disabled') # noqa @patch('swh.web.common.origin_save.scheduler') def test_create_save_request_only_when_needed(self, mock_scheduler): - origin_url = 'https://gitlab.com/webpack/webpack' + origin_url = 'https://github.com/webpack/webpack' SaveOriginRequest.objects.create(origin_type='git', origin_url=origin_url, status=SAVE_REQUEST_ACCEPTED, # noqa loading_task_id=56) self.check_created_save_request_status(mock_scheduler, origin_url, 'next_run_not_scheduled', SAVE_REQUEST_ACCEPTED, SAVE_TASK_NOT_YET_SCHEDULED) sors = list(SaveOriginRequest.objects.filter(origin_type='git', origin_url=origin_url)) self.assertEqual(len(sors), 1) self.check_created_save_request_status(mock_scheduler, origin_url, 'next_run_scheduled', SAVE_REQUEST_ACCEPTED, SAVE_TASK_SCHEDULED) sors = list(SaveOriginRequest.objects.filter(origin_type='git', origin_url=origin_url)) self.assertEqual(len(sors), 1) visit_date = datetime.now(tz=timezone.utc) + timedelta(hours=1) self.check_created_save_request_status(mock_scheduler, origin_url, 'completed', SAVE_REQUEST_ACCEPTED, SAVE_TASK_NOT_YET_SCHEDULED, visit_date=visit_date) sors = list(SaveOriginRequest.objects.filter(origin_type='git', origin_url=origin_url)) self.assertEqual(len(sors), 2) self.check_created_save_request_status(mock_scheduler, origin_url, 'disabled', SAVE_REQUEST_ACCEPTED, SAVE_TASK_NOT_YET_SCHEDULED) sors = list(SaveOriginRequest.objects.filter(origin_type='git', origin_url=origin_url)) self.assertEqual(len(sors), 3) diff --git a/version.txt b/version.txt index 2d7ffd46..8d8c353e 100644 --- a/version.txt +++ b/version.txt @@ -1 +1 @@ -v0.0.177-0-gc7f459f3 \ No newline at end of file +v0.0.178-0-g4b8d67eb \ No newline at end of file
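
Two short, hedged sketches follow; they are illustrative only and are not part of the patch.

First, a minimal, self-contained sketch of the visit-matching logic added in `_get_visit_info_for_save_request`: `bisect_right` selects the first visit whose date is strictly later than the save request date, and an `ongoing` visit is reported through its status only, without a visit date. The visit data below is made up for illustration; in the real code the dates come from `parse_timestamp` applied to the archive's visit listing.

```
# Sketch of the bisect_right visit-matching logic (illustrative data only)
from bisect import bisect_right
from datetime import datetime, timezone

request_date = datetime(2018, 9, 20, 12, 0, tzinfo=timezone.utc)

# hypothetical visits returned for the origin, ordered by date
origin_visits = [
    {'date': datetime(2018, 9, 19, 8, 0, tzinfo=timezone.utc), 'status': 'full'},
    {'date': datetime(2018, 9, 20, 15, 0, tzinfo=timezone.utc), 'status': 'ongoing'},
]

visit_dates = [v['date'] for v in origin_visits]
visit_date = None
visit_status = None

# find the first visit strictly later than the save request date, if any
i = bisect_right(visit_dates, request_date)
if i != len(visit_dates):
    visit_date = visit_dates[i]
    visit_status = origin_visits[i]['status']
    if visit_status == 'ongoing':
        # loading still in progress: report the status but no visit date yet
        visit_date = None

print(visit_date, visit_status)  # None ongoing
```

Second, a hedged usage sketch of the save origin web API exercised by the tests above. The endpoint path is an assumption inferred from the `api-save-origin` URL name and its `origin_type`/`origin_url` arguments; refer to the deployed API documentation for the authoritative route.

```
# Hypothetical client-side usage of the "save code now" API (path assumed,
# not confirmed by this diff)
import requests

base = 'https://archive.softwareheritage.org/api/1/origin/save'
origin_type = 'git'
origin_url = 'https://github.com/Kitware/CMake'

# submit a save request for the origin
resp = requests.post('%s/%s/url/%s/' % (base, origin_type, origin_url))
resp.raise_for_status()
request_info = resp.json()
print(request_info['save_request_status'], request_info['save_task_status'])

# later, poll the save requests submitted for that origin
resp = requests.get('%s/%s/url/%s/' % (base, origin_type, origin_url))
for save_request in resp.json():
    print(save_request['save_request_date'], save_request['save_task_status'])
```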