diff --git a/.gitignore b/.gitignore index 22f237f..9dbb22e 100644 --- a/.gitignore +++ b/.gitignore @@ -1,9 +1,11 @@ *.pyc *.sw? *~ .coverage .eggs/ __pycache__ +build dist swh.core.egg-info version.txt +.tox diff --git a/PKG-INFO b/PKG-INFO index 58bd77b..7e15c36 100644 --- a/PKG-INFO +++ b/PKG-INFO @@ -1,10 +1,59 @@ -Metadata-Version: 1.0 +Metadata-Version: 2.1 Name: swh.core -Version: 0.0.40 +Version: 0.0.41 Summary: Software Heritage core utilities Home-page: https://forge.softwareheritage.org/diffusion/DCORE/ Author: Software Heritage developers Author-email: swh-devel@inria.fr License: UNKNOWN -Description: UNKNOWN +Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest +Project-URL: Funding, https://www.softwareheritage.org/donate +Project-URL: Source, https://forge.softwareheritage.org/source/swh-core +Description: swh-core + ======== + + core library for swh's modules: + - config parser + - hash computations + - serialization + - logging mechanism + + Defines also a celery application to run concurrency tasks + + Celery use + ---------- + + ### configuration file + + worker.ini file which looks like: + + [main] + task_broker = amqp://guest@localhost// + task_modules = swh.loader.dir.tasks, swh.loader.tar.tasks, swh.loader.git.tasks + task_queues = swh_loader_tar, swh_loader_git, swh_loader_dir + task_soft_time_limit = 0 + + This file can be set in the following location: + - ~/.swh + - ~/.config/swh + - /etc/softwareheritage + + + ### run celery worker + + Sample command: + + celery worker --app=swh.core.worker \ + --pool=prefork \ + --autoscale=2,2 \ + -Ofair \ + --loglevel=info 2>&1 | tee -a swh-core-worker.log + Platform: UNKNOWN +Classifier: Programming Language :: Python :: 3 +Classifier: Intended Audience :: Developers +Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3) +Classifier: Operating System :: OS Independent +Classifier: Development Status :: 5 - Production/Stable +Description-Content-Type: 
text/markdown +Provides-Extra: testing diff --git a/docs/index.rst b/docs/index.rst index 1954db2..6b41dd4 100644 --- a/docs/index.rst +++ b/docs/index.rst @@ -1,17 +1,19 @@ .. _swh-core: -Software Heritage - Development Documentation -============================================= +Software Heritage - Core foundations +==================================== + +Low-level utilities and helpers used by almost all other modules in the stack. + .. toctree:: :maxdepth: 2 :caption: Contents: - Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` diff --git a/requirements-test.txt b/requirements-test.txt new file mode 100644 index 0000000..145e520 --- /dev/null +++ b/requirements-test.txt @@ -0,0 +1,2 @@ +nose +requests-mock diff --git a/setup.py b/setup.py old mode 100644 new mode 100755 index 8b2024c..5755b09 --- a/setup.py +++ b/setup.py @@ -1,30 +1,66 @@ #!/usr/bin/env python3 +# Copyright (C) 2015-2018 The Software Heritage developers +# See the AUTHORS file at the top-level directory of this distribution +# License: GNU General Public License version 3, or any later version +# See top-level LICENSE file for more information +import os from setuptools import setup, find_packages +from os import path +from io import open + +here = path.abspath(path.dirname(__file__)) + +# Get the long description from the README file +with open(path.join(here, 'README.md'), encoding='utf-8') as f: + long_description = f.read() + + +def parse_requirements(name=None): + if name: + reqf = 'requirements-%s.txt' % name + else: + reqf = 'requirements.txt' -def parse_requirements(): requirements = [] - for reqf in ('requirements.txt', 'requirements-swh.txt'): - with open(reqf) as f: - for line in f.readlines(): - line = line.strip() - if not line or line.startswith('#'): - continue - requirements.append(line) + if not os.path.exists(reqf): + return requirements + + with open(reqf) as f: + for line in f.readlines(): + line = line.strip() + if not line or 
line.startswith('#'): + continue + requirements.append(line) return requirements setup( name='swh.core', description='Software Heritage core utilities', + long_description=long_description, + long_description_content_type='text/markdown', author='Software Heritage developers', author_email='swh-devel@inria.fr', url='https://forge.softwareheritage.org/diffusion/DCORE/', packages=find_packages(), scripts=[], - install_requires=parse_requirements(), + install_requires=parse_requirements() + parse_requirements('swh'), setup_requires=['vcversioner'], + extras_require={'testing': parse_requirements('test')}, vcversioner={}, include_package_data=True, + classifiers=[ + "Programming Language :: Python :: 3", + "Intended Audience :: Developers", + "License :: OSI Approved :: GNU General Public License v3 (GPLv3)", + "Operating System :: OS Independent", + "Development Status :: 5 - Production/Stable", + ], + project_urls={ + 'Bug Reports': 'https://forge.softwareheritage.org/maniphest', + 'Funding': 'https://www.softwareheritage.org/donate', + 'Source': 'https://forge.softwareheritage.org/source/swh-core', + }, ) diff --git a/swh.core.egg-info/PKG-INFO b/swh.core.egg-info/PKG-INFO index 58bd77b..7e15c36 100644 --- a/swh.core.egg-info/PKG-INFO +++ b/swh.core.egg-info/PKG-INFO @@ -1,10 +1,59 @@ -Metadata-Version: 1.0 +Metadata-Version: 2.1 Name: swh.core -Version: 0.0.40 +Version: 0.0.41 Summary: Software Heritage core utilities Home-page: https://forge.softwareheritage.org/diffusion/DCORE/ Author: Software Heritage developers Author-email: swh-devel@inria.fr License: UNKNOWN -Description: UNKNOWN +Project-URL: Bug Reports, https://forge.softwareheritage.org/maniphest +Project-URL: Funding, https://www.softwareheritage.org/donate +Project-URL: Source, https://forge.softwareheritage.org/source/swh-core +Description: swh-core + ======== + + core library for swh's modules: + - config parser + - hash computations + - serialization + - logging mechanism + + Defines also a celery 
application to run concurrency tasks + + Celery use + ---------- + + ### configuration file + + worker.ini file which looks like: + + [main] + task_broker = amqp://guest@localhost// + task_modules = swh.loader.dir.tasks, swh.loader.tar.tasks, swh.loader.git.tasks + task_queues = swh_loader_tar, swh_loader_git, swh_loader_dir + task_soft_time_limit = 0 + + This file can be set in the following location: + - ~/.swh + - ~/.config/swh + - /etc/softwareheritage + + + ### run celery worker + + Sample command: + + celery worker --app=swh.core.worker \ + --pool=prefork \ + --autoscale=2,2 \ + -Ofair \ + --loglevel=info 2>&1 | tee -a swh-core-worker.log + Platform: UNKNOWN +Classifier: Programming Language :: Python :: 3 +Classifier: Intended Audience :: Developers +Classifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3) +Classifier: Operating System :: OS Independent +Classifier: Development Status :: 5 - Production/Stable +Description-Content-Type: text/markdown +Provides-Extra: testing diff --git a/swh.core.egg-info/SOURCES.txt b/swh.core.egg-info/SOURCES.txt index 0e8546e..e17ec53 100644 --- a/swh.core.egg-info/SOURCES.txt +++ b/swh.core.egg-info/SOURCES.txt @@ -1,43 +1,47 @@ .gitignore AUTHORS LICENSE MANIFEST.in Makefile README.md requirements-swh.txt +requirements-test.txt requirements.txt setup.py +tox.ini version.txt debian/changelog debian/compat debian/control debian/copyright debian/rules debian/source/format docs/.gitignore docs/Makefile docs/conf.py docs/index.rst docs/_static/.placeholder docs/_templates/.placeholder sql/log-schema.sql swh/__init__.py swh.core.egg-info/PKG-INFO swh.core.egg-info/SOURCES.txt swh.core.egg-info/dependency_links.txt swh.core.egg-info/requires.txt swh.core.egg-info/top_level.txt swh/core/__init__.py swh/core/api.py swh/core/api_async.py swh/core/config.py swh/core/logger.py swh/core/serializers.py swh/core/tarball.py swh/core/utils.py +swh/core/tests/__init__.py swh/core/tests/db_testing.py 
swh/core/tests/server_testing.py +swh/core/tests/test_api.py swh/core/tests/test_config.py swh/core/tests/test_logger.py swh/core/tests/test_serializers.py swh/core/tests/test_utils.py \ No newline at end of file diff --git a/swh.core.egg-info/requires.txt b/swh.core.egg-info/requires.txt index cdf6bd0..0f40f15 100644 --- a/swh.core.egg-info/requires.txt +++ b/swh.core.egg-info/requires.txt @@ -1,10 +1,14 @@ Flask PyYAML aiohttp arrow msgpack-python psycopg2 python-dateutil requests systemd-python vcversioner + +[testing] +nose +requests-mock diff --git a/swh/core/api.py b/swh/core/api.py index 2fc2d2d..a377781 100644 --- a/swh/core/api.py +++ b/swh/core/api.py @@ -1,145 +1,237 @@ # Copyright (C) 2015-2017 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU General Public License version 3, or any later version # See top-level LICENSE file for more information import collections +import functools +import inspect import json import logging import pickle import requests from flask import Flask, Request, Response from .serializers import (decode_response, encode_data_client as encode_data, msgpack_dumps, msgpack_loads, SWHJSONDecoder) class RemoteException(Exception): pass -class SWHRemoteAPI: +def remote_api_endpoint(path): + def dec(f): + f._endpoint_path = path + return f + return dec + + +class MetaSWHRemoteAPI(type): + """Metaclass for SWHRemoteAPI, which adds a method for each endpoint + of the database it is designed to access. + + See for example :class:`swh.indexer.storage.api.client.RemoteStorage`""" + def __new__(cls, name, bases, attributes): + # For each method wrapped with @remote_api_endpoint in an API backend + # (eg. :class:`swh.indexer.storage.IndexerStorage`), add a new + # method in RemoteStorage, with the same documentation. + # + # Note that, despite the usage of decorator magic (eg. functools.wrap), + # this never actually calls an IndexerStorage method. 
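The decorator-plus-metaclass pattern this hunk introduces (tag backend methods with `remote_api_endpoint`, then have the metaclass generate proxy methods that POST their call arguments) can be sketched in isolation. This is a minimal standalone sketch, not the shipped code: `FakeBackend`/`FakeClient` are hypothetical names, and `post` records the call instead of issuing a real HTTP request.

```python
import functools
import inspect


def remote_api_endpoint(path):
    """Tag a backend method with the URL path its generated proxy POSTs to."""
    def dec(f):
        f._endpoint_path = path
        return f
    return dec


class MetaProxy(type):
    """For each tagged method of `backend_class`, add a same-signature proxy."""
    def __new__(cls, name, bases, attributes):
        backend_class = attributes.get('backend_class')
        if backend_class:
            for meth_name, meth in backend_class.__dict__.items():
                if hasattr(meth, '_endpoint_path'):
                    cls._add_endpoint(meth_name, meth, attributes)
        return super().__new__(cls, name, bases, attributes)

    @staticmethod
    def _add_endpoint(meth_name, meth, attributes):
        @functools.wraps(meth)  # copy the backend method's name and docstring
        def meth_(*args, **kwargs):
            # Match positional/keyword arguments against the backend signature
            post_data = inspect.getcallargs(meth, *args, **kwargs)
            self = post_data.pop('self')
            # The backend method is never actually called on the client side
            return self.post(meth._endpoint_path, post_data)
        attributes[meth_name] = meth_


class FakeBackend:
    @remote_api_endpoint('content/add')
    def content_add(self, data):
        """Never actually called on the client side."""


class FakeClient(metaclass=MetaProxy):
    backend_class = FakeBackend

    def post(self, path, data):  # stand-in for the real msgpack HTTP POST
        return (path, data)
```

Calling `FakeClient().content_add('spam')` returns `('content/add', {'data': 'spam'})`, and the generated method keeps `content_add`'s docstring, which is the point of the `functools.wraps` call in the real metaclass.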
+ backend_class = attributes.get('backend_class', None) + for base in bases: + if backend_class is not None: + break + backend_class = getattr(base, 'backend_class', None) + if backend_class: + for (meth_name, meth) in backend_class.__dict__.items(): + if hasattr(meth, '_endpoint_path'): + cls.__add_endpoint(meth_name, meth, attributes) + return super().__new__(cls, name, bases, attributes) + + @staticmethod + def __add_endpoint(meth_name, meth, attributes): + wrapped_meth = inspect.unwrap(meth) + + @functools.wraps(meth) # Copy signature and doc + def meth_(*args, **kwargs): + # Match arguments and parameters + post_data = inspect.getcallargs( + wrapped_meth, *args, **kwargs) + + # Remove arguments that should not be passed + self = post_data.pop('self') + post_data.pop('cur', None) + post_data.pop('db', None) + + # Send the request. + return self.post(meth._endpoint_path, post_data) + attributes[meth_name] = meth_ + + +class SWHRemoteAPI(metaclass=MetaSWHRemoteAPI): """Proxy to an internal SWH API """ + backend_class = None + """For each method of `backend_class` decorated with + :func:`remote_api_endpoint`, a method with the same prototype and + docstring will be added to this class. Calls to this new method will + be translated into HTTP requests to a remote server. 
+ + This backend class will never be instantiated, it only serves as + a template.""" + def __init__(self, api_exception, url, timeout=None): super().__init__() self.api_exception = api_exception base_url = url if url.endswith('/') else url + '/' self.url = base_url self.session = requests.Session() self.timeout = timeout def _url(self, endpoint): return '%s%s' % (self.url, endpoint) def raw_post(self, endpoint, data, **opts): if self.timeout and 'timeout' not in opts: opts['timeout'] = self.timeout try: return self.session.post( self._url(endpoint), data=data, **opts ) except requests.exceptions.ConnectionError as e: raise self.api_exception(e) def raw_get(self, endpoint, params=None, **opts): if self.timeout and 'timeout' not in opts: opts['timeout'] = self.timeout try: return self.session.get( self._url(endpoint), params=params, **opts ) except requests.exceptions.ConnectionError as e: raise self.api_exception(e) def post(self, endpoint, data, params=None): data = encode_data(data) response = self.raw_post( endpoint, data, params=params, headers={'content-type': 'application/x-msgpack'}) return self._decode_response(response) def get(self, endpoint, params=None): response = self.raw_get(endpoint, params=params) return self._decode_response(response) def post_stream(self, endpoint, data, params=None): if not isinstance(data, collections.Iterable): raise ValueError("`data` must be Iterable") response = self.raw_post(endpoint, data, params=params) return self._decode_response(response) def get_stream(self, endpoint, params=None, chunk_size=4096): response = self.raw_get(endpoint, params=params, stream=True) return response.iter_content(chunk_size) def _decode_response(self, response): if response.status_code == 404: return None if response.status_code == 500: data = decode_response(response) if 'exception_pickled' in data: raise pickle.loads(data['exception_pickled']) else: raise RemoteException(data['exception']) # XXX: this breaks language-independence and should 
be # replaced by proper unserialization if response.status_code == 400: raise pickle.loads(decode_response(response)) elif response.status_code != 200: raise RemoteException( "Unexpected status code for API request: %s (%s)" % ( response.status_code, response.content, ) ) return decode_response(response) class BytesRequest(Request): """Request with proper escaping of arbitrary byte sequences.""" encoding = 'utf-8' encoding_errors = 'surrogateescape' def encode_data_server(data): return Response( msgpack_dumps(data), mimetype='application/x-msgpack', ) def decode_request(request): content_type = request.mimetype data = request.get_data() if content_type == 'application/x-msgpack': r = msgpack_loads(data) elif content_type == 'application/json': r = json.loads(data, cls=SWHJSONDecoder) else: raise ValueError('Wrong content type `%s` for API request' % content_type) return r def error_handler(exception, encoder): # XXX: this breaks language-independence and should be # replaced by proper serialization of errors logging.exception(exception) response = encoder(pickle.dumps(exception)) response.status_code = 400 return response class SWHServerAPIApp(Flask): + """For each endpoint of the given `backend_class`, tells app.route to call + a function that decodes the request and sends it to the backend object + provided by the factory. + + :param Any backend_class: The class of the backend, which will be + analyzed to look for API endpoints. 
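The pickle-based error channel shared by `error_handler` (server pickles the exception into a 400 response) and `_decode_response` (client unpickles and re-raises it) round-trips like this. Stdlib-only sketch; the error message is illustrative, and as the inline XXX comments note, this transport breaks language-independence.

```python
import pickle

# Server side: error_handler serializes the caught exception into the body.
try:
    raise ValueError('backend failed on key %r' % 'spam')
except ValueError as e:
    body = pickle.dumps(e)

# Client side: _decode_response unpickles the body and re-raises it, so the
# caller sees the server's original exception type and message.
exc = pickle.loads(body)
assert isinstance(exc, ValueError)
assert str(exc) == "backend failed on key 'spam'"
```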
+ :param Callable[[], backend_class] backend_factory: A function with no + argument that returns + an instance of + `backend_class`.""" request_class = BytesRequest + + def __init__(self, *args, backend_class=None, backend_factory=None, + **kwargs): + super().__init__(*args, **kwargs) + + if backend_class is not None: + if backend_factory is None: + raise TypeError('Missing argument backend_factory') + for (meth_name, meth) in backend_class.__dict__.items(): + if hasattr(meth, '_endpoint_path'): + self.__add_endpoint(meth_name, meth, backend_factory) + + def __add_endpoint(self, meth_name, meth, backend_factory): + from flask import request + + @self.route('/'+meth._endpoint_path, methods=['POST']) + @functools.wraps(meth) # Copy signature and doc + def _f(): + # Call the actual code + obj_meth = getattr(backend_factory(), meth_name) + return encode_data_server(obj_meth(**decode_request(request))) diff --git a/swh/core/tests/__init__.py b/swh/core/tests/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/swh/core/tests/db_testing.py b/swh/core/tests/db_testing.py index 0351f22..035945d 100644 --- a/swh/core/tests/db_testing.py +++ b/swh/core/tests/db_testing.py @@ -1,266 +1,266 @@ # Copyright (C) 2015 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU General Public License version 3, or any later version # See top-level LICENSE file for more information import psycopg2 import subprocess def pg_restore(dbname, dumpfile, dumptype='pg_dump'): """ Args: dbname: name of the DB to restore into dumpfile: path fo the dump file dumptype: one of 'pg_dump' (for binary dumps), 'psql' (for SQL dumps) """ assert dumptype in ['pg_dump', 'psql'] if dumptype == 'pg_dump': subprocess.check_call(['pg_restore', '--no-owner', '--no-privileges', '--dbname', dbname, dumpfile]) elif dumptype == 'psql': subprocess.check_call(['psql', '--quiet', '--no-psqlrc', '-v', 'ON_ERROR_STOP=1', '-f', dumpfile, 
dbname]) def pg_dump(dbname, dumpfile): subprocess.check_call(['pg_dump', '--no-owner', '--no-privileges', '-Fc', '-f', dumpfile, dbname]) def pg_dropdb(dbname): subprocess.check_call(['dropdb', dbname]) def pg_createdb(dbname): subprocess.check_call(['createdb', dbname]) def db_create(dbname, dump=None, dumptype='pg_dump'): """create the test DB and load the test data dump into it context: setUpClass """ try: pg_createdb(dbname) except subprocess.CalledProcessError: # try recovering once, in case pg_dropdb(dbname) # the db already existed pg_createdb(dbname) if dump: pg_restore(dbname, dump, dumptype) return dbname def db_destroy(dbname): """destroy the test DB context: tearDownClass """ pg_dropdb(dbname) def db_connect(dbname): """connect to the test DB and open a cursor context: setUp """ conn = psycopg2.connect('dbname=' + dbname) return { 'conn': conn, 'cursor': conn.cursor() } def db_close(conn): - """rollback current transaction and disconnet from the test DB + """rollback current transaction and disconnect from the test DB context: tearDown """ if not conn.closed: conn.rollback() conn.close() class DbTestConn: def __init__(self, dbname): self.dbname = dbname def __enter__(self): self.db_setup = db_connect(self.dbname) self.conn = self.db_setup['conn'] self.cursor = self.db_setup['cursor'] return self def __exit__(self, *_): db_close(self.conn) class DbTestContext: def __init__(self, name='softwareheritage-test', dump=None, dump_type='pg_dump'): self.dbname = name self.dump = dump self.dump_type = dump_type def __enter__(self): db_create(dbname=self.dbname, dump=self.dump, dumptype=self.dump_type) return self def __exit__(self, *_): db_destroy(self.dbname) class DbTestFixture: """Mix this in a test subject class to get DB testing support. Use the class method add_db() to add a new database to be tested. Using this will create a DbTestConn entry in the `test_db` dictionary for all the tests, indexed by the name of the database. 
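`db_create`'s recovery path (if `createdb` fails, assume a database was left over from an aborted previous run, drop it, and retry exactly once) is a small pattern worth isolating. In this sketch `createdb`/`dropdb` are in-memory stubs raising `RuntimeError` in place of `subprocess.CalledProcessError`, so it runs without PostgreSQL.

```python
existing = {'softwareheritage-test'}  # simulates a leftover test database


def createdb(name):
    # Stub for the `createdb` CLI: fails if the database already exists.
    if name in existing:
        raise RuntimeError('database "%s" already exists' % name)
    existing.add(name)


def dropdb(name):
    # Stub for the `dropdb` CLI.
    existing.remove(name)


def db_create(name):
    try:
        createdb(name)
    except RuntimeError:
        # Try recovering once, in case the db already existed.
        dropdb(name)
        createdb(name)
    return name


created = db_create('softwareheritage-test')  # drops the leftover, recreates
```

Retrying exactly once keeps the fixture self-healing after a crashed run without masking genuine, repeatable creation failures.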
Example: class TestDb(DbTestFixture, unittest.TestCase): @classmethod def setUpClass(cls): super().setUpClass() cls.add_db('db_name', DUMP) def setUp(self): db = self.test_db['db_name'] print('conn: {}, cursor: {}'.format(db.conn, db.cursor)) To ensure test isolation, each test method of the test case class will execute in its own connection, cursor, and transaction. Note that if you want to define setup/teardown methods, you need to explicitly call super() to ensure that the fixture setup/teardown methods are invoked. Here is an example where all setup/teardown methods are defined in a test case: class TestDb(DbTestFixture, unittest.TestCase): @classmethod def setUpClass(cls): # your add_db() calls here super().setUpClass() # your class setup code here def setUp(self): super().setUp() # your instance setup code here def tearDown(self): # your instance teardown code here super().tearDown() @classmethod def tearDownClass(cls): # your class teardown code here super().tearDownClass() """ _DB_DUMP_LIST = {} _DB_LIST = {} DB_TEST_FIXTURE_IMPORTED = True @classmethod def add_db(cls, name='softwareheritage-test', dump=None, dump_type='pg_dump'): cls._DB_DUMP_LIST[name] = (dump, dump_type) @classmethod def setUpClass(cls): for name, (dump, dump_type) in cls._DB_DUMP_LIST.items(): cls._DB_LIST[name] = DbTestContext(name, dump, dump_type) cls._DB_LIST[name].__enter__() super().setUpClass() @classmethod def tearDownClass(cls): super().tearDownClass() for name, context in cls._DB_LIST.items(): context.__exit__() def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) self.test_db = {} def setUp(self): self.test_db = {} for name in self._DB_LIST.keys(): self.test_db[name] = DbTestConn(name) self.test_db[name].__enter__() super().setUp() def tearDown(self): super().tearDown() for name in self._DB_LIST.keys(): self.test_db[name].__exit__() def reset_db_tables(self, name, excluded=None): db = self.test_db[name] conn = db.conn cursor = db.cursor 
cursor.execute("""SELECT table_name FROM information_schema.tables WHERE table_schema = %s""", ('public',)) tables = set(table for (table,) in cursor.fetchall()) if excluded is not None: tables -= set(excluded) for table in tables: cursor.execute('truncate table %s cascade' % table) conn.commit() class SingleDbTestFixture(DbTestFixture): """Simplified fixture like DbTest but that can only handle a single DB. Gives access to shortcuts like self.cursor and self.conn. DO NOT use this with other fixtures that need to access databases, like StorageTestFixture. The class can override the following class attributes: TEST_DB_NAME: name of the DB used for testing TEST_DB_DUMP: DB dump to be restored before running test methods; can be set to None if no restore from dump is required TEST_DB_DUMP_TYPE: one of 'pg_dump' (binary dump) or 'psql' (SQL dump) The test case class will then have the following attributes, accessible via self: dbname: name of the test database conn: psycopg2 connection object cursor: open psycopg2 cursor to the DB """ TEST_DB_NAME = 'softwareheritage-test' TEST_DB_DUMP = None TEST_DB_DUMP_TYPE = 'pg_dump' @classmethod def setUpClass(cls): cls.dbname = cls.TEST_DB_NAME cls.add_db(name=cls.TEST_DB_NAME, dump=cls.TEST_DB_DUMP, dump_type=cls.TEST_DB_DUMP_TYPE) super().setUpClass() def setUp(self): super().setUp() db = self.test_db[self.TEST_DB_NAME] self.conn = db.conn self.cursor = db.cursor diff --git a/swh/core/tests/server_testing.py b/swh/core/tests/server_testing.py index e801a3e..3187d1f 100644 --- a/swh/core/tests/server_testing.py +++ b/swh/core/tests/server_testing.py @@ -1,146 +1,149 @@ # Copyright (C) 2015-2018 The Software Heritage developers # See the AUTHORS file at the top-level directory of this distribution # License: GNU General Public License version 3, or any later version # See top-level LICENSE file for more information import abc import aiohttp import multiprocessing +import os import socket import time from urllib.request import 
urlopen class ServerTestFixtureBaseClass(metaclass=abc.ABCMeta): """Base class for http client/server testing implementations. Override this class to implement the following methods: - process_config: to do something needed for the server configuration (e.g propagate the configuration to other part) - define_worker_function: define the function that will actually run the server. To ensure test isolation, each test will run in a different server and a different folder. In order to correctly work, the subclass must call the parents class's setUp() and tearDown() methods. """ def setUp(self): super().setUp() self.start_server() def tearDown(self): self.stop_server() super().tearDown() def url(self): return 'http://127.0.0.1:%d/' % self.port def process_config(self): """Process the server's configuration. Do something useful for example, pass along the self.config dictionary inside the self.app. By default, do nothing. """ pass @abc.abstractmethod def define_worker_function(self, app, port): """Define how the actual implementation server will run. """ pass def start_server(self): """ Spawn the API server using multiprocessing. """ self.process = None self.process_config() # Get an available port number sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM) sock.bind(('127.0.0.1', 0)) self.port = sock.getsockname()[1] sock.close() worker_fn = self.define_worker_function() self.process = multiprocessing.Process( target=worker_fn, args=(self.app, self.port) ) self.process.start() # Wait max 5 seconds for server to spawn i = 0 while i < 500: try: urlopen(self.url()) except Exception: i += 1 time.sleep(0.01) else: return def stop_server(self): """ Terminate the API server's process. """ if self.process: self.process.terminate() class ServerTestFixture(ServerTestFixtureBaseClass): """Base class for http client/server testing (e.g flask). Mix this in a test class in order to have access to an http server running in background. 
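The free-port trick `start_server` uses above (bind to port 0, read back the kernel-assigned port, close the probe socket) works on its own, as sketched below. Note the inherent race window: another process could grab the port between the `close()` and the server's own bind, which is acceptable for test fixtures.

```python
import socket


def find_free_port(host='127.0.0.1'):
    """Ask the kernel for an available ephemeral port, as start_server does."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.bind((host, 0))          # port 0 = "pick any free port"
    port = sock.getsockname()[1]  # read back the port the kernel chose
    sock.close()
    return port


port = find_free_port()
```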
Note that the subclass should define a dictionary in self.config that contains the server config. And an application in self.app that corresponds to the type of server the tested client needs. To ensure test isolation, each test will run in a different server and a different folder. In order to correctly work, the subclass must call the parents class's setUp() and tearDown() methods. """ def process_config(self): # WSGI app configuration for key, value in self.config.items(): self.app.config[key] = value def define_worker_function(self): def worker(app, port): + # Make Flask 1.0 stop printing its server banner + os.environ['WERKZEUG_RUN_MAIN'] = 'true' return app.run(port=port, use_reloader=False) return worker class ServerTestFixtureAsync(ServerTestFixtureBaseClass): """Base class for http client/server async testing (e.g aiohttp). Mix this in a test class in order to have access to an http server running in background. Note that the subclass should define an application in self.app that corresponds to the type of server the tested client needs. To ensure test isolation, each test will run in a different server and a different folder. In order to correctly work, the subclass must call the parents class's setUp() and tearDown() methods. 
""" def define_worker_function(self): def worker(app, port): return aiohttp.web.run_app(app, port=int(port), print=lambda *_: None) return worker diff --git a/swh/core/tests/test_api.py b/swh/core/tests/test_api.py new file mode 100644 index 0000000..1bab537 --- /dev/null +++ b/swh/core/tests/test_api.py @@ -0,0 +1,84 @@ +# Copyright (C) 2018 The Software Heritage developers +# See the AUTHORS file at the top-level directory of this distribution +# License: GNU General Public License version 3, or any later version +# See top-level LICENSE file for more information + +import unittest +from nose.tools import istest + +import requests_mock +from werkzeug.wrappers import BaseResponse +from werkzeug.test import Client as WerkzeugTestClient + +from swh.core.api import ( + error_handler, encode_data_server, + remote_api_endpoint, SWHRemoteAPI, SWHServerAPIApp) + + +class ApiTest(unittest.TestCase): + @istest + def test_server(self): + testcase = self + nb_endpoint_calls = 0 + + class TestStorage: + @remote_api_endpoint('test_endpoint_url') + def test_endpoint(self, test_data, db=None, cur=None): + nonlocal nb_endpoint_calls + nb_endpoint_calls += 1 + + testcase.assertEqual(test_data, 'spam') + return 'egg' + + app = SWHServerAPIApp('testapp', + backend_class=TestStorage, + backend_factory=lambda: TestStorage()) + + @app.errorhandler(Exception) + def my_error_handler(exception): + return error_handler(exception, encode_data_server) + + client = WerkzeugTestClient(app, BaseResponse) + res = client.post('/test_endpoint_url', + headers={'Content-Type': 'application/x-msgpack'}, + data=b'\x81\xa9test_data\xa4spam') + + self.assertEqual(nb_endpoint_calls, 1) + self.assertEqual(b''.join(res.response), b'\xa3egg') + + @istest + def test_client(self): + class TestStorage: + @remote_api_endpoint('test_endpoint_url') + def test_endpoint(self, test_data, db=None, cur=None): + pass + + nb_http_calls = 0 + + def callback(request, context): + nonlocal nb_http_calls + nb_http_calls += 1 
+ self.assertEqual(request.headers['Content-Type'], + 'application/x-msgpack') + self.assertEqual(request.body, b'\x81\xa9test_data\xa4spam') + context.headers['Content-Type'] = 'application/x-msgpack' + context.content = b'\xa3egg' + return b'\xa3egg' + + adapter = requests_mock.Adapter() + adapter.register_uri('POST', + 'mock://example.com/test_endpoint_url', + content=callback) + + class Testclient(SWHRemoteAPI): + backend_class = TestStorage + + def __init__(self, *args, **kwargs): + super().__init__(*args, **kwargs) + self.session.mount('mock', adapter) + + c = Testclient('foo', 'mock://example.com/') + res = c.test_endpoint('spam') + + self.assertEqual(nb_http_calls, 1) + self.assertEqual(res, 'egg') diff --git a/tox.ini b/tox.ini new file mode 100644 index 0000000..7a3f221 --- /dev/null +++ b/tox.ini @@ -0,0 +1,23 @@ +[tox] +envlist=check-manifest,flake8,py3 + +[testenv:py3] +deps = + pifpaf + nose +commands = + pifpaf run postgresql -- nosetests + +[testenv:flake8] +skip_install = true +deps = + flake8 +commands = + {envpython} -m flake8 + +[testenv:check-manifest] +skip_install = true +deps = + check-manifest +commands = + {envpython} -m check_manifest {toxinidir} diff --git a/version.txt b/version.txt index 34708e2..b066bae 100644 --- a/version.txt +++ b/version.txt @@ -1 +1 @@ -v0.0.40-0-geec4699 \ No newline at end of file +v0.0.41-0-gdefc4f3 \ No newline at end of file
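The reworked `parse_requirements` in setup.py (one requirement per line, blank lines and `#` comments skipped, a missing file yielding an empty list) can be exercised standalone. Sketch under a temp directory; the file contents mirror the `requirements-test.txt` this diff adds.

```python
import os
import tempfile


def parse_requirements(reqf):
    """Mirror of setup.py's parser: skip blanks/comments, tolerate absence."""
    requirements = []
    if not os.path.exists(reqf):
        return requirements
    with open(reqf) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            requirements.append(line)
    return requirements


with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, 'requirements-test.txt')
    with open(path, 'w') as f:
        f.write('# test deps\n\nnose\nrequests-mock\n')
    reqs = parse_requirements(path)                            # ['nose', 'requests-mock']
    missing = parse_requirements(os.path.join(d, 'nope.txt'))  # []
```

Returning `[]` for an absent file is what lets setup.py unconditionally sum `parse_requirements() + parse_requirements('swh')` without first checking which requirements files exist.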