diff --git a/docs/archive-changelog.rst b/docs/archive-changelog.rst
--- a/docs/archive-changelog.rst
+++ b/docs/archive-changelog.rst
@@ -54,7 +54,7 @@
   `ftp.gnu.org`_, and added it to regular crawling (tracking: `T1722 `_)
 
-* **2019-05-27:** completed a full archival of NPM_ packages andded it as a
+* **2019-05-27:** completed a full archival of NPM_ packages and added it as a
   regularly crawled package repository (tracking: `T1378 `_)
diff --git a/docs/contributing/phabricator.rst b/docs/contributing/phabricator.rst
--- a/docs/contributing/phabricator.rst
+++ b/docs/contributing/phabricator.rst
@@ -246,7 +246,7 @@
 reordering/splitting/merging commits as needed to have separate logical
 commits and an easy to bisect history. Update the diff :ref:`following the
 prior section `
-(It'd be good to let the ci build finish to make sure everything is still green).
+(It'd be good to let the CI build finish to make sure everything is still green).
 
 Once you're happy you can **push to origin/master** directly, e.g.::
diff --git a/docs/glossary.rst b/docs/glossary.rst
--- a/docs/glossary.rst
+++ b/docs/glossary.rst
@@ -80,7 +80,7 @@
     A :ref:`loader ` is a component of the |swh| architecture
     responsible for reading a source code :term:`origin` (typically a git
-    reposiitory) and import or update its content in the :term:`archive` (ie.
+    repository) and import or update its content in the :term:`archive` (ie.
     add new file contents int :term:`object storage` and repository structure
     in the :term:`storage database`).
diff --git a/docs/journal.rst b/docs/journal.rst
--- a/docs/journal.rst
+++ b/docs/journal.rst
@@ -3,7 +3,7 @@
 Software Heritage Journal --- Specifications
 ============================================
 
-The |swh| journal is a kafka_-based stream of events for every added object in
+The |swh| journal is a Kafka_-based stream of events for every added object in
 the |swh| Archive and some of its related services, especially indexers.
 
 Each topic_ will stream added elements for a given object type according to the
@@ -72,7 +72,7 @@
 
 
-Topics for Merkel-DAG objects
+Topics for Merkle-DAG objects
 -----------------------------
 
 These topics are for the various objects stored in the |swh| Merkle DAG, see
@@ -551,7 +551,7 @@
 Kafka message format
 --------------------
 
-Each value of a kafka message in a topic is a dictionary-like structure
+Each value of a Kafka message in a topic is a dictionary-like structure
 encoded as a msgpack_ byte string.
 
 Keys are ASCII strings.
@@ -586,7 +586,7 @@
 Datetime
 ++++++++
 
-There are 2 type of date that can be encoded in a kafka message:
+There are 2 types of dates that can be encoded in a Kafka message:
 
 - dates for git-like objects (:py:class:`swh.model.model.Revision` and
   :py:class:`swh.model.model.Release`): these dates are part of the hash
@@ -661,12 +661,12 @@
 where the `` is computed from original values as a sha256 of the
-orignal's `fullname`.
+original's `fullname`.
 
-.. _kafka: https://kafka.apache.org
+.. _Kafka: https://kafka.apache.org
 .. _topic: https://kafka.apache.org/documentation/#intro_concepts_and_terms
 .. _msgpack: https://msgpack.org/
 .. _`extended type`: https://github.com/msgpack/msgpack/blob/master/spec.md#extension-types
diff --git a/docs/mirror.rst b/docs/mirror.rst
--- a/docs/mirror.rst
+++ b/docs/mirror.rst
@@ -69,7 +69,7 @@
 In order to set up a mirror of the graph, one needs to deploy a stack capable
 of retrieving all these topics and store their content reliably. For example a
-kafka cluster configured as a replica of the main kafka broker hosted by |swh|
+Kafka cluster configured as a replica of the main Kafka broker hosted by |swh|
 would do the job (albeit not in a very useful manner by itself).
 
 A more useful mirror can be set up using the :ref:`storage `
@@ -128,5 +128,5 @@
 repository for details.
 
-.. _kafka: https://kafka.apache.org/
+.. _Kafka: https://kafka.apache.org/
 .. _msgpack: https://msgpack.org
diff --git a/docs/tutorials/issue-debugging-monitoring.md b/docs/tutorials/issue-debugging-monitoring.md
--- a/docs/tutorials/issue-debugging-monitoring.md
+++ b/docs/tutorials/issue-debugging-monitoring.md
@@ -98,7 +98,7 @@
 Official documentation:
 
-Kibana is a vizualization UI for searching through indexed logs. You can search through
+Kibana is a visualization UI for searching through indexed logs. You can search through
 different sources of logs in the "Discover" pane. The sources configured include
 application logs for SWH services and system logs. You can also access dashboards
 shared by other on a particular topic or create our own from a saved search.
@@ -136,8 +136,8 @@
 Now you may want to have a customizable view of these logs, along with graphical
 presentations. In the "Dashboard" pane, create a new dashboard. Click "add" in the top
-toolbar and select your saved search. It will appear in resizeable panel. Now doing a
-search will restrict the search to the dataset cinfigured for the panels.
+toolbar and select your saved search. It will appear in a resizable panel. Now doing a
+search will restrict the search to the dataset configured for the panels.
 
-To create more complete vizualizations including graphs, refer to:
+To create more complete visualizations including graphs, refer to:
diff --git a/docs/tutorials/testing.rst b/docs/tutorials/testing.rst
--- a/docs/tutorials/testing.rst
+++ b/docs/tutorials/testing.rst
@@ -66,7 +66,7 @@
 General considerations
 ^^^^^^^^^^^^^^^^^^^^^^
 
-We mostly do functional tests, and unit-testing when more ganularity is needed. By this,
+We mostly do functional tests, and unit-testing when more granularity is needed. By this,
 we mean that we test each functionality and invariants of a component, without isolating
 it from its dependencies systematically. The goal is to strike a balance between test
 effectiveness and test maintenance. However, the most critical parts, like the storage
@@ -80,7 +80,7 @@
 * One test may check multiples properties or commonly combined functionalities, if it
   can fit in a short descriptive name.
 * Organize tests in multiple modules, one for each aspect or subcomponent tested.
-  e.g.: initialization/configuration, db/backend, service API, utils, cli, etc.
+  e.g.: initialization/configuration, db/backend, service API, utils, CLI, etc.
 
 Test data
 ^^^^^^^^^