diff --git a/docs/images/network.png b/docs/images/network.png deleted file mode 100644 index 5018264..0000000 Binary files a/docs/images/network.png and /dev/null differ diff --git a/docs/infrastructure/index.rst b/docs/infrastructure/index.rst deleted file mode 100644 index 300e5c0..0000000 --- a/docs/infrastructure/index.rst +++ /dev/null @@ -1,15 +0,0 @@ -.. _infrastructure: - -Infrastructure -############## - -.. keep this in sync with the 'infrastructure' section in swh-docs/docs/index.rst - -This section regroups the knowledge base and procedures relative to the |swh| infrastructure management. - -.. toctree:: - :maxdepth: 2 - :titlesonly: - - service-urls - network diff --git a/docs/infrastructure/network.rst b/docs/infrastructure/network.rst deleted file mode 100644 index 3ee8d5b..0000000 --- a/docs/infrastructure/network.rst +++ /dev/null @@ -1,174 +0,0 @@ -.. _network_configuration: - -Network documentation -##################### - -.. keep this in sync with the 'infrastructure' section in swh-docs/docs/index.rst - -This section regroups the knowledge base for our network components. - - -.. toctree:: - :maxdepth: 2 - :titlesonly: - - -Network architecture -******************** - -The network is split in several VLANs provided by the INRIA network team: - -.. thumbnail:: ../images/network.png - - -Firewalls -========= - -The firewalls are 2 `OPNsense `_ VMs deployed on the PROXMOX cluster with an `High Availability `_ configuration. - -They are sharing a virtual IP on each VLAN to act as the gateway. Only one of the 2 firewalls is owning all the GW ips at the same time. The owner is called the ``PRIMARY`` - -.. 
list-table:: - :header-rows: 1 - - * - Nominal Role - - name (link to the inventory) - - login page - * - PRIMARY - - `pushkin `_ - - `https://pushkin.internal.softwareheritage.org `_ - * - BACKUP - - `glyptotek `_ - - `https://glyptotek.internal.softwareheritage.org `_ - - -Access to the gui of the secondary firewall ----------------------------------------------- - -The secondary firewall is not directly reachable for VPN user. -As the OpenVPN service is also running when the firewall is a backup, the packets -coming from tne VPN are routed to the local VPN on the secondary and lost. - -To access to GUI, a tunnel can be used: - - ssh -L 8443:pushkin.internal.softwareheritage.org:443 pergamon.internal.softwareheritage.org - -Once the tunnel is created, the gui is accessible at https://localhost:8443 in any browser - -Configuration backup --------------------- - -The configuration is automatically committed on a `git repository `_. -Each firewall regularly pushes its configuration on a dedicated branch of the repository. - -The configuration is visible on the `System / Configuration / Backups `_ page -of each one. - -Upgrade procedure ------------------ - -Initial status -^^^^^^^^^^^^^^ - -This is the nominal status of the firewalls: - -.. list-table:: - :header-rows: 1 - - * - Firewall - - Status - * - pushkin - - PRIMARY - * - glyptotek - - BACKUP - -Preparation -^^^^^^^^^^^ - -* Connect to the `principal `_ (pushkin here) -* Check the `CARP status `_ to ensure the firewall is the principal (must have the status MASTER for all the IPS) -* Connect to the `backup `_ (glytotek here) -* Check the `CARP status `__ to ensure the firewall is the backup (must have the status BACKUP for all the IPS) -* Ensure the 2 firewalls are in sync: - - * On the principal, go to the `High availability status `_ and force a synchronization - * click on the button on the right of ``Synchronize config to backup`` - -.. 
image:: ../images/infrastructure/network/sync.png - -* Switch the principal/backup to prepare the upgrade of the master - (The switch is transparent from the user perspective and can be done without service interruption) - - * [1] On the principal, go to the `Virtual IPS status `_ page - * Activate the CARP maintenance mode - - .. image:: ../images/infrastructure/network/carp_maintenance.png - - * check the status of the VIPs, they must be ``BACKUP`` on pushkin and ``PRIMARY`` on glyptotek - - -* wait a few minutes to let the monitoring detect if there are connection issues, check ssh connection on several servers on different VLANs (staging, admin, ...) - -If everything is ok, proceed to the next section. - -Upgrade the first firewall -^^^^^^^^^^^^^^^^^^^^^^^^^^ - -Before starting this section, the firewall statuses should be: - -.. list-table:: - :header-rows: 1 - - * - Firewall - - Status - * - pushkin - - BACKUP - * - glyptotek - - PRIMARY - -If not, be sure of what you are doing and adapt the links accordingly - -* [2] go to the `System Firmware: status `_ page (pushkin here) -* Click on the ``Check for upgrades`` button - -.. image:: ../images/infrastructure/network/check_for_upgrade.png - -* follow the interface indication, one or several reboots can be necessary depending to the number of upgrade to apply - -.. image:: ../images/infrastructure/network/proceed_update.png - -* repeat from the ``Check for upgrades`` operation until there is no upgrades to apply -* Switch the principal/backup to restore ``pushkin`` as the principal: - - * on the current backup (pushkin here) go to `Virtual IPS status `_ - * [3] click on `Leave Persistent CARP Maintenance Mode` - - .. 
image:: ../images/infrastructure/network/reactivate_carp.png - - * refresh the page, the role should have changed from ``BACKUP`` to ``MASTER`` - * check on the other firewall, if the roles is indeed ``BACKUP`` for all the IPs - -* Wait few moment to ensure everything is ok with the new version - -Upgrade the second firewall -^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -Before starting this section, the firewall statuses should be: - -.. list-table:: - :header-rows: 1 - - * - Firewall - - Status - * - pushkin - - PRIMARY - * - glyptotek - - BACKUP - -If not, be sure of what you are doing and adapt the links accordingly - -* Proceed to the second firewall upgrade - - * perform [1] on the backup (should be ``glyptotek`` here) - * perform [2] on the backup (should be ``glyptotek`` here) - * perform [3] on the backup (should be ``glyptotek`` here) diff --git a/swh/docs/sphinx/conf.py b/swh/docs/sphinx/conf.py index b2304e8..ad25f75 100755 --- a/swh/docs/sphinx/conf.py +++ b/swh/docs/sphinx/conf.py @@ -1,283 +1,286 @@ #!/usr/bin/env python3 # -*- coding: utf-8 -*- # import logging import os from typing import Dict from sphinx.ext import autodoc from swh.docs.django_settings import force_django_settings # General information about the project. project = "Software Heritage - Development Documentation" copyright = "2015-2021 The Software Heritage developers" author = "The Software Heritage developers" # -- General configuration ------------------------------------------------ # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. 
extensions = [ "sphinx.ext.autodoc", "sphinx.ext.napoleon", "sphinx.ext.intersphinx", "sphinxcontrib.httpdomain", "sphinx.ext.extlinks", "sphinxcontrib.images", "sphinxcontrib.programoutput", "sphinx.ext.viewcode", "sphinx_tabs.tabs", "sphinx_rtd_theme", "sphinx.ext.graphviz", "sphinx_click.ext", "myst_parser", "sphinx.ext.todo", "sphinx_reredirects", "swh.docs.sphinx.view_in_phabricator", # swh.scheduler inherits some attribute descriptions from celery that use # custom crossrefs (eg. :setting:`task_ignore_result`) "sphinx_celery.setting_crossref", ] # Add any paths that contain templates here, relative to this directory. templates_path = ["_templates"] # The suffix(es) of source filenames. # You can specify multiple suffix as a list of string: # source_suffix = ".rst" # The master toctree document. master_doc = "index" # A string of reStructuredText that will be included at the beginning of every # source file that is read. # A bit hackish but should work both for each swh package and the whole swh-doc rst_prolog = """ .. include:: /../../swh-docs/docs/swh_substitutions """ # The version info for the project you're documenting, acts as replacement for # |version| and |release|, also used in various other places throughout the # built documents. # # The short X.Y version. version = "" # The full version, including alpha/beta/rc tags. release = "" # The language for content autogenerated by Sphinx. Refer to documentation # for a list of supported languages. # # This is also used if you do content translation via gettext catalogs. # Usually you set "language" from the command line for these cases. language = "en" # List of patterns, relative to source directory, that match files and # directories to ignore when looking for source files. 
# This patterns also effect to html_static_path and html_extra_path exclude_patterns = [ "_build", "swh-icinga-plugins/index.rst", "swh-perfecthash/index.rst", "swh-perfecthash/README.rst", "swh.loader.cvs.rcsparse.setup.rst", "apidoc/swh.loader.cvs.rcsparse.setup.rst", ] # The name of the Pygments (syntax highlighting) style to use. pygments_style = "sphinx" # If true, `todo` and `todoList` produce output, else they produce nothing. todo_include_todos = True # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. # html_theme = "sphinx_rtd_theme" html_favicon = "_static/favicon.ico" # Theme options are theme-specific and customize the look and feel of a theme # further. For a list of options available for each theme, see the # documentation. # html_theme_options = { "collapse_navigation": True, "sticky_navigation": True, } html_logo = "_static/software-heritage-logo-title-motto-vertical-white.png" # Add any paths that contain custom static files (such as style sheets) here, # relative to this directory. They are copied after the builtin static files, # so a file named "default.css" will overwrite the builtin "default.css". html_static_path = ["_static"] # make logo actually appear, avoiding gotcha due to alabaster default conf. # https://github.com/bitprophet/alabaster/issues/97#issuecomment-303722935 html_sidebars = { "**": [ "about.html", "globaltoc.html", "relations.html", "sourcelink.html", "searchbox.html", ] } # If not None, a 'Last updated on:' timestamp is inserted at every page # bottom, using the given strftime format. # The empty string is equivalent to '%b %d, %Y'. html_last_updated_fmt = "%Y-%m-%d %H:%M:%S %Z" # refer to the Python standard library. 
intersphinx_mapping = { "python": ("https://docs.python.org/3", None), "swh-devel": ("https://docs.softwareheritage.org/devel", None), "swh-sysadm": ("https://docs.softwareheritage.org/sysadm", None), } # Redirects for pages that were moved, so we don't break external links. # Uses sphinx-reredirects; keys are source docnames (no .html suffix) redirects = { "swh-deposit/spec-api": "api/api-documentation.html", "swh-deposit/metadata": "api/metadata.html", "swh-deposit/specs/blueprint": "../api/use-cases.html", "swh-deposit/user-manual": "api/user-manual.html", + "infrastructure/index": "sysadm/network-architecture/index.html", + "infrastructure/network": "sysadm/network-architecture/index.html", + "infrastructure/service-urls": "sysadm/network-architecture/service-urls.html", "architecture": "architecture/overview.html", "mirror": "architecture/mirror.html", "users": "user", } # -- autodoc configuration ---------------------------------------------- autodoc_default_flags = [ "members", "undoc-members", "private-members", "special-members", ] autodoc_member_order = "bysource" autodoc_mock_imports = [ "rados", ] autoclass_content = "both" modindex_common_prefix = ["swh."] # For the todo extension. Todo and todolist produce output only if this is True todo_include_todos = True _swh_web_base_url = "https://archive.softwareheritage.org" # for the extlinks extension, sub-projects should fill that dict extlinks: Dict = { "swh_web": (f"{_swh_web_base_url}/%s", None), "swh_web_api": (f"{_swh_web_base_url}/api/1/%s", None), "swh_web_browse": (f"{_swh_web_base_url}/browse/%s", None), } # SWH_PACKAGE_DOC_TOX_BUILD environment variable is set in a tox environment # named sphinx for each swh package (except the swh-docs package itself). 
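As a side note on the `redirects` mapping above: sphinx-reredirects emits a meta-refresh stub at each source docname's output location, and the browser resolves the target relative to that location. The helper below, `resolve_redirect`, is purely illustrative (it is not part of this configuration) and sketches that resolution under those assumptions:

```python
import posixpath


def resolve_redirect(source_docname: str, target: str) -> str:
    """Illustrative only: compute the final page a redirect entry lands on.

    The target is resolved relative to the HTML file generated for the
    source docname, mirroring what a browser does with the meta refresh.
    """
    base = posixpath.dirname(source_docname + ".html")
    return posixpath.normpath(posixpath.join(base, target))


# "../api/use-cases.html" relative to "swh-deposit/specs/blueprint.html":
print(resolve_redirect("swh-deposit/specs/blueprint", "../api/use-cases.html"))
# -> swh-deposit/api/use-cases.html
```

This is why targets of nested pages (like the deposit specs) need `../` components while top-level pages can use plain relative paths.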
swh_package_doc_tox_build = os.environ.get("SWH_PACKAGE_DOC_TOX_BUILD", False) # override some configuration when building a swh package # documentation with tox to remove warnings and suppress # those related to unresolved references if swh_package_doc_tox_build: swh_substitutions = os.path.join( os.path.dirname(__file__), "../../../docs/swh_substitutions" ) rst_prolog = f".. include:: /{swh_substitutions}" suppress_warnings = ["ref.ref"] html_favicon = "" html_logo = "" class SimpleDocumenter(autodoc.FunctionDocumenter): """ Custom autodoc directive to inline the docstring of a function in a document without the signature header and with no indentation. Example of use:: .. autosimple:: swh.web.api.views.directory.api_directory """ objtype = "simple" # ensure the priority is lesser than the base FunctionDocumenter # to avoid side effects with autodoc processing priority = -1 # do not indent the content content_indent = "" # do not add a header to the docstring def add_directive_header(self, sig): pass # sphinx event handler to set adequate django settings prior reading # apidoc generated rst files when building doc to avoid autodoc errors def set_django_settings(app, env, docname): if any([pattern in app.srcdir for pattern in ("swh-web-client", "DWCLI")]): # swh-web-client is detected as swh-web by the code below but # django is not installed when building standalone swh-web-client doc return package_settings = { "auth": "swh.auth.tests.django.app.apptest.settings", "deposit": "swh.deposit.settings.development", "web": "swh.web.settings.development", } for package, settings in package_settings.items(): if any( [pattern in docname for pattern in (f"swh.{package}", f"swh-{package}")] ): force_django_settings(settings) # when building local package documentation with tox, insert glossary # content at the end of the index file in order to resolve references # to the terms it contains def add_glossary_to_index(app, docname, source): if docname == "index": glossary_path 
= os.path.join( os.path.dirname(__file__), "../../../docs/glossary.rst" ) with open(glossary_path, "r") as glossary: source[0] += "\n" + glossary.read() def setup(app): # env-purge-doc event is fired before source-read app.connect("env-purge-doc", set_django_settings) # add autosimple directive (used in swh-web) app.add_autodocumenter(SimpleDocumenter) # set an environment variable indicating we are currently building # the documentation os.environ["SWH_DOC_BUILD"] = "1" logger = logging.getLogger("sphinx") if swh_package_doc_tox_build: # ensure glossary will be available in package doc scope app.connect("source-read", add_glossary_to_index) # suppress some httpdomain warnings in non web packages if not any([pattern in app.srcdir for pattern in ("swh-web", "DWAPPS")]): # filter out httpdomain unresolved reference warnings # to not consider them as errors when using -W option of sphinx-build class HttpDomainRefWarningFilter(logging.Filter): def filter(self, record: logging.LogRecord) -> bool: return not record.msg.startswith("Cannot resolve reference to") # insert a custom filter in the warning log handler of sphinx logger.handlers[1].filters.insert(0, HttpDomainRefWarningFilter()) diff --git a/docs/images/infrastructure/network/carp_maintenance.png b/sysadm/images/infrastructure/network/carp_maintenance.png similarity index 100% rename from docs/images/infrastructure/network/carp_maintenance.png rename to sysadm/images/infrastructure/network/carp_maintenance.png diff --git a/docs/images/infrastructure/network/check_for_upgrade.png b/sysadm/images/infrastructure/network/check_for_upgrade.png similarity index 100% rename from docs/images/infrastructure/network/check_for_upgrade.png rename to sysadm/images/infrastructure/network/check_for_upgrade.png diff --git a/docs/images/infrastructure/network/proceed_update.png b/sysadm/images/infrastructure/network/proceed_update.png similarity index 100% rename from docs/images/infrastructure/network/proceed_update.png rename 
to sysadm/images/infrastructure/network/proceed_update.png diff --git a/docs/images/infrastructure/network/reactivate_carp.png b/sysadm/images/infrastructure/network/reactivate_carp.png similarity index 100% rename from docs/images/infrastructure/network/reactivate_carp.png rename to sysadm/images/infrastructure/network/reactivate_carp.png diff --git a/docs/images/infrastructure/network/sync.png b/sysadm/images/infrastructure/network/sync.png similarity index 100% rename from docs/images/infrastructure/network/sync.png rename to sysadm/images/infrastructure/network/sync.png diff --git a/sysadm/network-architecture/how-to-access-firewall-settings.rst b/sysadm/network-architecture/how-to-access-firewall-settings.rst index 47c4371..8e4927d 100644 --- a/sysadm/network-architecture/how-to-access-firewall-settings.rst +++ b/sysadm/network-architecture/how-to-access-firewall-settings.rst @@ -1,7 +1,54 @@ .. _firewall_settings: How to access firewall settings =============================== -.. todo:: - This page is a work in progress. For now, please refer to the existing documentation :ref:`swh-devel:network_configuration`. \ No newline at end of file +.. admonition:: Intended audience + :class: important + + sysadm staff members + +The firewalls are two `OPNsense `_ VMs deployed on the Proxmox +cluster with a `High Availability +`_ +configuration. + +They share a virtual IP on each VLAN to act as the gateway. Only one of the two +firewalls owns all the gateway IPs at a given time; this owner is called the ``PRIMARY``. + +.. list-table:: + :header-rows: 1 + + * - Nominal Role + - name (link to the inventory) + - login page + * - PRIMARY + - `pushkin `_ + - `https://pushkin.internal.softwareheritage.org `_ + * - BACKUP + - `glyptotek `_ + - `https://glyptotek.internal.softwareheritage.org `_ + +Access to the GUI of the secondary firewall +------------------------------------------- + +The secondary firewall is not directly reachable for VPN users. 
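The SSH local port-forward described just below can also be assembled programmatically. This is a minimal illustrative sketch: the `build_tunnel_command` helper is hypothetical, and the hostnames and ports simply mirror the `ssh -L` invocation given in this section.

```python
def build_tunnel_command(local_port: int, target_host: str, target_port: int,
                         jump_host: str) -> list:
    """Hypothetical helper: build the OpenSSH local port-forward invocation
    that exposes the firewall GUI (HTTPS) on localhost via a jump host."""
    return [
        "ssh",
        "-L", f"{local_port}:{target_host}:{target_port}",
        jump_host,
    ]


cmd = build_tunnel_command(
    8443, "pushkin.internal.softwareheritage.org", 443,
    "pergamon.internal.softwareheritage.org",
)
print(" ".join(cmd))
```

Running the resulting command keeps the tunnel open for as long as the SSH session lasts.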
As the OpenVPN service is +also running when the firewall is a backup, the packets coming from the VPN are routed +to the local VPN on the secondary and lost. + +To access the GUI, an SSH tunnel can be used: + + ssh -L 8443:pushkin.internal.softwareheritage.org:443 pergamon.internal.softwareheritage.org + +Once the tunnel is created, the GUI is accessible at https://localhost:8443 in any +browser. + +Configuration backup +-------------------- + +The configuration is automatically committed to a `git repository +`_. Each firewall +regularly pushes its configuration to a dedicated branch of the repository. + +The configuration is visible on the `System / Configuration / Backups +`_ page of each firewall. diff --git a/sysadm/network-architecture/how-to-upgrade-firewall-os.rst b/sysadm/network-architecture/how-to-upgrade-firewall-os.rst index 0a56e34..ff243d3 100644 --- a/sysadm/network-architecture/how-to-upgrade-firewall-os.rst +++ b/sysadm/network-architecture/how-to-upgrade-firewall-os.rst @@ -1,7 +1,129 @@ .. _upgrade_firewall_os: How to upgrade firewall OS ========================== -.. todo:: - This page is a work in progress. For now, please refer to the existing documentation :ref:`swh-devel:network_configuration`. \ No newline at end of file +.. admonition:: Intended audience + :class: important + + sysadm staff members. + +Initial status +^^^^^^^^^^^^^^ + +This is the nominal status of the firewalls: + +.. 
list-table:: + :header-rows: 1 + + * - Firewall + - Status + * - pushkin + - PRIMARY + * - glyptotek + - BACKUP + +Preparation +^^^^^^^^^^^ + +* Connect to the `principal `_ (pushkin + here) +* Check the `CARP status + `_ to ensure the + firewall is the principal (must have the status MASTER for all the IPs) +* Connect to the `backup `_ (glyptotek + here) +* Check the `CARP status + `__ to ensure the + firewall is the backup (must have the status BACKUP for all the IPs) +* Ensure the two firewalls are in sync: + + * On the principal, go to the `High availability status + `_ and force a + synchronization + * Click on the button to the right of ``Synchronize config to backup`` + +.. image:: ../images/infrastructure/network/sync.png + +* Switch the principal/backup to prepare the upgrade of the master (the switch is + transparent from the user perspective and can be done without service interruption) + + * [1] On the principal, go to the `Virtual IPs status + `_ page + * Activate the CARP maintenance mode + + .. image:: ../images/infrastructure/network/carp_maintenance.png + + * Check the status of the VIPs: they must be ``BACKUP`` on pushkin and ``PRIMARY`` on glyptotek + + +* Wait a few minutes to let the monitoring detect any connection issues; check + SSH connections to several servers on different VLANs (staging, admin, ...) + +If everything is OK, proceed to the next section. + +Upgrade the first firewall +^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Before starting this section, the firewall statuses should be: + +.. list-table:: + :header-rows: 1 + + * - Firewall + - Status + * - pushkin + - BACKUP + * - glyptotek + - PRIMARY + +If not, be sure of what you are doing and adapt the links accordingly. + +* [2] Go to the `System Firmware: status + `_ page + (pushkin here) +* Click on the ``Check for upgrades`` button + +.. 
image:: ../images/infrastructure/network/check_for_upgrade.png + +* Follow the interface indications; one or several reboots can be necessary depending on + the number of upgrades to apply + +.. image:: ../images/infrastructure/network/proceed_update.png + +* Repeat from the ``Check for upgrades`` operation until there are no upgrades left to apply +* Switch the principal/backup to restore ``pushkin`` as the principal: + + * On the current backup (pushkin here), go to `Virtual IPs status + `_ + * [3] Click on ``Leave Persistent CARP Maintenance Mode`` + + .. image:: ../images/infrastructure/network/reactivate_carp.png + + * Refresh the page; the role should have changed from ``BACKUP`` to ``MASTER`` + * Check on the other firewall that the role is indeed ``BACKUP`` for all the IPs + +* Wait a few moments to ensure everything is OK with the new version + +Upgrade the second firewall +^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +Before starting this section, the firewall statuses should be: + +.. list-table:: + :header-rows: 1 + + * - Firewall + - Status + * - pushkin + - PRIMARY + * - glyptotek + - BACKUP + +If not, be sure of what you are doing and adapt the links accordingly. + +* Proceed to the second firewall upgrade: + + * Perform [1] on the backup (should be ``glyptotek`` here) + * Perform [2] on the backup (should be ``glyptotek`` here) + * Perform [3] on the backup (should be ``glyptotek`` here) diff --git a/sysadm/network-architecture/index.rst b/sysadm/network-architecture/index.rst index 733f8f1..411b92f 100644 --- a/sysadm/network-architecture/index.rst +++ b/sysadm/network-architecture/index.rst @@ -1,10 +1,11 @@ Network architecture ==================== .. 
toctree:: :titlesonly: reference-network-configuration how-to-access-firewall-settings how-to-upgrade-firewall-os + service-urls diff --git a/sysadm/network-architecture/reference-network-configuration.rst b/sysadm/network-architecture/reference-network-configuration.rst index cb555f7..5477bf6 100644 --- a/sysadm/network-architecture/reference-network-configuration.rst +++ b/sysadm/network-architecture/reference-network-configuration.rst @@ -1,63 +1,74 @@ .. _network_configuration: Reference: Network configuration ================================ +.. admonition:: Intended audience + :class: important + + sysadm staff members. + The network is split in several VLANs provided by the INRIA network team: .. thumbnail:: ../images/network.svg VLANs ----- All inter vlan communications are filtered by our firewalls `pushkin` and `glyptotek`. .. todo:: - Check the for more information. + Check the :ref:`firewall settings ` page for more information. VLAN1300 - Public network ~~~~~~~~~~~~~~~~~~~~~~~~~ -The detail of this range is available in this `VLAN1300 inventory page `_ +The detail of this range is available in this `VLAN1300 inventory page +`_ -All the inbound traffic is firewalled by the INRIA gateway. The detail of the opened ports is -visible on the private archive in the file :file:`sysadm/Software_Heritage_VLAN1300_plan.ods` +All the inbound traffic is firewalled by the INRIA gateway. The detail of the opened +ports is visible on the private archive in the file +:file:`sysadm/Software_Heritage_VLAN1300_plan.ods` Some nodes are directly exposed on this network for special needs: * moma: the main archive entry point * production workers: to have different visible ips during forge crawling -* pergamon: act as a reverse proxy for some public sites (debian repository, annex, sentry, ...) +* pergamon: act as a reverse proxy for some public sites (debian repository, annex, + sentry, ...) 
* forge: needs some special rules VLAN440 - Production network ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ All the nodes dedicated to the main archive are deployed in this network. -The detail of this range is available in this `VLAN440 inventory page `_ +The detail of this range is available in this `VLAN440 inventory page +`_ -For historical reasons, some admin nodes are deployed in this range (monitoring, ci, ...) -and will be progressively moved into the admin network. +For historical reasons, some admin nodes are deployed in this range (monitoring, ci, +...) and will be progressively moved into the admin network. The internal domain associted to this vlan is ``.internal.staging.swh.network`` VLAN443 - Staging network ~~~~~~~~~~~~~~~~~~~~~~~~~ -All the nodes dedicated to the staging version of the archive are deployed on this network. -POCs and temporary nodes can also take place in the range. +All the nodes dedicated to the staging version of the archive are deployed on this +network. POCs and temporary nodes can also take place in the range. -The detail of this range is visible in this `VLAN443 inventory page `_ +The detail of this range is visible in this `VLAN443 inventory page +`_ The internal domain associted to this vlan is ``.internal.staging.swh.network`` VLAN442 - Admin network ~~~~~~~~~~~~~~~~~~~~~~~ This network is dedicated for admin and support nodes. -The detail of this range is visible in this `VLAN442 inventory page `_. +The detail of this range is visible in this `VLAN442 inventory page +`_. 
-The internal domain associted to this vlan is ``.internal.admin.swh.network`` +The internal domain associated to this vlan is ``.internal.admin.swh.network`` diff --git a/docs/infrastructure/service-urls.rst b/sysadm/network-architecture/service-urls.rst similarity index 99% rename from docs/infrastructure/service-urls.rst rename to sysadm/network-architecture/service-urls.rst index d925ae8..5136b55 100644 --- a/docs/infrastructure/service-urls.rst +++ b/sysadm/network-architecture/service-urls.rst @@ -1,191 +1,196 @@ +.. _service-url: + Service urls -##################### +============ +.. admonition:: Intended audience + :class: important -This section regroups the urls of the services + Staff members +This section regroups the urls of the services. .. toctree:: :maxdepth: 2 :titlesonly: Staging ------- Try to use the staging environment as far as possible for your tests Public urls ~~~~~~~~~~~ +---------------------------------------+-------------------------------------------+ | Service | URL | +=======================================+===========================================+ | swh-web | https://webapp.staging.swh.network | +---------------------------------------+-------------------------------------------+ | swh-deposit | https://deposit.staging.swh.network | +---------------------------------------+-------------------------------------------+ | swh-objstorage read-only (for mirror) | https://objstorage.staging.swh.network | +---------------------------------------+-------------------------------------------+ | Journal TLS | broker1.journal.staging.swh.network:9093 | +---------------------------------------+-------------------------------------------+ Internal services ~~~~~~~~~~~~~~~~~ +--------------------------+------------------------------------------------------+--------+------------+ | Service | URL | VPN[1] | Private[2] | +==========================+======================================================+========+============+ | swh-storage | 
http://storage1.internal.staging.swh.network:5002 | | X | +--------------------------+------------------------------------------------------+--------+------------+ | swh-storage read-only | http://webapp.internal.staging.swh.network:5002 | X | | +--------------------------+------------------------------------------------------+--------+------------+ | swh-objstorage | http://storage1.internal.staging.swh.network:5003 | | X | +--------------------------+------------------------------------------------------+--------+------------+ | swh-objstorage read-only | http://objstorage0.internal.staging.swh.network:5003 | X | | +--------------------------+------------------------------------------------------+--------+------------+ | swh-scheduler | http://scheduler0.internal.staging.swh.network:5008 | X | | +--------------------------+------------------------------------------------------+--------+------------+ | swh-counters | http://counters0.internal.staging.swh.network:5011 | X | | +--------------------------+------------------------------------------------------+--------+------------+ | swh-search | http://webapp.internal.staging.swh.network:5010 | X | | +--------------------------+------------------------------------------------------+--------+------------+ | swh-search | http://search0.internal.staging.swh.network:5010 | | X | +--------------------------+------------------------------------------------------+--------+------------+ | swh-vault | http://vault.internal.staging.swh.network:5005 | | X | +--------------------------+------------------------------------------------------+--------+------------+ | Journal plaintext | journal1.internal.staging.swh.network:9092 | | X | +--------------------------+------------------------------------------------------+--------+------------+ | Journal internal TLS | journal1.internal.staging.swh.network:9094 | | X | +--------------------------+------------------------------------------------------+--------+------------+ SWH 
backends ~~~~~~~~~~~~ +--------------------+---------------------------------------------------------+--------+------------+ | Backend | URL | VPN[1] | Private[2] | +====================+=========================================================+========+============+ | RabbitMq GUI | http://scheduler0.internal.staging.swh.network:15672 | X | | +--------------------+---------------------------------------------------------+--------+------------+ | archive database | db1.internal.staging.swh.network:5432/swh | X | | +--------------------+---------------------------------------------------------+--------+------------+ | webapp database | db1.internal.staging.swh.network:5432/swh-web | X | | +--------------------+---------------------------------------------------------+--------+------------+ | deposit database | db1.internal.staging.swh.network:5432/swh-deposit | X | | +--------------------+---------------------------------------------------------+--------+------------+ | vault database | db1.internal.staging.swh.network:5432/swh-vault | X | | +--------------------+---------------------------------------------------------+--------+------------+ | scheduler database | db1.internal.staging.swh.network:5432/swh-scheduler | X | | +--------------------+---------------------------------------------------------+--------+------------+ | lister database | db1.internal.staging.swh.network:5432/swh-lister | X | | +--------------------+---------------------------------------------------------+--------+------------+ | swh-search ES | http://search-esnode0.internal.staging.swh.network:9200 | | X | +--------------------+---------------------------------------------------------+--------+------------+ | Counters redis | counters0.internal.staging.swh.network:6379 | | X | +--------------------+---------------------------------------------------------+--------+------------+ Production ---------- .. 
.. _public-urls-1:

Public urls
~~~~~~~~~~~

+---------------------------------------+-----------------------------------------------+
| Service                               | URL                                           |
+=======================================+===============================================+
| swh-web                               | https://archive.softwareheritage.org          |
+---------------------------------------+-----------------------------------------------+
| swh-deposit                           | https://deposit.softwareheritage.org          |
+---------------------------------------+-----------------------------------------------+
| swh-objstorage read-only (for mirror) | N/A                                           |
+---------------------------------------+-----------------------------------------------+
| Journal TLS                           | broker[1-4].journal.softwareheritage.org:9093 |
+---------------------------------------+-----------------------------------------------+

.. _internal-services-1:

Internal services
~~~~~~~~~~~~~~~~~

+--------------------------+----------------------------------------------------------------+--------+------------+
| Service                  | URL                                                            | VPN[1] | Private[2] |
+==========================+================================================================+========+============+
| swh-web test/validation  | https://webapp1.internal.softwareheritage.org                  | X      |            |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-storage              | http://saam.internal.softwareheritage.org:5002                 |        | X          |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-storage read-only    | http://webapp1.internal.softwareheritage.org:5002              | X      |            |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-storage read-only    | http://moma.internal.softwareheritage.org:5002                 | X      |            |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-objstorage           | http://saam.internal.softwareheritage.org:5003                 |        | X          |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-objstorage read-only | N/A                                                            |        |            |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-scheduler            | http://saatchi.internal.softwareheritage.org:5008              | X      |            |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-counters             | http://counters1.internal.softwareheritage.org:5011            | X      |            |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-search               | http://webapp1.internal.softwareheritage.org:5010              | X      |            |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-search               | http://moma.internal.softwareheritage.org:5010                 | X      |            |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-search               | http://search1.internal.softwareheritage.org:5010              |        | X          |
+--------------------------+----------------------------------------------------------------+--------+------------+
| swh-vault                | http://vangogh.euwest.azure.internal.softwareheritage.org:5005 |        | X          |
+--------------------------+----------------------------------------------------------------+--------+------------+
| Journal plaintext        | kafka[1-4].internal.softwareheritage.org:9092                  |        | X          |
+--------------------------+----------------------------------------------------------------+--------+------------+
| Journal internal TLS     | kafka[1-4].internal.softwareheritage.org:9094                  | X      |            |
+--------------------------+----------------------------------------------------------------+--------+------------+
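The bracketed host notation used for the journal, such as ``kafka[1-4].internal.softwareheritage.org:9092``, denotes a numeric range of hosts (``kafka1`` through ``kafka4``). As a minimal sketch (the ``expand_brokers`` helper is hypothetical, not part of any |swh| package), this is how such a pattern can be expanded into the comma-separated ``bootstrap.servers`` list expected by Kafka clients:

```python
import re


def expand_brokers(pattern: str) -> str:
    """Expand ``host[N-M].domain:port`` shorthand into a comma-separated
    broker list suitable for Kafka's ``bootstrap.servers`` setting."""
    match = re.match(r"(.*)\[(\d+)-(\d+)\](.*)", pattern)
    if match is None:
        return pattern  # no bracket range: already a single host
    prefix, start, end, suffix = match.groups()
    hosts = [f"{prefix}{i}{suffix}" for i in range(int(start), int(end) + 1)]
    return ",".join(hosts)


# The four plaintext journal brokers of the production cluster:
print(expand_brokers("kafka[1-4].internal.softwareheritage.org:9092"))
```

The same expansion applies to the public ``broker[1-4].journal.softwareheritage.org:9093`` endpoint and to the ``search-esnode[1-3]`` Elasticsearch nodes below.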
.. _swh-backends-1:

SWH backends
~~~~~~~~~~~~

+--------------------------+-----------------------------------------------------------------------+--------+------------+
| Backend                  | URL                                                                   | VPN[1] | Private[2] |
+==========================+=======================================================================+========+============+
| RabbitMQ GUI             | http://saatchi.internal.softwareheritage.org:15672                    | X      |            |
+--------------------------+-----------------------------------------------------------------------+--------+------------+
| archive database replica | somerset.internal.softwareheritage.org:5432/softwareheritage          | X      |            |
+--------------------------+-----------------------------------------------------------------------+--------+------------+
| archive database main    | belvedere.internal.softwareheritage.org:5432/softwareheritage         | X      |            |
+--------------------------+-----------------------------------------------------------------------+--------+------------+
| webapp database main     | belvedere.internal.softwareheritage.org:5432/swh-web                  | X      |            |
+--------------------------+-----------------------------------------------------------------------+--------+------------+
| scheduler database       | belvedere.internal.softwareheritage.org:5432/swh-scheduler            | X      |            |
+--------------------------+-----------------------------------------------------------------------+--------+------------+
| lister database          | belvedere.internal.softwareheritage.org:5432/swh-lister               | X      |            |
+--------------------------+-----------------------------------------------------------------------+--------+------------+
| deposit database         | belvedere.internal.softwareheritage.org:5432/softwareheritage-deposit | X      |            |
+--------------------------+-----------------------------------------------------------------------+--------+------------+
| vault database           | belvedere.internal.softwareheritage.org:5432/swh-vault                | X      |            |
+--------------------------+-----------------------------------------------------------------------+--------+------------+
| swh-search ES            | http://search-esnode[1-3].internal.softwareheritage.org:9200          |        | X          |
+--------------------------+-----------------------------------------------------------------------+--------+------------+
| Counters redis           | counters1.internal.softwareheritage.org:6379                          |        | X          |
+--------------------------+-----------------------------------------------------------------------+--------+------------+

Other tools
-----------

+-------------------+-------------------------------------------------------+--------------------+--------+------------+
| Tool              | URL                                                   | Public             | VPN[1] | Private[2] |
+===================+=======================================================+====================+========+============+
| grafana           | https://grafana.softwareheritage.org                  | X                  |        |            |
+-------------------+-------------------------------------------------------+--------------------+--------+------------+
| Kibana            | http://kibana0.internal.softwareheritage.org:5601     |                    | X      |            |
+-------------------+-------------------------------------------------------+--------------------+--------+------------+
| Log Elasticsearch | http://search[1-3].internal.softwareheritage.org:9200 |                    | X      |            |
+-------------------+-------------------------------------------------------+--------------------+--------+------------+
| C.M.A.K.          | http://getty.internal.softwareheritage.org:9000       |                    | X      |            |
+-------------------+-------------------------------------------------------+--------------------+--------+------------+
| Sentry            | https://sentry.softwareheritage.org                   | X (authentication) |        |            |
+-------------------+-------------------------------------------------------+--------------------+--------+------------+

[1] VPN: URL only accessible when connected to the Software Heritage VPN.

[2] Private: URL only accessible from the internal network, i.e. neither public nor accessible through the VPN.
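The database endpoints in the backend tables above use a ``host:port/database`` shorthand. As a sketch of how that shorthand maps to a standard libpq connection URI (the ``to_libpq_uri`` helper is illustrative only, and ``guest`` is a placeholder for your actual database role):

```python
def to_libpq_uri(endpoint: str, user: str) -> str:
    """Turn the ``host:port/database`` shorthand used in the backend
    tables into a PostgreSQL (libpq) connection URI."""
    hostport, _, database = endpoint.partition("/")
    host, _, port = hostport.partition(":")
    return f"postgresql://{user}@{host}:{port}/{database}"


# 'guest' is a placeholder: substitute your own role name.
uri = to_libpq_uri("belvedere.internal.softwareheritage.org:5432/swh-scheduler", "guest")
print(uri)  # postgresql://guest@belvedere.internal.softwareheritage.org:5432/swh-scheduler
```

The resulting URI can be passed directly to ``psql`` or to any libpq-based client, subject to the VPN/Private access constraints noted above.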